
Commit 8e7e0dd

fix(responses): use conversation items when no stored messages exist (llamastack#3819)

Handle the base case where no stored messages exist because no Responses call has been made yet.

## Test Plan

```
./scripts/integration-tests.sh --stack-config server:ci-tests \
  --suite responses --inference-mode record-if-missing --pattern test_conversation_responses
```
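For context, a minimal sketch of the scenario the `test_conversation_responses` pattern exercises: a conversation is seeded with items directly, and the very first Responses call then references it, so no stored messages exist yet. This assumes an OpenAI-compatible client surface (Conversations + Responses) in front of a running Llama Stack server; the base URL, model id, and item shape below are illustrative assumptions, not part of this commit:

```python
# Hypothetical reproduction of the base case, assuming an OpenAI-compatible
# client pointed at a Llama Stack server (URL, API key, and model id are assumptions).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1", api_key="none")  # assumed server endpoint

# Seed a conversation with items directly -- no Responses call has stored anything yet.
conversation = client.conversations.create(
    items=[{"type": "message", "role": "user", "content": "My favorite color is blue."}]
)

# First Responses call against the conversation: with this fix, the seeded
# conversation items are folded into the input instead of being ignored.
response = client.responses.create(
    model="meta-llama/Llama-3.3-70B-Instruct",  # assumed model id
    conversation=conversation.id,
    input="What is my favorite color?",
)
print(response.output_text)
```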
1 parent 6ba9db3

File tree: 4 files changed, +1894 −10 lines


llama_stack/providers/inline/agents/meta_reference/responses/openai_responses.py

Lines changed: 14 additions & 2 deletions
```diff
@@ -136,9 +136,21 @@ async def _process_input_with_previous_response(
                 # First turn - just convert the new input
                 messages = await convert_response_input_to_chat_messages(input)
             else:
-                # Use stored messages directly and convert only new input
+                if not stored_messages:
+                    all_input = conversation_items.data
+                    if isinstance(input, str):
+                        all_input.append(
+                            OpenAIResponseMessage(
+                                role="user", content=[OpenAIResponseInputMessageContentText(text=input)]
+                            )
+                        )
+                    else:
+                        all_input.extend(input)
+                else:
+                    all_input = input
+
                 messages = stored_messages or []
-                new_messages = await convert_response_input_to_chat_messages(input, previous_messages=messages)
+                new_messages = await convert_response_input_to_chat_messages(all_input, previous_messages=messages)
                 messages.extend(new_messages)
         else:
             all_input = input
```

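In effect, when the conversation already holds items but no prior Responses call has stored messages, those items become the base of the input and the new input is appended; otherwise the new input passes through unchanged. A standalone sketch of that merge, using plain dicts as stand-ins for the provider's OpenAIResponseMessage / OpenAIResponseInputMessageContentText types (the function name and dict shapes are illustrative, not the provider's actual code):

```python
from typing import Any


def build_all_input(
    stored_messages: list[Any],
    conversation_items: list[dict],
    input: str | list[dict],
) -> str | list[dict]:
    """Mirror the merge in the diff above: seed from conversation items only
    when no messages have been stored yet; otherwise pass the input through."""
    if stored_messages:
        # Prior Responses calls already persisted messages, so only the new
        # input needs converting.
        return input

    # Base case: no stored messages yet -- start from the conversation's items.
    all_input = list(conversation_items)
    if isinstance(input, str):
        # Wrap a bare string as a user message (stand-in for OpenAIResponseMessage).
        all_input.append({"role": "user", "content": [{"type": "input_text", "text": input}]})
    else:
        all_input.extend(input)
    return all_input


# Example: one seeded conversation item, first Responses call with a string input.
items = [{"role": "user", "content": [{"type": "input_text", "text": "My favorite color is blue."}]}]
print(build_all_input(stored_messages=[], conversation_items=items, input="What is my favorite color?"))
```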