Commit ab6a16a (1 parent: 0a09291)

fix #20333 when store=False

File tree

2 files changed: +2 −17 lines


llama-index-integrations/llms/llama-index-llms-openai/tests/test_openai_responses.py

Lines changed: 1 addition & 16 deletions
```diff
@@ -864,23 +864,8 @@ def test_messages_to_openai_responses_messages_with_store():
         ),
     ]
 
-    kwargs = {
-        "model": "fake-model",
-        "include": None,
-        "instructions": None,
-        "max_output_tokens": 100,
-        "metadata": {},
-        "previous_response_id": None,
-        "store": True,
-        "temperature": 0.0,
-        "tools": [],
-        "top_p": 1.0,
-        "truncation": None,
-        "user": None,
-    }
-
     openai_messages = to_openai_message_dicts(
-        messages, is_responses_api=True, kwargs=kwargs
+        messages, is_responses_api=True, store=True
    )
     assert len(openai_messages) == 8
     assert openai_messages[0]["role"] == "developer"
```
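The change above replaces a catch-all `kwargs` dict with an explicit `store` keyword. A minimal sketch (hypothetical helper, not the actual llama-index implementation) of why an explicit, defaulted flag is easier to get right when `store=False` must also be honored: with `store=None` as the default, an explicit `False` from the caller is still forwarded, whereas a value buried in a request-kwargs dict is easy to drop.

```python
# Hypothetical sketch of the message-dict conversion; names and behavior
# are assumptions for illustration, not the llama-index API.

def to_message_dicts(messages, *, is_responses_api=False, store=None):
    """Convert chat messages to plain dicts.

    `store` is forwarded only when the Responses API is in use and the
    caller set it explicitly -- including an explicit store=False.
    """
    dicts = [{"role": m["role"], "content": m["content"]} for m in messages]
    if is_responses_api and store is not None:
        for d in dicts:
            d["store"] = store
    return dicts


msgs = [{"role": "developer", "content": "hi"}]
out = to_message_dicts(msgs, is_responses_api=True, store=True)
```

Because the flag is a named parameter rather than one key among many in a dict, a test can pass `store=True` (or `store=False`) directly, mirroring the simplified call in the diff.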

llama-index-integrations/llms/llama-index-llms-openai/uv.lock

Lines changed: 1 addition & 1 deletion
Some generated files are not rendered by default.

0 commit comments