Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
I'm using Pydantic AI 0.6.2, and after upgrading to the latest openai (greater than 1.99.1) I started getting the following error:
File "/usr/local/lib/python3.13/site-packages/pydantic_ai/models/openai.py", line 315, in _completions_create
openai_messages = await self._map_messages(messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.13/site-packages/pydantic_ai/models/openai.py", line 486, in _map_messages
tool_calls.append(self._map_tool_call(item))
~~~~~~~~~~~~~~~~~~~^^^^^^
File "/usr/local/lib/python3.13/site-packages/pydantic_ai/models/openai.py", line 508, in _map_tool_call
return chat.ChatCompletionMessageToolCallParam(
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
id=_guard_tool_call_id(t=t),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
type='function',
^^^^^^^^^^^^^^^^
function={'name': t.tool_name, 'arguments': t.args_as_json_str()},
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "/usr/local/lib/python3.13/typing.py", line 1317, in __call__
result = self.__origin__(*args, **kwargs)
File "/usr/local/lib/python3.13/typing.py", line 560, in __call__
raise TypeError(f"Cannot instantiate {self!r}")
TypeError: Cannot instantiate typing.Union
My code is similar to the example here: https://ai.pydantic.dev/message-history/#storing-and-loading-messages-to-json - I'm simply saving and restoring the message history.
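Roughly, this is what I'm doing (the output model, file path and prompts are placeholders, not my real code):

from pathlib import Path

from pydantic import BaseModel
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessagesTypeAdapter

# Placeholder structured output; this is why the model answers via a "final_result" tool call
class Reply(BaseModel):
    message: str

agent = Agent('openai:gpt-4.1-mini', output_type=Reply, instructions='...')

# First turn: works fine
result = agent.run_sync('hi there')

# Save the full history as JSON
Path('history.json').write_bytes(result.all_messages_json())

# Later: load the history back and send the next message
history = ModelMessagesTypeAdapter.validate_json(Path('history.json').read_bytes())
agent.run_sync('next message', message_history=history)  # raises the TypeError with openai >= 1.99.2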
What happens:
- The first message I send to my Pydantic AI agent is processed correctly.
- The response to that call is a "final_result" tool call.
- The next message I send to the agent, using a message_history that contains the final_result tool call, fails with the above error.
Sample messages:
[
{
"kind": "request",
"parts": [
{
"content": "hi there",
"part_kind": "user-prompt",
"timestamp": "2025-08-11T11:20:22.522615Z"
}
],
"instructions": "..."
},
{
"kind": "response",
"parts": [
{
"args": "{\"message\":\"Hello!\"}",
"part_kind": "tool-call",
"tool_name": "final_result",
"tool_call_id": "call_VFqZ3Dm1uhCbOIJmHXaL0BCL"
}
],
"usage": {
"details": {
"audio_tokens": 0,
"cached_tokens": 0,
"reasoning_tokens": 0,
"accepted_prediction_tokens": 0,
"rejected_prediction_tokens": 0
},
"requests": 1,
"total_tokens": 820,
"request_tokens": 805,
"response_tokens": 15
},
"timestamp": "2025-08-11T11:20:22Z",
"vendor_id": "chatcmpl-C3KreKto8WewObphvxayf2AukUu6q",
"model_name": "gpt-4.1-mini-2025-04-14",
"vendor_details": null
},
{
"kind": "request",
"parts": [
{
"content": "Final result processed.",
"metadata": null,
"part_kind": "tool-return",
"timestamp": "2025-08-11T11:20:23.254464Z",
"tool_name": "final_result",
"tool_call_id": "call_VFqZ3Dm1uhCbOIJmHXaL0BCL"
}
],
"instructions": null
  }
]
Now, when I use this as message_history for the next message, the aforementioned exception is raised.
I tracked this down to openai 1.99.2, where this change looks suspicious to me: openai/openai-python@0c57bd7
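If I read that commit right, chat.ChatCompletionMessageToolCallParam is now a Union of several tool-call param types instead of a single TypedDict, and a Union alias can't be called like a constructor. A standalone sketch of that mechanism (the types below are made up, not the actual openai ones):

from typing import TypedDict, Union

class FunctionToolCallParam(TypedDict):
    id: str
    type: str

class CustomToolCallParam(TypedDict):
    id: str
    type: str

# A TypedDict can be called like a dict constructor...
FunctionToolCallParam(id='call_123', type='function')

# ...but a Union alias cannot, which matches the traceback above
ToolCallParam = Union[FunctionToolCallParam, CustomToolCallParam]
ToolCallParam(id='call_123', type='function')  # TypeError: Cannot instantiate typing.Union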
Python, Pydantic AI & LLM client version
Python: 3.13
Pydantic AI: 0.6.2
openai: >= 1.99.2