Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
Hey, and thank you for a wonderful framework!
Using the attached example code, I expect gpt-oss to handle the (pretty easy) tool call, but the tool is never called. Tool calling works fine with the qwen3 model.
This also seems to be a problem only with run_stream; run works as expected.
gpt-oss outputs the following:
["[Request] Starting part 0: TextPart(content='')",
'[Result] The model starting producing a final result (tool_name=None)']
Whereas qwen3 does the whole thing:
["[Request] Starting part 0: ThinkingPart(content='', id='content', "
"provider_name='ollama')",
"[Request] Starting part 1: ToolCallPart(tool_name='get_current_datetime', "
"args='{}', tool_call_id='call_39tumgp8')",
"[Tools] The LLM calls tool='get_current_datetime' with args={} "
"(tool_call_id='call_39tumgp8')",
"[Tools] Tool call 'call_39tumgp8' returned => "
'2025-09-29T15:17:28.423696+00:00',
"[Request] Starting part 0: ThinkingPart(content='', id='content', "
"provider_name='ollama')",
"[Request] Starting part 1: TextPart(content='The')",
'[Result] The model starting producing a final result (tool_name=None)',
'[Output] The current time is',
'[Output] The current time is **3:1',
'[Output] The current time is **3:17 PM on September',
'[Output] The current time is **3:17 PM on September 29,',
'[Output] The current time is **3:17 PM on September 29, 202',
'[Output] The current time is **3:17 PM on September 29, 2025 in UTC**',
'[Output] The current time is **3:17 PM on September 29, 2025 in UTC**. Let '
'me know',
'[Output] The current time is **3:17 PM on September 29, 2025 in UTC**. Let '
'me know if you need the',
'[Output] The current time is **3:17 PM on September 29, 2025 in UTC**. Let '
'me know if you need the time in a different',
'[Output] The current time is **3:17 PM on September 29, 2025 in UTC**. Let '
'me know if you need the time in a different timezone!']
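For comparison, here is a minimal sketch of the non-streaming variant (reusing the model, tool, and agent setup from the Example Code below; the function name test_non_streaming is just illustrative). With agent.run, gpt-oss does call the tool as expected:
import asyncio
import datetime

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.ollama import OllamaProvider


def get_current_datetime():
    """Returns the current datetime in ISO format"""
    return datetime.datetime.now(datetime.UTC).isoformat()


async def test_non_streaming():
    # Same model, tool, and agent setup as in the Example Code section below.
    model = OpenAIChatModel(
        provider=OllamaProvider(base_url="http://localhost:11434/v1", api_key="dummy"),
        model_name="gpt-oss",
    )
    agent = Agent(model=model, tools=[get_current_datetime])
    # Non-streaming run: the get_current_datetime tool is invoked as expected.
    result = await agent.run("what time is it?")
    print(result.output)


if __name__ == "__main__":
    asyncio.run(test_non_streaming())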
Example Code
import asyncio
import datetime
from collections.abc import AsyncIterable

from pydantic_ai import Agent
from pydantic_ai import RunContext
from pydantic_ai.messages import AgentStreamEvent
from pydantic_ai.messages import FinalResultEvent
from pydantic_ai.messages import FunctionToolCallEvent
from pydantic_ai.messages import FunctionToolResultEvent
from pydantic_ai.messages import PartDeltaEvent
from pydantic_ai.messages import PartStartEvent
from pydantic_ai.messages import TextPartDelta
from pydantic_ai.messages import ToolCallPartDelta
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.ollama import OllamaProvider


def get_current_datetime():
    """Returns the current datetime in ISO format"""
    return datetime.datetime.now(datetime.UTC).isoformat()


async def event_stream_handler(
    ctx: RunContext,
    event_stream: AsyncIterable[AgentStreamEvent],
):
    async for event in event_stream:
        if isinstance(event, PartStartEvent):
            output_messages.append(
                f"[Request] Starting part {event.index}: {event.part!r}"
            )
        elif isinstance(event, PartDeltaEvent):
            if isinstance(event.delta, TextPartDelta):
                output_messages.append(
                    f"[Request] Part {event.index} text delta: {event.delta.content_delta!r}"
                )
            elif isinstance(event.delta, ToolCallPartDelta):
                output_messages.append(
                    f"[Request] Part {event.index} args delta: {event.delta.args_delta}"
                )
        elif isinstance(event, FunctionToolCallEvent):
            output_messages.append(
                f"[Tools] The LLM calls tool={event.part.tool_name!r} with args={event.part.args} (tool_call_id={event.part.tool_call_id!r})"
            )
        elif isinstance(event, FunctionToolResultEvent):
            output_messages.append(
                f"[Tools] Tool call {event.tool_call_id!r} returned => {event.result.content}"
            )
        elif isinstance(event, FinalResultEvent):
            output_messages.append(
                f"[Result] The model starting producing a final result (tool_name={event.tool_name})"
            )


output_messages: list[str] = []


async def test():
    model = OpenAIChatModel(
        provider=OllamaProvider(
            base_url="http://localhost:11434/v1", api_key="dummy"
        ),
        model_name="gpt-oss",
    )
    agent = Agent(model=model, tools=[get_current_datetime])
    async with agent.run_stream(
        "what time is it?", event_stream_handler=event_stream_handler
    ) as res:
        async for message in res.stream_text():
            output_messages.append(f"[Output] {message}")


if __name__ == "__main__":
    asyncio.run(test())

    from pprint import pprint

    pprint(output_messages)

Python, Pydantic AI & LLM client version
Python 3.12
pydantic-ai 1.0.10
Ollama 0.12.3