Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
I’m experiencing noisy output when running the example code (run_agent.py) from the documentation: https://ai.pydantic.dev/agents/#streaming-events-and-final-output. The output is flooded with the raw ChatCompletionChunk objects:
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content='', function_call=None, refusal=None, role='assistant', tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='ydtqNib5xg6Wup')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content='The', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='3A3AmAzek5TUw')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=' capital', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='4drHGlhc')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=' of', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='KgDPRrYATMVUU')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=' the', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='xfUpGFTvHvXP')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=' United', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='h6Nedo2iv')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=' Kingdom', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='HtH60JZt')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=' is', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='UOQoSlPZDSOxM')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=' London', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='ZLHuVfJu7')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content='.', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='GLlg6BW4p0K7puu')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role=None, tool_calls=None), finish_reason='stop', index=0, logprobs=None)], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=None, obfuscation='hEDS1WuTES')
ChatCompletionChunk(id='chatcmpl-CMBLqvqU0H4VEslaLbTGFqpoury4s', choices=[], created=1759402886, model='gpt-4o-2024-08-06', object='chat.completion.chunk', service_tier='default', system_fingerprint='fp_f33640a400', usage=CompletionUsage(completion_tokens=9, prompt_tokens=15, total_tokens=24, completion_tokens_details=CompletionTokensDetails(accepted_prediction_tokens=0, audio_tokens=0, reasoning_tokens=0, rejected_prediction_tokens=0), prompt_tokens_details=PromptTokensDetails(audio_tokens=0, cached_tokens=0)), obfuscation='')
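To help narrow this down, a stripped-down script containing only the streaming call might be useful (a minimal sketch of the same docs pattern; stream_text(delta=True) is an assumption on my part, the full example below uses the default accumulating form):

import asyncio

from pydantic_ai import Agent

agent = Agent('openai:gpt-4o')


async def main():
    # Only the streaming call from the docs example, to check whether the
    # raw ChatCompletionChunk dumps still appear without the other runs.
    async with agent.run_stream('What is the capital of the UK?') as response:
        # delta=True (assumed option) yields only the newly received text
        # fragments instead of the accumulated text so far.
        async for text in response.stream_text(delta=True):
            print(text)


if __name__ == '__main__':
    asyncio.run(main())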
Example Code
import asyncio
from pydantic_ai import Agent
agent = Agent('openai:gpt-4o')
result_sync = agent.run_sync('What is the capital of Italy?')
print(result_sync.output)
#> The capital of Italy is Rome.
async def main():
    result = await agent.run('What is the capital of France?')
    print(result.output)
    #> The capital of France is Paris.

    async with agent.run_stream('What is the capital of the UK?') as response:
        async for text in response.stream_text():
            print(text)
            #> The capital of
            #> The capital of the UK is
            #> The capital of the UK is London.


if __name__ == '__main__':
    asyncio.run(main())

Python, Pydantic AI & LLM client version
python = 3.12
openai>=1.78.1
uv>=0.8.22
pydantic-ai==1.0.13
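For completeness, a quick way to double-check what is actually installed in the environment (importlib.metadata is standard library, so nothing beyond the packages listed above is assumed):

import importlib.metadata

for pkg in ('pydantic-ai', 'openai'):
    # Prints the installed distribution version, e.g. "pydantic-ai 1.0.13"
    print(pkg, importlib.metadata.version(pkg))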