Closed as not planned
Description
I would love to have the option to stream structured responses back to the caller as delta chunks of text. Currently this isn't possible:
- I can stream text as delta chunks, but only if the output type is `str`
- I can stream updated snapshots of the accumulated JSON object
Neither of those is what I want. The OpenAI Responses API does allow streaming text even when a structured output model is defined. Is there a reason `stream_text` checks that the output type is a string?
Why shouldn't this be possible?

```python
from pydantic import BaseModel
from pydantic_ai import Agent


class FullName(BaseModel):
    first_name: str
    last_name: str


agent = Agent(model="openai:gpt-5", output_type=FullName)

async with agent.run_stream("Generate a random name") as result:
    async for chunk in result.stream_text(delta=True):
        yield chunk
        # {'first_n
        # ame': 'Joh
        # n', 'last
        # _name': 'Smith'}
```
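Until something like this is supported, one consumer-side workaround is to accumulate the delta text yourself and parse it once the JSON is complete. A minimal sketch, assuming the model emits the structured output as plain JSON text (the chunk boundaries below are illustrative, not what any particular model actually produces):

```python
import json

# Illustrative delta chunks for the FullName example above; the split
# points are arbitrary, and real streamed JSON uses double quotes.
chunks = ['{"first_n', 'ame": "Joh', 'n", "last', '_name": "Smith"}']

buffer = ""
result = None
for chunk in chunks:
    buffer += chunk
    try:
        # Attempt a parse after each delta; this only succeeds once the
        # accumulated text forms a complete JSON object.
        result = json.loads(buffer)
    except json.JSONDecodeError:
        continue

print(result)  # {'first_name': 'John', 'last_name': 'Smith'}
```

The same accumulated string could then be validated with `FullName.model_validate_json(buffer)` once complete, so the caller still ends up with the typed model while seeing text deltas along the way.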