
Enable: output_type + stream_text #2707

@pietz

Description


I would love to have the option to stream structured responses back to the caller as delta text chunks. Currently this isn't possible:

  • I can stream text as delta text chunks, but only if the output format is string
  • I can stream updated snapshots of accumulated json objects

Neither of those is what I want. The OpenAI Responses API does allow streaming text with a predefined data model. Is there a reason for checking that the output type is a string when using stream_text?
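To make the difference between the two existing modes concrete, here is a minimal sketch (the chunk values are hypothetical, just to illustrate the shapes):

```python
# Delta mode: each chunk is only the new text since the previous chunk.
deltas = ["Joh", "n Sm", "ith"]
assert "".join(deltas) == "John Smith"

# Snapshot mode: each chunk is the full accumulated (partial) object so far,
# so the consumer receives the same data repeatedly instead of just the diff.
snapshots = [
    {"first_name": "Joh"},
    {"first_name": "John", "last_name": "Smi"},
    {"first_name": "John", "last_name": "Smith"},
]
assert snapshots[-1] == {"first_name": "John", "last_name": "Smith"}
```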

Why shouldn't this be possible:

from pydantic import BaseModel
from pydantic_ai import Agent

class FullName(BaseModel):
    first_name: str
    last_name: str

agent = Agent(model="openai:gpt-5", output_type=FullName)

async def stream_name():
    async with agent.run_stream("Generate a random name") as result:
        async for chunk in result.stream_text(delta=True):
            yield chunk

# {'first_n
# ame': 'Joh
# n', 'last
# _name': 'Smith'}
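As a stopgap, the caller could reassemble the raw deltas and validate the JSON only once the stream ends. A stdlib-only sketch with hypothetical chunk boundaries (real deltas need not align with JSON tokens):

```python
import json

# Hypothetical delta chunks, mirroring the output above but as valid JSON.
deltas = ['{"first_n', 'ame": "Joh', 'n", "last', '_name": "Smith"}']

buffer = []
for chunk in deltas:
    buffer.append(chunk)  # in practice, forward each chunk to the caller here

# Parse and validate only after the final chunk has arrived.
payload = json.loads("".join(buffer))
assert payload == {"first_name": "John", "last_name": "Smith"}
```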


Metadata

Labels: Stale, question (Further information is requested)
