
Support for OpenAI Background Mode #3268

@mattbrandman

Description

OpenAI has a background mode (seen here) that is especially important for deep research, both because of how it delivers updates and because its long-running nature makes keeping an HTTP connection open for the whole duration impractical. Supporting background mode in our backend currently means building a separate, non-instrumented (or manually instrumented) set of functions that talk directly to OpenAI, which removes some of the usefulness of PydanticAI as the single integration point for everything else.
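For context, this is roughly what background mode looks like against the raw OpenAI SDK today (the model name, prompt, and polling interval are illustrative): the request returns immediately with a queued response, and the caller polls for its status rather than holding a connection open.

```python
import time

from openai import OpenAI

client = OpenAI()

# Kick off a long-running request in background mode; the call returns
# immediately with a queued response instead of blocking until completion.
response = client.responses.create(
    model="o3",  # illustrative model choice
    input="Summarize the trade-offs of background job queues.",
    background=True,
)

# Poll until the response leaves the queued/in_progress states.
while response.status in ("queued", "in_progress"):
    time.sleep(5)
    response = client.responses.retrieve(response.id)

print(response.status)
print(response.output_text)
```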

The nature of background mode has no real equivalent with other providers, which I understand makes handling it at request time in ModelSettings slightly complicated. Maybe a new part could be returned as part of a ModelResponse indicating that this is an async request, and each model (only OpenAI to start) could handle what to do with that part on response; see the sketch below. Alternatively, that part could be passed to a new function that returns the current state of the message, though I realize this would be more disruptive, since all models would need to implement it.
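A minimal sketch of what that could look like. None of these names (`BackgroundRequestPart`, `OpenAIBackgroundPoller`, `poll`) exist in PydanticAI; they are purely illustrative of the "new response part plus a per-model polling hook" idea:

```python
from dataclasses import dataclass

from openai import AsyncOpenAI


# Hypothetical response part marking a ModelResponse as a still-running
# background request; name and fields are illustrative, not existing types.
@dataclass
class BackgroundRequestPart:
    provider_response_id: str
    status: str  # e.g. "queued" | "in_progress" | "completed" | "failed"


class OpenAIBackgroundPoller:
    """Sketch of the per-model hook that resolves a background part.

    Only the OpenAI model would implement this to start; other models
    would simply never emit a BackgroundRequestPart.
    """

    def __init__(self, client: AsyncOpenAI):
        self.client = client

    async def poll(self, part: BackgroundRequestPart) -> BackgroundRequestPart:
        # Ask OpenAI for the current state of the background response.
        response = await self.client.responses.retrieve(part.provider_response_id)
        return BackgroundRequestPart(
            provider_response_id=response.id,
            status=response.status,
        )
```

The appeal of the first option is that it stays inside the existing request/response flow: models that do not support background execution never produce the part, so nothing changes for them.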

References

No response
