Description
Is your feature request related to a problem? Please describe.
The OpenAI platform has just released the Responses API, an alternative to the Chat Completions endpoint:
https://platform.openai.com/docs/api-reference/responses
OpenAI now uses it by default in the playground (which generates code against it) as well as in its libraries, such as the OpenAI Agents SDK.
Soon, users will complain that they have been pushed onto an endpoint we don't have trace support for.
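To illustrate why existing completions instrumentation won't pick these calls up, here is a sketch of the request-shape difference between the two endpoints, based on the OpenAI API reference. Payloads are shown as plain dicts; no network call is made:

```python
# Chat Completions: the conversation is passed as a list of
# role/content messages under a `messages` key.
chat_completions_request = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Say hello"}],
}

# Responses: input may be a plain string (or a list of input items)
# under an `input` key, so instrumentation that hooks the completions
# path or assumes a `messages` key will silently miss these calls.
responses_request = {
    "model": "gpt-4o",
    "input": "Say hello",
}
```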
Describe the solution you'd like
Logs, metrics, and tracing for the Responses API, as we already provide for completions.
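As a minimal sketch of what such instrumentation could record, the helper below maps a Responses API request/response pair to span attributes. Attribute names follow the OpenTelemetry GenAI semantic conventions (`gen_ai.*`); `extract_span_attributes` is a hypothetical helper for illustration, not part of any shipped SDK. Note that the Responses API reports usage as `input_tokens`/`output_tokens` rather than the completions-style `prompt_tokens`/`completion_tokens`:

```python
def extract_span_attributes(request: dict, response: dict) -> dict:
    """Map a Responses API call to OTel-style GenAI span attributes.

    Hypothetical sketch: real instrumentation would hook the SDK or
    HTTP layer rather than take plain dicts.
    """
    attrs = {
        "gen_ai.system": "openai",
        "gen_ai.operation.name": "responses",
        "gen_ai.request.model": request.get("model"),
    }
    # Responses usage fields differ from Chat Completions
    # (input_tokens/output_tokens vs prompt_tokens/completion_tokens).
    usage = response.get("usage") or {}
    if usage:
        attrs["gen_ai.usage.input_tokens"] = usage.get("input_tokens")
        attrs["gen_ai.usage.output_tokens"] = usage.get("output_tokens")
    # The served model may be a dated snapshot of the requested one.
    if "model" in response:
        attrs["gen_ai.response.model"] = response["model"]
    return attrs
```

The same extractor shape could back logs and metrics as well, e.g. feeding the token counts into a counter instrument.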
TODO feature checkboxes
Describe alternatives you've considered
Tell people not to use the Responses API.
Additional context
announcement: https://community.openai.com/t/introducing-the-responses-api/1140929
local inference support for core features
- litellm https://github.com/BerriAI/litellm/releases/tag/v1.67.0-stable
- llama-stack https://github.com/meta-llama/llama-stack/releases/tag/v0.2.4
- masaic-ai https://github.com/masaic-ai-platform/open-responses/releases/tag/v0.1.3-M2
- ollama Support for OpenAI Responses API (for Codex CLI compatibility) ollama/ollama#10309
- vllm [Feature]: Support openai responses API interface vllm-project/vllm#14721
other Python SDKs
- langtrace https://github.com/Scale3-Labs/langtrace-python-sdk/releases/tag/3.8.13
- logfire https://github.com/pydantic/logfire/releases/tag/v3.8.0
- openinference https://github.com/Arize-ai/openinference/releases/tag/python-openinference-instrumentation-openai-v0.1.26
- openlit https://github.com/openlit/openlit/releases/tag/py-1.33.16
- openllmetry Feature: Instrument OpenAI's new Responses API #2782 traceloop/openllmetry#2787