Add tracing guide for non-OpenAI LLMs in docs/tracing.md #1329
Changes from 1 commit
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
- Audio outputs (text-to-speech) are wrapped in a `speech_span()`
- Related audio spans may be parented under a `speech_group_span()`

-By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].

In addition, you can set up [custom trace processors](#custom-tracing-processors) to push traces to other destinations (as a replacement, or secondary destination).
@@ -97,6 +97,41 @@ To customize this default setup, to send traces to alternative or additional bac
1. [`add_trace_processor()`][agents.tracing.add_trace_processor] lets you add an **additional** trace processor that will receive traces and spans as they are ready. This lets you do your own processing in addition to sending traces to OpenAI's backend.
2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.
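As an illustration, the two registration paths above could feed a minimal in-memory processor like the sketch below. This is not SDK code: the class is duck-typed so it runs standalone, and its method names are assumed to mirror the `TracingProcessor` interface in `agents.tracing`.

```python
# Hypothetical sketch of a custom trace processor that buffers finished
# traces and spans in memory. Method names are assumed to follow the
# SDK's TracingProcessor interface; the class is duck-typed so this
# sketch runs without the SDK installed.

class InMemoryTraceProcessor:
    def __init__(self):
        self.traces = []
        self.spans = []

    def on_trace_start(self, trace):
        pass  # called when a trace begins

    def on_trace_end(self, trace):
        self.traces.append(trace)  # record the completed trace

    def on_span_start(self, span):
        pass  # called when a span begins

    def on_span_end(self, span):
        self.spans.append(span)  # record the completed span

    def shutdown(self):
        pass  # release any export resources here

    def force_flush(self):
        pass  # flush pending exports here


processor = InMemoryTraceProcessor()
processor.on_span_start("span-1")
processor.on_span_end("span-1")
print(len(processor.spans))  # → 1
```

In a real application you would register the processor with `add_trace_processor(InMemoryTraceProcessor())` to keep the default OpenAI exporter, or `set_trace_processors([InMemoryTraceProcessor()])` to replace it.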

## Tracing with Non-OpenAI LLMs
You can use an OpenAI API key with non-OpenAI LLMs to enable free tracing in the OpenAI Traces dashboard without disabling tracing.

```python
import os

from dotenv import load_dotenv
from openai import AsyncOpenAI

from agents import Agent, OpenAIChatCompletionsModel, Runner, set_tracing_export_api_key

load_dotenv()

# Export traces with your OpenAI API key, even though the agent itself
# runs on a non-OpenAI model.
set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))

gemini_api_key = os.getenv("GEMINI_API_KEY")

# Point an OpenAI-compatible client at Gemini's OpenAI-compatible endpoint
external_client = AsyncOpenAI(
    api_key=gemini_api_key,
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

model = OpenAIChatCompletionsModel(
    model="gemini-2.0-flash",
    openai_client=external_client,
)

agent = Agent(
    name="Assistant",
    model=model,
)

result = Runner.run_sync(agent, "Hello!")
print(result.final_output)
```

## Notes

- Set `OPENAI_API_KEY` in a `.env` file.
- View your traces for free in the OpenAI Traces dashboard.

## External tracing processors list

- [Weights & Biases](https://weave-docs.wandb.ai/guides/integrations/openai_agents)
@@ -117,4 +152,3 @@ To customize this default setup, to send traces to alternative or additional bac
- [Okahu-Monocle](https://github.com/monocle2ai/monocle)
- [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
- [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)
Review comment: please revert this change

Review comment: "Agent workflow" is the correct default name.