Commit 25a6f28

Add tracing guide for non-OpenAI LLMs in docs/tracing.md
1 parent 473e0a2 commit 25a6f28


docs/tracing.md

Lines changed: 36 additions & 2 deletions
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
 - Audio outputs (text-to-speech) are wrapped in a `speech_span()`
 - Related audio spans may be parented under a `speech_group_span()`
 
-By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
 
 In addition, you can set up [custom trace processors](#custom-tracing-processors) to push traces to other destinations (as a replacement, or secondary destination).

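For context on the changed line above, here is a minimal sketch of the two ways it mentions for naming a trace. This is illustrative only and assumes `trace` and `RunConfig` are importable from `agents` and that `RunConfig` accepts a `workflow_name` argument:

```python
from agents import Agent, Runner, RunConfig, trace

agent = Agent(name="Assistant", instructions="Reply concisely.")

# Name the trace for a single run via RunConfig (assumes the `workflow_name` field).
result = Runner.run_sync(
    agent,
    "Hello!",
    run_config=RunConfig(workflow_name="Support workflow"),
)

# Or wrap one or more runs in an explicitly named trace.
with trace("Support workflow"):
    result = Runner.run_sync(agent, "Hello again!")
```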
@@ -97,6 +97,41 @@ To customize this default setup, to send traces to alternative or additional bac
 1. [`add_trace_processor()`][agents.tracing.add_trace_processor] lets you add an **additional** trace processor that will receive traces and spans as they are ready. This lets you do your own processing in addition to sending traces to OpenAI's backend.
 2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.

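A rough sketch of option 1 above (registering an extra processor alongside the default exporter), assuming `TracingProcessor` is importable from `agents.tracing` and exposes the lifecycle hooks named below:

```python
from agents.tracing import TracingProcessor, add_trace_processor  # assumed import path

class ConsoleTracingProcessor(TracingProcessor):
    """Prints trace/span lifecycle events in addition to the default export to OpenAI."""

    def on_trace_start(self, trace) -> None:
        print(f"trace started: {trace.name}")

    def on_trace_end(self, trace) -> None:
        print(f"trace finished: {trace.name}")

    def on_span_start(self, span) -> None:
        pass

    def on_span_end(self, span) -> None:
        print(f"span finished: {type(span.span_data).__name__}")

    def shutdown(self) -> None:
        pass

    def force_flush(self) -> None:
        pass

# Option 1: keep the default OpenAI exporter and add this processor on top.
add_trace_processor(ConsoleTracingProcessor())
# Option 2 would instead be: set_trace_processors([ConsoleTracingProcessor()])
```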
+
+## Tracing with Non-OpenAI LLMs
+
+You can use an OpenAI API key with non-OpenAI LLMs to enable free tracing in the OpenAI Traces dashboard without disabling tracing.
+
+```python
+import os
+
+from dotenv import load_dotenv
+from openai import AsyncOpenAI
+
+from agents import set_tracing_export_api_key, Agent, Runner, OpenAIChatCompletionsModel
+
+load_dotenv()
+
+# Export traces with your OpenAI key, even though the model calls below go to Gemini.
+set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))
+
+gemini_api_key = os.getenv("GEMINI_API_KEY")
+
+# OpenAI-compatible client pointed at the Gemini API.
+external_client = AsyncOpenAI(
+    api_key=gemini_api_key,
+    base_url="https://generativelanguage.googleapis.com/v1beta"
+)
+
+model = OpenAIChatCompletionsModel(
+    model="gemini-2.0-flash",
+    openai_client=external_client,
+)
+
+agent = Agent(
+    name="Assistant",
+    model=model,
+)
+```
+
+## Notes
+
+- Set `OPENAI_API_KEY` in a `.env` file.
+- View your free traces in the OpenAI Traces dashboard.
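The added snippet above only constructs the agent. A short follow-up sketch of actually producing a trace with it, continuing from that snippet and assuming the `.env` file contains both `OPENAI_API_KEY` and `GEMINI_API_KEY`:

```python
# Continuing from the snippet added above: running the agent generates a trace
# that is exported with the OpenAI key, while the completion itself comes from Gemini.
result = Runner.run_sync(agent, "Say hello in one sentence.")
print(result.final_output)
```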
 
 ## External tracing processors list
 
 - [Weights & Biases](https://weave-docs.wandb.ai/guides/integrations/openai_agents)
@@ -117,4 +152,3 @@ To customize this default setup, to send traces to alternative or additional bac
 - [Okahu-Monocle](https://github.com/monocle2ai/monocle)
 - [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
 - [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
-- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)
