Merged
38 changes: 36 additions & 2 deletions docs/tracing.md
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
- Audio outputs (text-to-speech) are wrapped in a `speech_span()`
- Related audio spans may be parented under a `speech_group_span()`

By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
Reviewer comment (Member):
"Agent workflow" is the correct default name.

Suggested change:
- By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+ By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].


In addition, you can set up [custom trace processors](#custom-tracing-processors) to push traces to other destinations (as a replacement, or secondary destination).

@@ -97,6 +97,41 @@ To customize this default setup, to send traces to alternative or additional bac
1. [`add_trace_processor()`][agents.tracing.add_trace_processor] lets you add an **additional** trace processor that will receive traces and spans as they are ready. This lets you do your own processing in addition to sending traces to OpenAI's backend.
2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.
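The difference between the two registration functions can be sketched in plain Python. This is a hypothetical, standalone illustration of the additional-vs-replacement semantics described above, not the SDK's actual implementation; the real `TracingProcessor` interface and registration functions live in the `agents` package, and the `InMemoryProcessor` class here is invented for the example.

```python
class InMemoryProcessor:
    """Hypothetical processor that collects finished traces for inspection."""

    def __init__(self):
        self.traces = []

    def on_trace_end(self, trace):
        self.traces.append(trace)


_processors = []  # module-level registry, standing in for the SDK's default setup


def add_trace_processor(processor):
    """Append a processor; previously registered ones keep receiving traces."""
    _processors.append(processor)


def set_trace_processors(processors):
    """Replace every registered processor, including any default exporter."""
    _processors[:] = list(processors)


def emit(trace):
    """Fan a finished trace out to every registered processor."""
    for processor in _processors:
        processor.on_trace_end(trace)
```

With this sketch, a processor registered via `add_trace_processor` keeps its collected traces even after `set_trace_processors` swaps the active set, which mirrors why replacing the defaults stops traces reaching the OpenAI backend unless you re-include an exporter.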


## Tracing with Non-OpenAI LLMs

You can use an OpenAI API key with non-OpenAI LLMs to enable free tracing on the OpenAI Traces dashboard without having to disable tracing.

```python
import os

from dotenv import load_dotenv
from agents import set_tracing_export_api_key, Agent, Runner, AsyncOpenAI, OpenAIChatCompletionsModel

load_dotenv()

# Export traces to OpenAI using the API key from the environment.
set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))
gemini_api_key = os.getenv("GEMINI_API_KEY")
external_client = AsyncOpenAI(
    api_key=gemini_api_key,
    base_url="https://generativelanguage.googleapis.com/v1beta",
)

model = OpenAIChatCompletionsModel(
    model="gemini-2.0-flash",
    openai_client=external_client,
)

agent = Agent(
    name="Assistant",
    model=model,
)
```

Reviewer comment (Member): The compatible API layers provided by other platforms are not perfectly compatible, so we'd like to avoid recommending those endpoints. Instead, can you use code examples using LiteLLM + Chat Completions? Also, rather than mentioning a specific model / endpoint, we would like to simply show how to set up with a non-OpenAI model in general.

See also: https://openai.github.io/openai-agents-python/models/litellm/

```python
model = "your model name here"
api_key = "your api key here"
model=LitellmModel(model=model, api_key=api_key),
```

## Notes
- Set `OPENAI_API_KEY` in a `.env` file.
- View free traces on the OpenAI Traces dashboard.


## External tracing processors list

- [Weights & Biases](https://weave-docs.wandb.ai/guides/integrations/openai_agents)
@@ -117,4 +152,3 @@ To customize this default setup, to send traces to alternative or additional bac
- [Okahu-Monocle](https://github.com/monocle2ai/monocle)
- [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
- [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)
Reviewer comment (Member):
please revert this change