From 25a6f2871601dde0aed510595a43f8302b0c432c Mon Sep 17 00:00:00 2001
From: UmmeHabiba1312
Date: Thu, 31 Jul 2025 18:52:02 +0500
Subject: [PATCH 1/3] Add tracing guide for non-OpenAI LLMs in docs/tracing.md

---
 docs/tracing.md | 38 ++++++++++++++++++++++++++++++++++++--
 1 file changed, 36 insertions(+), 2 deletions(-)

diff --git a/docs/tracing.md b/docs/tracing.md
index 5182e159f..cf11afadf 100644
--- a/docs/tracing.md
+++ b/docs/tracing.md
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
 - Audio outputs (text-to-speech) are wrapped in a `speech_span()`
 - Related audio spans may be parented under a `speech_group_span()`
 
-By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
 
 In addition, you can set up [custom trace processors](#custom-tracing-processors) to push traces to other destinations (as a replacement, or secondary destination).
 
@@ -97,6 +97,41 @@ To customize this default setup, to send traces to alternative or additional bac
 1. [`add_trace_processor()`][agents.tracing.add_trace_processor] lets you add an **additional** trace processor that will receive traces and spans as they are ready. This lets you do your own processing in addition to sending traces to OpenAI's backend.
 2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.
+
+## Tracing with Non-OpenAI LLMs
+
+You can use an OpenAI API key with non-OpenAI LLMs to enable free tracing on the Openai Traces dashboard without disabling tracing. 
+
+```python
+from agents import set_tracing_export_api_key, Agent, Runner, AsyncOpenAI, OpenAIChatCompletionsModel
+from dotenv import load_dotenv
+load_dotenv()
+import os
+
+set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))
+gemini_api_key = os.getenv("GEMINI_API_KEY")
+
+external_client = AsyncOpenAI(
+    api_key=gemini_api_key,
+    base_url="https://generativelanguage.googleapis.com/v1beta"
+    )
+
+model = OpenAIChatCompletionsModel(
+    model="gemini-2.0-flash",
+    openai_client=external_client,
+    )
+
+agent = Agent(
+    name="Assistant",
+    model=model,
+    )
+```
+
+## Notes
+- Set OPENAI_API_KEY in a .env file.
+- View free traces at Openai Traces dashboard.
+
+
 ## External tracing processors list
 
 - [Weights & Biases](https://weave-docs.wandb.ai/guides/integrations/openai_agents)
@@ -117,4 +152,3 @@ To customize this default setup, to send traces to alternative or additional bac
 - [Okahu-Monocle](https://github.com/monocle2ai/monocle)
 - [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
 - [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
-- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)

From c8d510c74e8cc208fa2d92f69e39cd830f59d992 Mon Sep 17 00:00:00 2001
From: Umm e Habiba <161445850+UmmeHabiba1312@users.noreply.github.com>
Date: Fri, 1 Aug 2025 11:27:23 +0500
Subject: [PATCH 2/3] docs: update tracing example and description for non-OpenAI models

---
 docs/tracing.md | 32 ++++++++++++--------------------
 1 file changed, 12 insertions(+), 20 deletions(-)

diff --git a/docs/tracing.md b/docs/tracing.md
index cf11afadf..e9551fa8a 100644
--- a/docs/tracing.md
+++ b/docs/tracing.md
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
 - Audio outputs (text-to-speech) are wrapped in a `speech_span()`
 - Related audio spans may be parented under a `speech_group_span()`
 
-By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
 
 In addition, you can set up [custom trace processors](#custom-tracing-processors) to push traces to other destinations (as a replacement, or secondary destination).
 
@@ -98,37 +98,28 @@ To customize this default setup, to send traces to alternative or additional bac
 2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.
 
-## Tracing with Non-OpenAI LLMs
+## Tracing with Non-OpenAI Models
 
-You can use an OpenAI API key with non-OpenAI LLMs to enable free tracing on the Openai Traces dashboard without disabling tracing. 
+You can use an OpenAI API key with non-OpenAI Models to enable free tracing in the OpenAI Traces dashboard without needing to disable tracing. 
 
 ```python
-from agents import set_tracing_export_api_key, Agent, Runner, AsyncOpenAI, OpenAIChatCompletionsModel
-from dotenv import load_dotenv
-load_dotenv()
-import os
+from agents import set_tracing_export_api_key, Agent, Runner
+from agents.extensions.models.litellm_model import LitellmModel
 
-set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))
-gemini_api_key = os.getenv("GEMINI_API_KEY")
+set_tracing_export_api_key("OPENAI_API_KEY")
 
-external_client = AsyncOpenAI(
-    api_key=gemini_api_key,
-    base_url="https://generativelanguage.googleapis.com/v1beta"
-    )
-
-model = OpenAIChatCompletionsModel(
-    model="gemini-2.0-flash",
-    openai_client=external_client,
-    )
+model = LitellmModel(
+    model="your-model-name",
+    api_key="your-api-key",
+)
 
 agent = Agent(
     name="Assistant",
     model=model,
-    )
+)
 ```
 
 ## Notes
-- Set OPENAI_API_KEY in a .env file.
 - View free traces at Openai Traces dashboard.
 
@@ -152,3 +143,4 @@ To customize this default setup, to send traces to alternative or additional bac
 - [Okahu-Monocle](https://github.com/monocle2ai/monocle)
 - [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
 - [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
+- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)

From 86792c9deaf521f9236380d609a72d18dd43c89e Mon Sep 17 00:00:00 2001
From: Kazuhiro Sera
Date: Fri, 1 Aug 2025 16:19:43 +0900
Subject: [PATCH 3/3] Update docs/tracing.md

---
 docs/tracing.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/docs/tracing.md b/docs/tracing.md
index e9551fa8a..b4f440778 100644
--- a/docs/tracing.md
+++ b/docs/tracing.md
@@ -103,10 +103,12 @@
 You can use an OpenAI API key with non-OpenAI Models to enable free tracing in the OpenAI Traces dashboard without needing to disable tracing. 
 
 ```python
+import os
 from agents import set_tracing_export_api_key, Agent, Runner
 from agents.extensions.models.litellm_model import LitellmModel
 
-set_tracing_export_api_key("OPENAI_API_KEY")
+tracing_api_key = os.environ["OPENAI_API_KEY"]
+set_tracing_export_api_key(tracing_api_key)
 
 model = LitellmModel(
     model="your-model-name",