Commit d669d41

docs: update openai agents (#2121)
1 parent a1e66bb commit d669d41

File tree: 6 files changed (+61 −72 lines)


cookbook/integration_openai-agents.ipynb

Lines changed: 34 additions & 37 deletions
@@ -21,7 +21,7 @@
  "source": [
   "## 1. Install Dependencies\n",
   "\n",
-  "Below we install the `openai-agents` library (the OpenAI Agents SDK), and the `pydantic-ai[logfire]` OpenTelemetry instrumentation."
+  "Below we install the `openai-agents` library (the OpenAI Agents SDK), and the [OpenInference OpenAI Agents instrumentation](https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-openai-agents) library."
  ]
 },
 {
@@ -30,7 +30,7 @@
  "metadata": {},
  "outputs": [],
  "source": [
-  "%pip install openai-agents langfuse nest_asyncio \"pydantic-ai[logfire]\""
+  "%pip install openai-agents langfuse nest_asyncio openinference-instrumentation-openai-agents"
  ]
 },
 {
@@ -64,9 +64,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
-  "## 3. Instrumenting the Agent\n",
-  "\n",
-  "Pydantic Logfire offers an instrumentation for the OpenAi Agent SDK. We use this to send traces to Langfuse."
+  "## 3. Instrumenting the Agent"
  ]
 },
 {
@@ -79,21 +77,22 @@
  "nest_asyncio.apply()"
  ]
 },
+{
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+  "Now, we initialize the [OpenInference OpenAI Agents instrumentation](https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-openai-agents). This third-party instrumentation automatically captures OpenAI Agents operations and exports OpenTelemetry (OTel) spans to Langfuse."
+ ]
+},
 {
  "cell_type": "code",
-  "execution_count": 4,
+  "execution_count": null,
  "metadata": {},
  "outputs": [],
  "source": [
-  "import logfire\n",
+  "from openinference.instrumentation.openai_agents import OpenAIAgentsInstrumentor\n",
   "\n",
-  "# Configure logfire instrumentation.\n",
-  "logfire.configure(\n",
-  "    service_name='my_agent_service',\n",
-  "    send_to_logfire=False,\n",
-  ")\n",
-  "# This method automatically patches the OpenAI Agents SDK to send logs via OTLP to Langfuse.\n",
-  "logfire.instrument_openai_agents()"
+  "OpenAIAgentsInstrumentor().instrument()"
  ]
 },
 {
@@ -157,7 +156,7 @@
  "source": [
   "![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace.png)\n",
   "\n",
-  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c7330da67c08219bd1c75b7a6d?timestamp=2025-03-14T08%3A31%3A00.365Z&observation=81e525d819153eed)\n",
+  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/483a2afb22b41ee49d7d1e5b41b4967a?timestamp=2025-09-30T09%3A01%3A29.194Z&observation=e3e3580728f3fec4)\n",
   "\n",
   "Clicking the link above (or your own project link) lets you view all sub-spans, token usage, latencies, etc., for debugging or optimization."
  ]
@@ -211,7 +210,7 @@
  "source": [
   "![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace-handoff.png)\n",
   "\n",
-  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c74429a6d0489e9259703a1148?timestamp=2025-03-14T08%3A31%3A04.745Z&observation=e83609282c443b0d)"
+  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/c376e44920527b875add9d97b4ed9312?observation=94a1dc00d7067ae8&timestamp=2025-09-30T09%3A03%3A51.191Z)"
  ]
 },
 {
@@ -257,7 +256,7 @@
  "source": [
   "![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace-function.png)\n",
   "\n",
-  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c74a162f93387d9261b01f9ca9?timestamp=2025-03-14T08%3A31%3A06.262Z&observation=0e2988966786cdf4)\n",
+  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/fee618f96dc31e0ca38b2f7b26eb8b29?timestamp=2025-09-30T09%3A03%3A59.962Z&observation=5b99c3e3411ed17b)\n",
   "\n",
   "When viewing the trace, you’ll see a span capturing the function call `get_weather` and the arguments passed."
  ]
@@ -299,7 +298,7 @@
  "source": [
   "![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace-grouped.png)\n",
   "\n",
-  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c7523686ff7667b85673d033bf?timestamp=2025-03-14T08%3A31%3A08.342Z&observation=d69e377f62b1d331)\n",
+  "**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/afa1ae379b6c3a5333e2bab357f31cdc?timestamp=2025-09-30T09%3A04%3A06.678Z&observation=496efdc060a5399b)\n",
   "\n",
   "Each child call is represented as a sub-span under the top-level **Joke workflow** span, making it easy to see the entire conversation or sequence of calls."
  ]
@@ -325,22 +324,28 @@
  "source": [
   "from contextvars import ContextVar\n",
   "from typing import Optional\n",
-  "from opentelemetry import context as context_api\n",
+  "from opentelemetry import context as context_api, trace\n",
+  "from opentelemetry.sdk.trace import TracerProvider\n",
   "from opentelemetry.sdk.trace.export import Span, SpanProcessor\n",
   "\n",
   "prompt_info_var = ContextVar(\"prompt_info\", default=None)\n",
   "\n",
+  "# Make sure to set the name of the generation spans in your trace\n",
   "class LangfuseProcessor(SpanProcessor):\n",
-  "    def on_start(\n",
-  "        self,\n",
-  "        span: 'Span',\n",
-  "        parent_context: Optional[context_api.Context] = None,\n",
-  "    ) -> None:\n",
-  "        if span.name.startswith('Responses API'): # The name of the generation spans in your OpenAI Agent trace \n",
+  "    def on_start(self, span: \"Span\", parent_context: Optional[context_api.Context] = None) -> None:\n",
+  "        if span.name.startswith(\"response\"): # The name of the generation spans in your trace \n",
   "            prompt_info = prompt_info_var.get()\n",
   "            if prompt_info:\n",
-  "                span.set_attribute('langfuse.prompt.name', prompt_info.get(\"name\"))\n",
-  "                span.set_attribute('langfuse.prompt.version', prompt_info.get(\"version\"))"
+  "                span.set_attribute(\"langfuse.prompt.name\", prompt_info.get(\"name\"))\n",
+  "                span.set_attribute(\"langfuse.prompt.version\", prompt_info.get(\"version\"))\n",
+  "\n",
+  "\n",
+  "# 1) Register your processor BEFORE instantiating Langfuse\n",
+  "trace.get_tracer_provider().add_span_processor(LangfuseProcessor())\n",
+  "\n",
+  "# 2) Now bring up Langfuse (it will attach its own span processor/exporter to the same provider)\n",
+  "from langfuse import Langfuse, get_client\n",
+  "langfuse = get_client()"
  ]
 },
 {
@@ -349,17 +354,9 @@
  "metadata": {},
  "outputs": [],
  "source": [
-  "import logfire\n",
-  "from langfuse import get_client\n",
-  "\n",
-  "logfire.configure(\n",
-  "    service_name='my_agent_service',\n",
-  "    additional_span_processors=[LangfuseProcessor()], # Passing the LangfuseProcessor to the logfire configuration will automatically link the prompt to the trace\n",
-  "    send_to_logfire=False,\n",
-  ")\n",
+  "from openinference.instrumentation.openai_agents import OpenAIAgentsInstrumentor\n",
   "\n",
-  "logfire.instrument_openai_agents()\n",
-  "langfuse = get_client()"
+  "OpenAIAgentsInstrumentor().instrument()"
  ]
 },
 {
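The prompt-linking pattern in the notebook diff above has two moving parts: a `ContextVar` that carries prompt metadata across the async agent run, and a span processor whose `on_start` copies that metadata onto every generation span whose name starts with `"response"`. The mechanics can be sketched with the standard library alone; the `FakeSpan` class below is a hypothetical stand-in for an OTel SDK span, not a real API:

```python
from contextvars import ContextVar
from typing import Optional

# Carries {"name": ..., "version": ...} across async boundaries, as in the diff.
prompt_info_var: ContextVar[Optional[dict]] = ContextVar("prompt_info", default=None)

class FakeSpan:
    """Minimal stand-in for an OTel span: a name plus an attribute dict."""
    def __init__(self, name: str):
        self.name = name
        self.attributes = {}

    def set_attribute(self, key: str, value) -> None:
        self.attributes[key] = value

def on_start(span: FakeSpan) -> None:
    """Mirrors LangfuseProcessor.on_start: tag matching spans with prompt info."""
    if span.name.startswith("response"):  # generation spans only
        prompt_info = prompt_info_var.get()
        if prompt_info:
            span.set_attribute("langfuse.prompt.name", prompt_info.get("name"))
            span.set_attribute("langfuse.prompt.version", prompt_info.get("version"))

# Set the prompt context before the run, as the cookbook does.
token = prompt_info_var.set({"name": "joke-prompt", "version": 3})
gen_span = FakeSpan("response")     # matched: name starts with "response"
other_span = FakeSpan("agent_run")  # not matched: attributes stay empty
on_start(gen_span)
on_start(other_span)
prompt_info_var.reset(token)

print(gen_span.attributes)    # {'langfuse.prompt.name': 'joke-prompt', 'langfuse.prompt.version': 3}
print(other_span.attributes)  # {}
```

Because the real processor only filters on the span-name prefix, renaming the generation spans (as this commit does, from `Responses API` to `response`) silently changes which spans get the prompt attributes, which is why the diff touches both the prefix string and the comment.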

pages/integrations/frameworks/openai-agents.mdx

Lines changed: 27 additions & 35 deletions
@@ -17,11 +17,11 @@ This notebook demonstrates how to **integrate Langfuse** into your **OpenAI Agen
 
 ## 1. Install Dependencies
 
-Below we install the `openai-agents` library (the OpenAI Agents SDK), and the `pydantic-ai[logfire]` OpenTelemetry instrumentation.
+Below we install the `openai-agents` library (the OpenAI Agents SDK), and the [OpenInference OpenAI Agents instrumentation](https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-openai-agents) library.
 
 
 ```python
-%pip install openai-agents langfuse nest_asyncio "pydantic-ai[logfire]"
+%pip install openai-agents langfuse nest_asyncio openinference-instrumentation-openai-agents
 ```
 
 ## 2. Configure Environment & Langfuse Credentials
@@ -44,25 +44,19 @@ os.environ["OPENAI_API_KEY"] = "sk-proj-..."
 
 ## 3. Instrumenting the Agent
 
-Pydantic Logfire offers an instrumentation for the OpenAi Agent SDK. We use this to send traces to Langfuse.
-
 
 ```python
 import nest_asyncio
 nest_asyncio.apply()
 ```
 
+Now, we initialize the [OpenInference OpenAI Agents instrumentation](https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-openai-agents). This third-party instrumentation automatically captures OpenAI Agents operations and exports OpenTelemetry (OTel) spans to Langfuse.
+
 
 ```python
-import logfire
+from openinference.instrumentation.openai_agents import OpenAIAgentsInstrumentor
 
-# Configure logfire instrumentation.
-logfire.configure(
-    service_name='my_agent_service',
-    send_to_logfire=False,
-)
-# This method automatically patches the OpenAI Agents SDK to send logs via OTLP to Langfuse.
-logfire.instrument_openai_agents()
+OpenAIAgentsInstrumentor().instrument()
 ```
 
 Now initialize the Langfuse client. `get_client()` initializes the Langfuse client using the credentials provided in the environment variables.
@@ -105,7 +99,7 @@ await loop.create_task(main())
 
 ![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace.png)
 
-**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c7330da67c08219bd1c75b7a6d?timestamp=2025-03-14T08%3A31%3A00.365Z&observation=81e525d819153eed)
+**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/483a2afb22b41ee49d7d1e5b41b4967a?timestamp=2025-09-30T09%3A01%3A29.194Z&observation=e3e3580728f3fec4)
 
 Clicking the link above (or your own project link) lets you view all sub-spans, token usage, latencies, etc., for debugging or optimization.
 
@@ -145,7 +139,7 @@ print(result.final_output)
 
 ![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace-handoff.png)
 
-**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c74429a6d0489e9259703a1148?timestamp=2025-03-14T08%3A31%3A04.745Z&observation=e83609282c443b0d)
+**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/c376e44920527b875add9d97b4ed9312?observation=94a1dc00d7067ae8&timestamp=2025-09-30T09%3A03%3A51.191Z)
 
 ## 6. Functions Example
 
@@ -178,7 +172,7 @@ await loop.create_task(main())
 
 ![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace-function.png)
 
-**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c74a162f93387d9261b01f9ca9?timestamp=2025-03-14T08%3A31%3A06.262Z&observation=0e2988966786cdf4)
+**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/fee618f96dc31e0ca38b2f7b26eb8b29?timestamp=2025-09-30T09%3A03%3A59.962Z&observation=5b99c3e3411ed17b)
 
 When viewing the trace, you’ll see a span capturing the function call `get_weather` and the arguments passed.
 
@@ -207,7 +201,7 @@ await loop.create_task(main())
 
 ![Example trace in Langfuse](https://langfuse.com/images/cookbook/integration_openai-agents/openai-agent-example-trace-grouped.png)
 
-**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/019593c7523686ff7667b85673d033bf?timestamp=2025-03-14T08%3A31%3A08.342Z&observation=d69e377f62b1d331)
+**Example**: [Langfuse Trace](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/afa1ae379b6c3a5333e2bab357f31cdc?timestamp=2025-09-30T09%3A04%3A06.678Z&observation=496efdc060a5399b)
 
 Each child call is represented as a sub-span under the top-level **Joke workflow** span, making it easy to see the entire conversation or sequence of calls.
 
@@ -223,37 +217,35 @@ If you manage your prompt with [Langfuse Prompt Management](https://langfuse.com
 ```python
 from contextvars import ContextVar
 from typing import Optional
-from opentelemetry import context as context_api
+from opentelemetry import context as context_api, trace
+from opentelemetry.sdk.trace import TracerProvider
 from opentelemetry.sdk.trace.export import Span, SpanProcessor
 
 prompt_info_var = ContextVar("prompt_info", default=None)
 
+# Make sure to set the name of the generation spans in your trace
 class LangfuseProcessor(SpanProcessor):
-    def on_start(
-        self,
-        span: 'Span',
-        parent_context: Optional[context_api.Context] = None,
-    ) -> None:
-        if span.name.startswith('Responses API'): # The name of the generation spans in your OpenAI Agent trace
+    def on_start(self, span: "Span", parent_context: Optional[context_api.Context] = None) -> None:
+        if span.name.startswith("response"): # The name of the generation spans in your trace
             prompt_info = prompt_info_var.get()
             if prompt_info:
-                span.set_attribute('langfuse.prompt.name', prompt_info.get("name"))
-                span.set_attribute('langfuse.prompt.version', prompt_info.get("version"))
+                span.set_attribute("langfuse.prompt.name", prompt_info.get("name"))
+                span.set_attribute("langfuse.prompt.version", prompt_info.get("version"))
+
+
+# 1) Register your processor BEFORE instantiating Langfuse
+trace.get_tracer_provider().add_span_processor(LangfuseProcessor())
+
+# 2) Now bring up Langfuse (it will attach its own span processor/exporter to the same provider)
+from langfuse import Langfuse, get_client
+langfuse = get_client()
 ```
 
 
 ```python
-import logfire
-from langfuse import get_client
+from openinference.instrumentation.openai_agents import OpenAIAgentsInstrumentor
 
-logfire.configure(
-    service_name='my_agent_service',
-    additional_span_processors=[LangfuseProcessor()], # Passing the LangfuseProcessor to the logfire configuration will automatically link the prompt to the trace
-    send_to_logfire=False,
-)
-
-logfire.instrument_openai_agents()
-langfuse = get_client()
+OpenAIAgentsInstrumentor().instrument()
 ```
 
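With the processor from the mdx diff above, the prompt metadata must be set before the agent run and cleared afterwards, so that concurrent runs do not leak each other's prompt labels onto the wrong spans. One way to package that set-then-reset discipline is a small context manager; this is a hedged stdlib-only sketch, and the `linked_prompt` helper is illustrative, not part of the Langfuse or OpenInference APIs:

```python
from contextlib import contextmanager
from contextvars import ContextVar
from typing import Optional

# Same variable the span processor reads in its on_start hook.
prompt_info_var: ContextVar[Optional[dict]] = ContextVar("prompt_info", default=None)

@contextmanager
def linked_prompt(name: str, version: int):
    """Scope prompt metadata to a single agent run; always resets on exit."""
    token = prompt_info_var.set({"name": name, "version": version})
    try:
        yield
    finally:
        prompt_info_var.reset(token)

# Usage: anything reading prompt_info_var inside the block sees the prompt;
# outside the block the variable is back to its previous value.
with linked_prompt("weather-prompt", 2):
    inside = prompt_info_var.get()
outside = prompt_info_var.get()

print(inside)   # {'name': 'weather-prompt', 'version': 2}
print(outside)  # None
```

Because `ContextVar` values are per-task, this pattern stays correct even when several agent runs execute concurrently on the same event loop, which plain module-level globals would not.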

4 binary files changed: 372 KB, 424 KB, 468 KB, 437 KB
0 commit comments
