Merged
Commits (40 total; changes shown from 29)
4b37996
Initial plan
Copilot Oct 13, 2025
0f94601
Add nagkumar91 to component_owners.yml and rename openai-agents to op…
Copilot Oct 13, 2025
36c61e9
Merge pull request #1 from nagkumar91/copilot/add-nagkumar91-to-compo…
nagkumar91 Oct 13, 2025
3bbc833
Capture agent content via child span aggregation
nagkumar91 Oct 13, 2025
f63cbbf
Capture agent content via child span aggregation
nagkumar91 Oct 13, 2025
48a7747
Add content capture demonstration
nagkumar91 Oct 13, 2025
d6be37d
uv lock update
nagkumar91 Oct 13, 2025
41e3725
Capture agent content via child span aggregation
nagkumar91 Oct 13, 2025
5c369c3
Capture agent content via child span aggregation
nagkumar91 Oct 13, 2025
c57d59b
Add content capture demonstration
nagkumar91 Oct 13, 2025
2c283a6
Align OpenAI agents samples with OTLP exporter
nagkumar91 Oct 13, 2025
dcf1fde
Merge remote-tracking branch 'origin/feature/openai-agents-content-ca…
nagkumar91 Oct 13, 2025
9e20abb
Merge branch 'main' into feature/openai-agents-content-capture
nagkumar91 Oct 13, 2025
5535cd3
Restore GenAI semantic processor instrumentation
nagkumar91 Oct 13, 2025
1daffb3
update changelog
nagkumar91 Oct 13, 2025
d666678
Remove OpenAI Agents event emission
nagkumar91 Oct 13, 2025
f75132f
Align constants with GenAI semantic conventions
nagkumar91 Oct 13, 2025
80f64b7
Add env fallbacks for agent configuration
nagkumar91 Oct 13, 2025
4acdc2b
Provide default agent configuration values
nagkumar91 Oct 13, 2025
a056a00
Fix agent span message aggregation
nagkumar91 Oct 14, 2025
c9aee0e
Merge branch 'main' into feature/openai-agents-content-capture
nagkumar91 Oct 14, 2025
4820302
Merge branch 'main' into feature/openai-agents-content-capture
nagkumar91 Oct 14, 2025
54675a0
Collapse GenAI helpers into span_processor
nagkumar91 Oct 14, 2025
8c4457d
Update span processor unit tests
nagkumar91 Oct 14, 2025
29fb58f
Propagate request model to agent spans
nagkumar91 Oct 14, 2025
d24dd3f
Rename chat spans with resolved models
nagkumar91 Oct 14, 2025
ce897b9
Annotate root spans with operation name
nagkumar91 Oct 14, 2025
3ab9179
Record finish reasons on generation spans
nagkumar91 Oct 14, 2025
2b9dc6c
Depend on util-genai helpers
nagkumar91 Oct 14, 2025
e01bc21
Merge branch 'main' into feature/openai-agents-content-capture
nagkumar91 Oct 14, 2025
f665c90
Normalize span naming constants
nagkumar91 Oct 14, 2025
4b85acd
Return parsed server attributes on error
nagkumar91 Oct 14, 2025
baa72fe
docs: add missing docstrings for span processor
nagkumar91 Oct 14, 2025
1697f3d
refactor: drop legacy span name overrides
nagkumar91 Oct 14, 2025
fae3b26
revert: restore original span processor
nagkumar91 Oct 14, 2025
c3a992e
refactor: remove legacy span overrides
nagkumar91 Oct 14, 2025
29c4ceb
fix: guard token histogram before recording metrics
nagkumar91 Oct 14, 2025
e887859
chore: reuse semantic attribute symbols in metrics
nagkumar91 Oct 14, 2025
cd0acb2
chore: remove server attr fallback
nagkumar91 Oct 14, 2025
bcefa75
Merge branch 'main' into feature/openai-agents-content-capture
nagkumar91 Oct 15, 2025
@@ -12,3 +12,5 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
([#3805](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3805))
- Implement OpenAI Agents span processing aligned with GenAI semantic conventions.
([#3817](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3817))
- Capture input and output message content according to the GenAI semantic conventions.
([#3824](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3824))
@@ -0,0 +1,8 @@
# Copy to .env and add values before running the sample.
# Required for OpenAI client (only used if you swap in a real OpenAI call)
OPENAI_API_KEY=

# Optional overrides for span attributes / exporters
OTEL_SERVICE_NAME=openai-agents-content-capture-demo
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
OTEL_EXPORTER_OTLP_PROTOCOL=grpc
@@ -0,0 +1,42 @@
# OpenAI Agents Content Capture Demo

This example exercises the `OpenAIAgentsInstrumentor` with message content
capture enabled, illustrating how prompts, responses, and tool payloads are
recorded on spans and span events.

> The demo uses the local tracing utilities from the `openai-agents`
> package—no OpenAI API calls are made; only spans are sent to your OTLP collector.

## Prerequisites

1. Activate the repository virtual environment:

   ```bash
   source ../../.venv/bin/activate
   ```

2. Copy `.env.example` to `.env` and provide any overrides you need (for example,
   setting `OTEL_EXPORTER_OTLP_ENDPOINT`).
3. Ensure `openai-agents` is installed in the environment (it is included in
   the shared development venv for this repository); see the snippet below for a
   manual install.
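
For example, from this directory (the install step is only needed if the shared
venv does not already provide `openai-agents`):

```bash
cp .env.example .env          # then edit .env with any overrides you need
pip install openai-agents     # only if it is missing from the active venv
```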

## Run the demo

```bash
python main.py
```

The script will:

- Configure the OpenTelemetry SDK with an OTLP exporter so spans reach your collector.
- Instrument the OpenAI Agents tracing hooks with content capture enabled.
- Simulate an agent invocation that performs a generation and a tool call.
- Print the workflow context (span and trace IDs) to stdout; the spans themselves,
  including JSON-encoded prompts and responses, are exported to your OTLP collector.

## Customisation tips

- Set `OTEL_SERVICE_NAME` before running to override the default service name
  (see the example below).
- Adjust the OTLP exporter configuration (endpoint, protocol) through `.env`.
- Modify the prompts, tool payloads, or add additional spans in `run_workflow`
  to explore different content capture scenarios.
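
For example, to rename the service and point the exporter at a different collector
for a single run (the values here are illustrative, not defaults):

```bash
OTEL_SERVICE_NAME=travel-agent-demo \
OTEL_EXPORTER_OTLP_ENDPOINT=http://otel-collector:4317 \
python main.py
```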
@@ -0,0 +1,122 @@
"""
Content capture demo for the OpenAI Agents instrumentation.

This script spins up the instrumentation with message capture enabled and
simulates an agent invocation plus a tool call using the tracing helpers from
the ``openai-agents`` package. Spans are exported to the console so you can
inspect captured prompts, responses, and tool payloads without making any
OpenAI API calls.
"""

from __future__ import annotations

import json
import os
from typing import Any

from agents.tracing import agent_span, function_span, generation_span, trace
from dotenv import load_dotenv

from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
OTLPSpanExporter,
)
from opentelemetry.instrumentation.openai_agents import (
OpenAIAgentsInstrumentor,
)
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

load_dotenv() # take environment variables from .env.


def configure_tracing() -> None:
"""Configure a tracer provider that exports spans via OTLP."""
resource = Resource.create(
{
"service.name": os.environ.get(
"OTEL_SERVICE_NAME", "openai-agents-content-capture-demo"
)
}
)
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))

# Instrument with explicit content capture mode to ensure prompts/responses are recorded.
OpenAIAgentsInstrumentor().instrument(
tracer_provider=provider,
capture_message_content="span_and_event",
system="openai",
agent_name="Travel Concierge",
base_url="https://api.openai.com/v1",
)


def dump(title: str, payload: Any) -> None:
"""Pretty-print helper used to show intermediate context."""
print(f"\n=== {title} ===")
print(json.dumps(payload, indent=2))


def run_workflow() -> None:
"""Simulate an agent workflow with a generation and a tool invocation."""
itinerary_prompt = [
{"role": "system", "content": "Plan high level travel itineraries."},
{
"role": "user",
"content": "I'm visiting Paris for 3 days in November.",
},
]

tool_args = {"city": "Paris", "date": "2025-11-12"}
tool_result = {
"forecast": "Mostly sunny, highs 15°C",
"packing_tips": ["light jacket", "comfortable shoes"],
}

with trace("travel-booking-workflow"):
with agent_span(name="travel_planner") as agent:
dump(
"Agent span started",
{"span_id": agent.span_id, "trace_id": agent.trace_id},
)

with generation_span(
input=itinerary_prompt,
output=[
{
"role": "assistant",
"content": (
"Day 1 visit the Louvre, Day 2 tour Versailles, "
"Day 3 explore Montmartre."
),
}
],
model="gpt-4o-mini",
usage={
"input_tokens": 128,
"output_tokens": 96,
"total_tokens": 224,
},
):
pass

with function_span(
name="fetch_weather",
input=json.dumps(tool_args),
output=tool_result,
):
pass

print(
"\nWorkflow complete – spans exported to the configured OTLP endpoint."
)


def main() -> None:
configure_tracing()
run_workflow()


if __name__ == "__main__":
main()
@@ -1,11 +1,5 @@
-# Update this with your real OpenAI API key
-OPENAI_API_KEY=sk-YOUR_API_KEY
-
-# Uncomment and adjust if you use a non-default OTLP collector endpoint
-# OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
-# OTEL_EXPORTER_OTLP_PROTOCOL=grpc
-
-OTEL_SERVICE_NAME=opentelemetry-python-openai-agents-manual
-
-# Optionally override the agent name reported on spans
-# OTEL_GENAI_AGENT_NAME=Travel Concierge
+# Copy to .env and add real values before running main.py
+OPENAI_API_KEY=
+OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
+OTEL_EXPORTER_OTLP_PROTOCOL=grpc
+OTEL_SERVICE_NAME=openai-agents-manual-demo
@@ -22,7 +22,7 @@ Setup
python3 -m venv .venv
source .venv/bin/activate
pip install "python-dotenv[cli]"
-pip install -r requirements.txt
+uv pip install -r requirements.txt --prerelease=allow

Run
---
@@ -34,6 +34,8 @@ are applied:

dotenv run -- python main.py

+Ensure ``OPENAI_API_KEY`` is present in your environment (or ``.env`` file); the OpenAI client raises ``OpenAIError`` if the key is missing.
+
The script automatically loads environment variables from ``.env`` so running
``python main.py`` directly also works if the shell already has the required
values exported.
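
For example, to provide the key inline for a single run (the value shown is a placeholder)::

    OPENAI_API_KEY=sk-your-key dotenv run -- python main.py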
@@ -22,7 +22,7 @@ Setup
python3 -m venv .venv
source .venv/bin/activate
pip install "python-dotenv[cli]"
-pip install -r requirements.txt
+uv pip install -r requirements.txt --prerelease=allow

Run
---
@@ -34,6 +34,8 @@ instrumentation is activated automatically:

dotenv run -- opentelemetry-instrument python main.py

+Ensure ``OPENAI_API_KEY`` is set in your shell or ``.env``; the OpenAI client raises ``OpenAIError`` if the key is missing.
+
Because ``main.py`` invokes ``load_dotenv``, running ``python main.py`` directly
also works when the required environment variables are already exported.
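
For example, supplying a placeholder key inline for one auto-instrumented run::

    OPENAI_API_KEY=sk-your-key dotenv run -- opentelemetry-instrument python main.py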

@@ -26,7 +26,8 @@ classifiers = [
dependencies = [
"opentelemetry-api >= 1.37",
"opentelemetry-instrumentation >= 0.58b0",
"opentelemetry-semantic-conventions >= 0.58b0"
"opentelemetry-semantic-conventions >= 0.58b0",
"opentelemetry-util-genai"
]

[project.optional-dependencies]
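
The new ``opentelemetry-util-genai`` dependency is pulled in automatically when the instrumentation package is installed; to add it to an existing environment by hand, something like the following should work (a sketch reusing the ``uv`` workflow from the sample READMEs; the package may only be available as a pre-release):

    uv pip install opentelemetry-util-genai --prerelease=allow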