This example demonstrates automatic telemetry collection in PicoAgents using OpenTelemetry.

When you enable OpenTelemetry, PicoAgents automatically collects:

- **Traces**: Spans for agent operations, LLM calls, and tool executions
- **Metrics**: Token usage histograms and operation durations
- **Semantic Conventions**: Attributes follow the OpenTelemetry Gen-AI standards
## Installation

```bash
# From the repository root
cd picoagents
pip install -e ".[otel]"
```

## Start Jaeger

```bash
# From the repository root
cd examples/otel
docker-compose up -d
```

This starts Jaeger on:

- UI: http://localhost:16686
- OTLP endpoint: http://localhost:4318
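The OTLP/HTTP exporter appends a signal-specific path to the base endpoint (per the OTLP specification, traces are posted to `/v1/traces`). A minimal sketch of how the final URL is derived; the helper name is illustrative, not a PicoAgents API:

```python
# Illustrative helper: derive the OTLP/HTTP traces URL from the base
# endpoint. The /v1/traces path comes from the OTLP specification;
# this function name is hypothetical, not part of PicoAgents.
def otlp_traces_url(endpoint: str) -> str:
    return endpoint.rstrip("/") + "/v1/traces"

print(otlp_traces_url("http://localhost:4318"))
```

This is why only the base endpoint (port 4318) is configured below: exporters add the per-signal paths themselves.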
## Run the Example

```bash
export PICOAGENTS_ENABLE_OTEL=true
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
export OTEL_SERVICE_NAME=picoagents-example
export OPENAI_API_KEY=your-api-key

python agent_with_telemetry.py
```

## View Traces

- Open http://localhost:16686
- Select service: `picoagents-example`
- Click "Find Traces"
- Explore the trace hierarchy
```
agent weather_assistant
├─ chat gpt-4o-mini
│  └─ Attributes: gen_ai.usage.input_tokens, gen_ai.usage.output_tokens
├─ tool get_weather
│  └─ Attributes: gen_ai.tool.name, gen_ai.tool.success
└─ chat gpt-4o-mini
   └─ Final response with usage stats
```
## Metrics

- `gen_ai.client.token.usage`: Token consumption histograms
- `gen_ai.client.operation.duration`: Operation latency
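A histogram metric counts observations per bucket rather than storing each value. A rough stdlib sketch of how token counts could be aggregated (the bucket boundaries here are arbitrary examples; the real instrument is an OpenTelemetry histogram):

```python
from bisect import bisect_left

# Toy histogram: count observations per bucket, as an OpenTelemetry
# histogram instrument would. Boundaries are arbitrary example values;
# upper bounds are inclusive, with a final overflow bucket.
BOUNDARIES = [64, 256, 1024, 4096]
counts = [0] * (len(BOUNDARIES) + 1)

def record(tokens):
    counts[bisect_left(BOUNDARIES, tokens)] += 1

for usage in (120, 30, 900, 5000):
    record(usage)

print(counts)  # -> [1, 1, 1, 0, 1]
```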
## Environment Variables

| Variable | Default | Description |
|---|---|---|
| `PICOAGENTS_ENABLE_OTEL` | `false` | Enable OpenTelemetry |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | `http://localhost:4318` | OTLP endpoint URL |
| `OTEL_SERVICE_NAME` | `picoagents` | Service name for traces |
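The defaults in the table above can be mirrored with plain `os.environ` lookups. A hedged sketch; the function and class names are made up for illustration and are not PicoAgents' actual configuration code:

```python
import os
from dataclasses import dataclass

# Hypothetical config loader mirroring the environment-variable table.
# PicoAgents' real configuration code may differ.
@dataclass
class OtelConfig:
    enabled: bool
    endpoint: str
    service_name: str

def load_otel_config(env=os.environ):
    return OtelConfig(
        enabled=env.get("PICOAGENTS_ENABLE_OTEL", "false").lower() == "true",
        endpoint=env.get("OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4318"),
        service_name=env.get("OTEL_SERVICE_NAME", "picoagents"),
    )

cfg = load_otel_config({"PICOAGENTS_ENABLE_OTEL": "true"})
print(cfg.enabled, cfg.endpoint, cfg.service_name)
```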
## Cleanup

```bash
docker-compose down
```

## Other Backends

PicoAgents works with any OTLP-compatible backend:

```bash
# Datadog
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.datadoghq.com
export DD_API_KEY=your-datadog-api-key

# Honeycomb
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.honeycomb.io
export HONEYCOMB_API_KEY=your-api-key

# New Relic
export OTEL_EXPORTER_OTLP_ENDPOINT=https://otlp.nr-data.net
export NEW_RELIC_LICENSE_KEY=your-license-key
```