Merged
12 changes: 11 additions & 1 deletion examples/tracing/agent/README.md
@@ -10,6 +10,16 @@ The tracing implementation will log spans to the console for all agent methods.

### Exporting to Collector

If desired, [install Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/) and then update the `mcp_agent.config.yaml` for this example to have `otel.otlp_settings.endpoint` point to the collector endpoint (e.g. `http://localhost:4318/v1/traces` is the default for Jaeger via HTTP).
If desired, [install Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/) and then update the `mcp_agent.config.yaml` to include a typed OTLP exporter with the collector endpoint (e.g. `http://localhost:4318/v1/traces`):

```yaml
otel:
  enabled: true
  exporters:
    - type: console
    - type: file
    - type: otlp
      endpoint: "http://localhost:4318/v1/traces"
```

<img width="2160" alt="Image" src="https://github.com/user-attachments/assets/93ffc4e5-f255-43a9-be3a-755994fec809" />
11 changes: 7 additions & 4 deletions examples/tracing/agent/mcp_agent.config.yaml
@@ -26,8 +26,11 @@ openai:

otel:
  enabled: true
  exporters: ["console", "file"]
  # If running jaeger locally, uncomment the following lines and add "otlp" to the exporters list
  # otlp_settings:
  #   endpoint: "http://localhost:4318/v1/traces"
  exporters: [
      { type: console },
      { type: file },
      # To export to a collector, also include:
      # { type: otlp, endpoint: "http://localhost:4318/v1/traces" },
    ]
  service_name: "BasicTracingAgentExample"
16 changes: 11 additions & 5 deletions examples/tracing/langfuse/README.md
@@ -1,7 +1,6 @@
# Langfuse Trace Exporter Example

This example shows how to configure a Langfuse OTLP trace exporter for use in `mcp-agent` by configuring the
`otel.otlp_settings` with the expected endpoint and headers.
This example shows how to configure a Langfuse OTLP trace exporter for use in `mcp-agent` by adding a typed OTLP exporter with the expected endpoint and headers.
It follows the guidance at https://langfuse.com/integrations/native/opentelemetry.
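Concretely, this README builds up a single typed OTLP exporter whose endpoint lives in `mcp_agent.config.yaml` and whose `Authorization` header lives in `mcp_agent.secrets.yaml`; merged together, the effective settings look roughly like this sketch (the `AUTH_STRING` placeholder is the base64 value generated below):

```yaml
otel:
  enabled: true
  exporters:
    - type: otlp
      endpoint: "https://us.cloud.langfuse.com/api/public/otel/v1/traces"
      headers:
        Authorization: "Basic AUTH_STRING" # base64 of pk-...:sk-..., generated below
```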

## `1` App set up
@@ -47,7 +46,7 @@ Obtain a secret and public API key for your desired Langfuse project and then ge
echo -n "pk-your-public-key:sk-your-secret-key" | base64
```

In `mcp_agent.secrets.yaml` set the Authorization header for OTLP:
In `mcp_agent.secrets.yaml` set the Authorization header for OTLP (merged automatically with the typed exporter):

```yaml
otel:
  otlp_settings:
    headers:
      Authorization: "Basic AUTH_STRING"
```

Lastly, ensure the proper trace endpoint is configured for the `otel.otlp_settings.endpoint` in `mcp_agent.yaml` for the relevant
Langfuse data region.
Lastly, ensure the proper trace endpoint is configured in the typed exporter in `mcp_agent.config.yaml` for your Langfuse region, e.g.:

```yaml
otel:
  enabled: true
  exporters:
    - type: otlp
      endpoint: "https://us.cloud.langfuse.com/api/public/otel/v1/traces"
```
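For example, a project in the EU data region would use the EU cloud host instead (this assumes the standard Langfuse cloud hosts; confirm the exact endpoint in the Langfuse OpenTelemetry docs):

```yaml
otel:
  enabled: true
  exporters:
    - type: otlp
      # EU data region host (assumed; verify against the Langfuse docs)
      endpoint: "https://cloud.langfuse.com/api/public/otel/v1/traces"
```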

## `4` Run locally

6 changes: 2 additions & 4 deletions examples/tracing/langfuse/mcp_agent.config.yaml
@@ -26,8 +26,6 @@ openai:

otel:
  enabled: true
  exporters: ["otlp"]
  otlp_settings:
    endpoint: "https://us.cloud.langfuse.com/api/public/otel/v1/traces"
    # Set Authorization header with API key in mcp_agent.secrets.yaml
  exporters: [{ type: otlp, endpoint: "https://us.cloud.langfuse.com/api/public/otel/v1/traces" }]
  # Set Authorization header with API key in mcp_agent.secrets.yaml
  service_name: "BasicTracingLangfuseExample"
1 change: 1 addition & 0 deletions examples/tracing/langfuse/mcp_agent.secrets.yaml.example
@@ -7,6 +7,7 @@ anthropic:
  api_key: anthropic_api_key

otel:
  # Headers are merged with typed OTLP exporter settings
**Member:**
How would this look for the new format? Would we need to duplicate the `type` and `endpoint` here to map to the correct one (in case of multiple)?

**Collaborator (author):**

Yes, headers would need to be specified for each endpoint, which I think is correct.

```python
class OTLPExporterSettings(OpenTelemetryExporterBase):
    type: Literal["otlp"] = "otlp"
    endpoint: str | None = None
    headers: Dict[str, str] | None = None
```
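For instance, exporting to both Langfuse and a local collector would mean one `otlp` entry per destination, each carrying its own endpoint and, where needed, its own headers; a hypothetical sketch under the model above (the auth placeholder is illustrative):

```yaml
otel:
  exporters:
    - type: otlp
      endpoint: "https://us.cloud.langfuse.com/api/public/otel/v1/traces"
      headers:
        Authorization: "Basic LANGFUSE_AUTH_STRING" # hypothetical placeholder
    - type: otlp
      endpoint: "http://localhost:4318/v1/traces" # e.g. a local Jaeger, no auth headers
```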

  otlp_settings:
    headers:
      Authorization: "Basic <Langfuse AUTH_STRING>"
25 changes: 24 additions & 1 deletion examples/tracing/llm/README.md
@@ -10,6 +10,29 @@ The tracing implementation will log spans to the console for all AugmentedLLM me

### Exporting to Collector

If desired, [install Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/) and then update the `mcp_agent.config.yaml` for this example to have `otel.otlp_settings.endpoint` point to the collector endpoint (e.g. `http://localhost:4318/v1/traces` is the default for Jaeger via HTTP).
If desired, [install Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/):

```bash
docker run \
  --rm --name jaeger \
  -p 16686:16686 \
  -p 4317:4317 \
  -p 4318:4318 \
  -p 5778:5778 \
  -p 9411:9411 \
  jaegertracing/jaeger:2.5.0
```

Then update the `mcp_agent.config.yaml` to include a typed OTLP exporter with the collector endpoint (e.g. `http://localhost:4318/v1/traces`):

```yaml
otel:
  enabled: true
  exporters:
    - type: console
    - type: file
    - type: otlp
      endpoint: "http://localhost:4318/v1/traces"
```

<img width="2160" alt="Image" src="https://github.com/user-attachments/assets/f2d1cedf-6729-4ce1-9530-ec9d5653103d" />
50 changes: 42 additions & 8 deletions examples/tracing/llm/main.py
@@ -1,5 +1,6 @@
import asyncio
import time
from typing import Dict

from pydantic import BaseModel

@@ -16,13 +17,25 @@
app = MCPApp(name="llm_tracing_example")


class CountryInfo(BaseModel):
    """Model representing structured data for country information."""
class CountryRecord(BaseModel):
    """Single country's structured data."""

    capital: str
    population: int


class CountryInfo(BaseModel):
    """Structured response containing multiple countries."""

    countries: Dict[str, CountryRecord]

    def summary(self) -> str:
        return ", ".join(
            f"{country}: {info.capital} (pop {info.population:,})"
            for country, info in self.countries.items()
        )


async def llm_tracing():
    async with app.run() as agent_app:
        logger = agent_app.logger
@@ -51,11 +64,18 @@ async def _trace_openai():
            result_structured = await openai_llm.generate_structured(
                MessageParam(
                    role="user",
                    content="Give JSON representing the the capitals and populations of the following countries: France, Ireland, Italy",
                    content=(
                        "Return JSON under a top-level `countries` object. "
                        "Within `countries`, each key should be the country name (France, Ireland, Italy) "
                        "with values containing `capital` and `population`."
                    ),
                ),
                response_model=CountryInfo,
            )
            logger.info(f"openai_llm structured result: {result_structured}")
            logger.info(
                "openai_llm structured result",
                data=result_structured.model_dump(mode="json"),
            )

        async def _trace_anthropic():
            # Agent-integrated LLM (Anthropic)
@@ -73,11 +93,18 @@ async def _trace_anthropic():
            result_structured = await llm.generate_structured(
                MessageParam(
                    role="user",
                    content="Give JSON representing the the capitals and populations of the following countries: France, Germany, Belgium",
                    content=(
                        "Return JSON under a top-level `countries` object. "
                        "Within `countries`, each key should be the country name (France, Germany, Belgium) "
                        "with values containing `capital` and `population`."
                    ),
                ),
                response_model=CountryInfo,
            )
            logger.info(f"llm_agent structured result: {result_structured}")
            logger.info(
                "llm_agent structured result",
                data=result_structured.model_dump(mode="json"),
            )

        async def _trace_azure():
            # Azure
@@ -93,11 +120,18 @@ async def _trace_azure():
            result_structured = await azure_llm.generate_structured(
                MessageParam(
                    role="user",
                    content="Give JSON representing the the capitals and populations of the following countries: Spain, Portugal, Italy",
                    content=(
                        "Return JSON under a top-level `countries` object. "
                        "Within `countries`, each key should be the country name (Spain, Portugal, Italy) "
                        "with values containing `capital` and `population`."
                    ),
                ),
                response_model=CountryInfo,
            )
            logger.info(f"azure_llm structured result: {result_structured}")
            logger.info(
                "azure_llm structured result",
                data=result_structured.model_dump(mode="json"),
            )

        await asyncio.gather(
            _trace_openai(),
11 changes: 7 additions & 4 deletions examples/tracing/llm/mcp_agent.config.yaml
@@ -26,8 +26,11 @@ openai:

otel:
  enabled: true
  exporters: ["console", "file"]
  # If running jaeger locally, uncomment the following lines and add "otlp" to the exporters list
  # otlp_settings:
  #   endpoint: "http://localhost:4318/v1/traces"
  exporters: [
      { type: console },
      { type: file },
      # To export to a collector, also include:
      # { type: otlp, endpoint: "http://localhost:4318/v1/traces" },
    ]

  service_name: "BasicTracingLLMExample"
10 changes: 9 additions & 1 deletion examples/tracing/mcp/README.md
@@ -48,7 +48,15 @@ Then open `mcp_agent.secrets.yaml` and add your api key for your preferred LLM f

## `3` Configure Jaeger Collector

[Run Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/) and then update the `mcp_agent.config.yaml` for this example to have `otel.otlp_settings.endpoint` point to the collector endpoint (e.g. `http://localhost:4318/v1/traces` is the default for Jaeger via HTTP).
[Run Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/) and then update the `mcp_agent.config.yaml` to include a typed OTLP exporter with the collector endpoint (e.g. `http://localhost:4318/v1/traces`):

```yaml
otel:
  enabled: true
  exporters:
    - type: otlp
      endpoint: "http://localhost:4318/v1/traces"
```

## `4` Run locally

5 changes: 1 addition & 4 deletions examples/tracing/mcp/mcp_agent.config.yaml
@@ -17,8 +17,5 @@ openai:

otel:
  enabled: true
  exporters: ["otlp"]
  # If running jaeger locally, uncomment the following lines and add "otlp" to the exporters list
  otlp_settings:
    endpoint: "http://localhost:4318/v1/traces"
  exporters: [{ type: otlp, endpoint: "http://localhost:4318/v1/traces" }]
  service_name: "MCPAgentSSEExample"
10 changes: 9 additions & 1 deletion examples/tracing/temporal/README.md
@@ -47,7 +47,15 @@ To run any of these examples, you'll need to:

3. Configure Jaeger Collector

[Run Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/) and then ensure the `mcp_agent.config.yaml` for this example has `otel.otlp_settings.endpoint` point to the collector endpoint (e.g. `http://localhost:4318/v1/traces` is the default for Jaeger via HTTP).
[Run Jaeger locally](https://www.jaegertracing.io/docs/2.5/getting-started/) and then ensure the `mcp_agent.config.yaml` for this example includes a typed OTLP exporter with the collector endpoint:

```yaml
otel:
  enabled: true
  exporters:
    - type: otlp
      endpoint: "http://localhost:4318/v1/traces"
```

4. In a separate terminal, start the worker:

8 changes: 5 additions & 3 deletions examples/tracing/temporal/mcp_agent.config.yaml
@@ -45,7 +45,9 @@ openai:

otel:
  enabled: true
  exporters: ["file", "otlp"]
  otlp_settings:
    endpoint: "http://localhost:4318/v1/traces"
  exporters:
    [
      { type: file },
      { type: otlp, endpoint: "http://localhost:4318/v1/traces" },
    ]
  service_name: "TemporalTracingExample"
@@ -24,8 +24,17 @@ openai:

otel:
  enabled: true
  exporters: ["file"]
  # If running jaeger locally, uncomment the following lines and add "otlp" to the exporters list
  # otlp_settings:
  #   endpoint: "http://localhost:4318/v1/traces"
  exporters: [
      {
        type: file,
        path_settings:
          {
            path_pattern: "traces/mcp-agent-trace-{unique_id}.jsonl",
            unique_id: "timestamp",
            timestamp_format: "%Y%m%d_%H%M%S",
          },
      },
      # To export to a collector, also include:
      # { type: otlp, endpoint: "http://localhost:4318/v1/traces" },
    ]
  service_name: "AdaptiveWorkflowExample"
@@ -11,14 +11,14 @@ execution_engine: asyncio

# Logging configuration
logger:
  type: console # Log output type (console, file, or http)
  level: debug # Logging level (debug, info, warning, error)
  batch_size: 100 # Number of logs to batch before sending
  flush_interval: 2 # Interval in seconds to flush logs
  max_queue_size: 2048 # Maximum queue size for buffered logs
  http_endpoint: # Optional: HTTP endpoint for remote logging
  http_headers: # Optional: Headers for HTTP logging
  http_timeout: 5 # Timeout for HTTP logging requests

# MCP (Model Context Protocol) server configuration
mcp:
@@ -36,13 +36,14 @@ mcp:
# OpenAI configuration
openai:
  # API keys are stored in mcp_agent.secrets.yaml (gitignored for security)
  default_model: gpt-5 # Default model for OpenAI API calls

# OpenTelemetry (OTEL) configuration for distributed tracing
otel:
  enabled: false # Set to true to enable tracing
  exporters: ["console"] # Trace exporters (console, otlp)
  # Uncomment below to export traces to Jaeger running locally
  # otlp_settings:
  #   endpoint: "http://localhost:4318/v1/traces"
  service_name: "WorkflowEvaluatorOptimizerExample" # Service name in traces
  enabled: false
  exporters: [
      { type: console },
      # To export to a collector, also include:
      # { type: otlp, endpoint: "http://localhost:4318/v1/traces" }
    ]
  service_name: "WorkflowEvaluatorOptimizerExample"
@@ -21,8 +21,9 @@ openai:

otel:
  enabled: false
  exporters: ["console"]
  # If running jaeger locally, uncomment the following lines and add "otlp" to the exporters list
  # otlp_settings:
  #   endpoint: "http://localhost:4318/v1/traces"
  exporters: [
      { type: console },
      # To export to a collector, also include:
      # { type: otlp, endpoint: "http://localhost:4318/v1/traces" }
    ]
  service_name: "WorkflowIntentClassifierExample"
@@ -26,8 +26,9 @@ openai:

otel:
  enabled: false
  exporters: ["console"]
  # If running jaeger locally, uncomment the following lines and add "otlp" to the exporters list
  # otlp_settings:
  #   endpoint: "http://localhost:4318/v1/traces"
  exporters: [
      { type: console },
      # To export to a collector, also include:
      # { type: otlp, endpoint: "http://localhost:4318/v1/traces" }
    ]
  service_name: "WorkflowOrchestratorWorkerExample"