# Basic MCP Usage Examples

This directory contains examples of integrating the Model Context Protocol (MCP) with various LLM agent frameworks.

Each script demonstrates how to connect to a single local MCP server and use it with a different agent framework.

### Basic MCP Architecture

```mermaid
graph LR
    User((User)) --> |"Run script<br>(e.g., pydantic_mcp.py)"| Agent

    subgraph "Agent Frameworks"
        Agent[Agent]
        ADK["Google ADK<br>(adk_mcp.py)"]
        LG["LangGraph<br>(langgraph_mcp.py)"]
        OAI["OpenAI Agents<br>(oai-agent_mcp.py)"]
        PYD["Pydantic-AI<br>(pydantic_mcp.py)"]

        Agent --> ADK
        Agent --> LG
        Agent --> OAI
        Agent --> PYD
    end

    subgraph "Python MCP Server"
        MCP["Model Context Protocol Server<br>(run_server.py)"]
        Tools["Tools<br>- add(a, b)<br>- get_current_time()"]
        Resources["Resources<br>- greeting://{name}"]
        MCP --- Tools
        MCP --- Resources
    end

    subgraph "LLM Providers"
        OAI_LLM["OpenAI Models"]
        GEM["Google Gemini Models"]
        OTHER["Other LLM Providers..."]
    end

    Logfire[("Logfire<br>Tracing")]

    ADK --> MCP
    LG --> MCP
    OAI --> MCP
    PYD --> MCP

    MCP --> OAI_LLM
    MCP --> GEM
    MCP --> OTHER

    ADK --> Logfire
    LG --> Logfire
    OAI --> Logfire
    PYD --> Logfire

    LLM_Response[("Response")] --> User
    OAI_LLM --> LLM_Response
    GEM --> LLM_Response
    OTHER --> LLM_Response

    style MCP fill:#f9f,stroke:#333,stroke-width:2px
    style User fill:#bbf,stroke:#338,stroke-width:2px
    style Logfire fill:#bfb,stroke:#383,stroke-width:2px
    style LLM_Response fill:#fbb,stroke:#833,stroke-width:2px
```

The diagram illustrates how MCP serves as a standardised interface between the different agent frameworks and LLM providers. The flow shows how users interact with the system by running a specific agent script, which then uses MCP to communicate with the LLM provider, while Logfire provides tracing and observability.

### Google Agent Development Kit (ADK)

**File:** `adk_mcp.py`

This example demonstrates how to use MCP with Google's Agent Development Kit (ADK); a short sketch of the setup follows the feature list.

```bash
uv run agents_mcp_usage/basic_mcp/basic_mcp_use/adk_mcp.py
```

Key features:
- Uses `MCPToolset` for connecting to the MCP server
- Configures a Gemini model using ADK's `LlmAgent`
- Sets up session handling and a runner for agent execution
- Includes Logfire instrumentation for tracing
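
For orientation, here is a minimal, hypothetical sketch of that wiring. ADK's MCP integration has changed across releases, so the import paths, parameter names, model name, and server command below are assumptions rather than a copy of `adk_mcp.py`.

```python
import asyncio

from google.adk.agents import LlmAgent
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset
from google.genai import types
from mcp import StdioServerParameters

# Expose the local MCP server's tools to the agent via MCPToolset (assumed API).
root_agent = LlmAgent(
    model="gemini-2.0-flash",  # illustrative model name
    name="mcp_assistant",
    instruction="Use the MCP tools to answer the user.",
    tools=[
        MCPToolset(
            connection_params=StdioServerParameters(
                command="python", args=["run_server.py"]  # illustrative command
            )
        )
    ],
)


async def main() -> None:
    # Session handling and runner setup, as in the feature list above.
    session_service = InMemorySessionService()
    session = await session_service.create_session(app_name="basic_mcp", user_id="user")
    runner = Runner(agent=root_agent, app_name="basic_mcp", session_service=session_service)

    message = types.Content(role="user", parts=[types.Part(text="What is 2 + 2?")])
    async for event in runner.run_async(
        user_id="user", session_id=session.id, new_message=message
    ):
        if event.is_final_response():
            print(event.content.parts[0].text)


asyncio.run(main())
```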

### LangGraph

**File:** `langgraph_mcp.py`

This example demonstrates how to use MCP with LangGraph agents; a sketch of the pattern follows the feature list.

```bash
uv run agents_mcp_usage/basic_mcp/basic_mcp_use/langgraph_mcp.py
```

Key features:
- Uses the LangChain MCP adapters to load tools
- Creates a ReAct agent with LangGraph
- Demonstrates a stdio-based client connection to the MCP server
- Uses a Gemini model for agent reasoning
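
As a rough guide, the pattern looks something like the sketch below, assuming the `langchain-mcp-adapters` and `langchain-google-genai` packages; the server command and model name are illustrative rather than copied from `langgraph_mcp.py`.

```python
import asyncio

from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the local MCP server as a stdio subprocess (illustrative command).
server_params = StdioServerParameters(command="python", args=["run_server.py"])


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Convert the server's MCP tools into LangChain tools.
            tools = await load_mcp_tools(session)
            # Build a ReAct-style agent around a Gemini model (illustrative name).
            model = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
            agent = create_react_agent(model, tools)
            result = await agent.ainvoke(
                {"messages": [("user", "What is 2 + 2, and what time is it?")]}
            )
            print(result["messages"][-1].content)


asyncio.run(main())
```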

### OpenAI Agents

**File:** `oai-agent_mcp.py`

This example demonstrates how to use MCP with the OpenAI Agents SDK; a sketch of the pattern follows the feature list.

```bash
uv run agents_mcp_usage/basic_mcp/basic_mcp_use/oai-agent_mcp.py
```

Key features:
- Uses OpenAI's `Agent` and `Runner` classes
- Connects to the MCP server through `MCPServerStdio`
- Uses OpenAI's o4-mini model
- Includes Logfire instrumentation for both MCP and OpenAI Agents
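
A minimal sketch of that pattern, assuming the `openai-agents` package; the agent name, instructions, and server command are illustrative rather than copied from `oai-agent_mcp.py`.

```python
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerStdio


async def main() -> None:
    # Spawn the local MCP server over stdio and expose its tools to the agent.
    async with MCPServerStdio(
        params={"command": "python", "args": ["run_server.py"]}  # illustrative command
    ) as server:
        agent = Agent(
            name="Assistant",  # illustrative
            instructions="Use the MCP tools to answer the user.",  # illustrative
            mcp_servers=[server],
            model="o4-mini",
        )
        result = await Runner.run(agent, "What is 2 + 2, and what time is it?")
        print(result.final_output)


asyncio.run(main())
```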

### Pydantic-AI

**File:** `pydantic_mcp.py`

This example demonstrates how to use MCP with the Pydantic-AI agent framework; a sketch of the pattern follows the feature list.

```bash
uv run agents_mcp_usage/basic_mcp/basic_mcp_use/pydantic_mcp.py
```

Key features:
- Uses the simplified Pydantic-AI `Agent` interface
- Configures `MCPServerStdio` for MCP communication
- Employs a context manager for server lifecycle management
- Includes comprehensive instrumentation for both MCP and Pydantic-AI
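
A minimal sketch of that pattern, assuming a recent `pydantic-ai` release (its MCP API has shifted slightly between versions); the model name and server command are illustrative rather than copied from `pydantic_mcp.py`.

```python
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

# Run the local MCP server as a stdio subprocess (illustrative command).
server = MCPServerStdio("python", args=["run_server.py"])
agent = Agent("google-gla:gemini-2.0-flash", mcp_servers=[server])  # illustrative model


async def main() -> None:
    # The context manager starts the server and shuts it down when the block exits.
    async with agent.run_mcp_servers():
        result = await agent.run("What is 2 + 2, and what time is it?")
        print(result.output)


asyncio.run(main())
```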

## Understanding the Examples

Each example follows a similar pattern:

1. **Environment Setup**: Loading environment variables and configuring logging
2. **Server Connection**: Establishing a connection to the local MCP server
3. **Agent Configuration**: Setting up an agent with the appropriate model
4. **Execution**: Running the agent with a query and handling the response

The examples are designed to be as similar as possible, allowing you to compare how different frameworks approach MCP integration.

## MCP Server

All examples connect to the same MCP server defined in `run_server.py` at the project root. This server provides:

- An addition tool (`add(a, b)`)
- A time tool (`get_current_time()`)
- A dynamic greeting resource (`greeting://{name}`)

You can modify the MCP server to add your own tools and resources for experimentation; a minimal sketch of such a server follows.
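
For reference, a server with this shape can be written with the `FastMCP` helper from the official `mcp` Python SDK. The sketch below is illustrative and not a copy of `run_server.py`.

```python
from datetime import datetime

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Basic MCP Server")  # illustrative server name


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


@mcp.tool()
def get_current_time() -> str:
    """Return the current time as an ISO-8601 string."""
    return datetime.now().isoformat()


@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Return a personalised greeting for the given name."""
    return f"Hello, {name}!"


if __name__ == "__main__":
    # Serve over stdio so the agent scripts can spawn this as a subprocess.
    mcp.run()
```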