MCP Integration
Python A2A provides first-class support for the Model Context Protocol (MCP), enabling agents to access external tools and data sources through a standardized interface.
The Model Context Protocol (MCP) is an open protocol that lets AI models request access to external systems and capabilities. It enables:
- Tools: Access to external functions like calculators, databases, or API calls
- Data Sources: Access to real-time information from the web, databases, or APIs
- Interactive UIs: Two-way communication with user interfaces
- Standardization: Common interface across different AI models and providers
Python A2A provides comprehensive MCP support with these components:
The MCPClient allows agents to connect to MCP servers and access their tools:
```python
from python_a2a.mcp import MCPClient

# Create a client to an MCP server
client = MCPClient("http://localhost:8000")

# List available tools
tools = client.list_tools()
print(f"Available tools: {[tool.name for tool in tools]}")

# Call a tool
result = client.call_tool("calculator.add", {"a": 5, "b": 3})
print(f"Result: {result}")  # Result: 8
```

FastMCP provides a quick way to implement MCP servers with custom tools:
```python
from python_a2a.mcp import FastMCP
from python_a2a import run_server

# Create an MCP server
calculator = FastMCP(name="Calculator MCP")

# Define tools
@calculator.tool()
def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

@calculator.tool()
def subtract(a: float, b: float) -> float:
    """Subtract b from a."""
    return a - b

@calculator.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

@calculator.tool()
def divide(a: float, b: float) -> float:
    """Divide a by b."""
    if b == 0:
        raise ValueError("Cannot divide by zero")
    return a / b

# Run the server
if __name__ == "__main__":
    run_server(calculator, port=8000)
```

MCPAgent creates an A2A agent that can leverage MCP tools:
```python
from python_a2a.mcp import MCPAgent
from python_a2a import run_server

# Create an agent that uses MCP tools
agent = MCPAgent(
    name="Math Assistant",
    description="An assistant that can perform calculations",
    mcp_server_url="http://localhost:8000"
)

# Run the agent
if __name__ == "__main__":
    run_server(agent, port=5000)
```
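With the calculator MCP server running on port 8000 and this agent on port 5000, a client can query the agent over A2A. A minimal sketch reusing the HTTPClient pattern shown later on this page; how the agent maps the request onto the calculator tools depends on its configuration:

```python
from python_a2a import HTTPClient

# Connect to the Math Assistant agent (assumes it is running on port 5000)
client = HTTPClient("http://localhost:5000")

# The agent is expected to route this request to the calculator tools
response = client.send_message("What is 5 plus 3?")
print(response.content)
```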
The following end-to-end example pairs a calculator MCP server with a client that calls its tools directly:

```python
# Create the MCP server (calculator_mcp.py)
from python_a2a.mcp import FastMCP
from python_a2a import run_server

calculator = FastMCP(name="Calculator MCP")

@calculator.tool()
def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

@calculator.tool()
def subtract(a: float, b: float) -> float:
    """Subtract b from a."""
    return a - b

if __name__ == "__main__":
    run_server(calculator, port=8000)
```
```python
# Create an MCP client (calculator_client.py)
from python_a2a.mcp import MCPClient

client = MCPClient("http://localhost:8000")

# Call the add tool
result = client.call_tool("add", {"a": 10, "b": 5})
print(f"10 + 5 = {result}")

# Call the subtract tool
result = client.call_tool("subtract", {"a": 10, "b": 5})
print(f"10 - 5 = {result}")
```
A second example exposes web search and weather tools and wraps them in an MCP-enabled A2A agent:

```python
# Create the MCP server (web_tools_mcp.py)
from python_a2a.mcp import FastMCP
from python_a2a import run_server
import requests

web_tools = FastMCP(name="Web Tools")

@web_tools.tool()
def search_web(query: str) -> list:
    """Search the web for information."""
    # This is a simplified example
    # In a real application, you would use a search API
    return [
        {"title": f"Result for {query} #1", "url": f"https://example.com/1?q={query}"},
        {"title": f"Result for {query} #2", "url": f"https://example.com/2?q={query}"}
    ]

@web_tools.tool()
def get_weather(location: str) -> dict:
    """Get weather information for a location."""
    # This is a simplified example
    # In a real application, you would use a weather API
    return {
        "location": location,
        "temperature": 75,
        "condition": "Sunny",
        "humidity": 45
    }

if __name__ == "__main__":
    run_server(web_tools, port=8000)
```
```python
# Create an MCP-enabled agent (web_assistant.py)
from python_a2a.mcp import MCPAgent
from python_a2a import run_server

# Create an agent that uses the web tools
agent = MCPAgent(
    name="Web Assistant",
    description="An assistant that can search the web and get weather information",
    mcp_server_url="http://localhost:8000"
)

if __name__ == "__main__":
    run_server(agent, port=5000)
```
```python
# Create a client to use the agent (web_assistant_client.py)
from python_a2a import HTTPClient

client = HTTPClient("http://localhost:5000")

# Ask a question that requires web search
response = client.send_message("Find information about machine learning")
print(response.content)

# Ask about the weather
response = client.send_message("What's the weather in New York?")
print(response.content)
```

Define rich tool schemas with type information:
```python
from python_a2a.mcp import FastMCP, ToolSchema, ParameterSchema

calculator = FastMCP(name="Calculator MCP")

# Define a tool with explicit schema
calculator.add_tool(
    name="power",
    function=lambda args: args["base"] ** args["exponent"],
    schema=ToolSchema(
        description="Calculate base raised to an exponent",
        parameters={
            "base": ParameterSchema(
                type="number",
                description="The base number"
            ),
            "exponent": ParameterSchema(
                type="number",
                description="The exponent to raise the base to"
            )
        },
        return_type="number"
    )
)
```
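Once a server registering this tool is running (for example via run_server(calculator, port=8000) as in the earlier examples), the tool can be called by name. A minimal sketch using the MCPClient shown earlier; the exact tool name and result format may vary with your configuration:

```python
from python_a2a.mcp import MCPClient

# Assumes the server registering the "power" tool is running on port 8000
client = MCPClient("http://localhost:8000")

result = client.call_tool("power", {"base": 2, "exponent": 10})
print(f"2^10 = {result}")  # Expected: 1024
```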
Create a proxy server for MCP tools:

```python
from python_a2a.mcp import MCPProxy, FastMCP
from python_a2a import run_server

# Create an MCP server with tools
calculator = FastMCP(name="Calculator MCP")

@calculator.tool()
def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

# Create a proxy for the calculator
proxy = MCPProxy(
    name="Calculator Proxy",
    mcp_server_url="http://localhost:8000"
)

# Run the proxy on a different port
if __name__ == "__main__":
    # Run the original MCP server
    import threading
    threading.Thread(
        target=run_server,
        args=(calculator,),
        kwargs={"port": 8000},
        daemon=True
    ).start()

    # Run the proxy server
    run_server(proxy, port=8001)
```
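Clients then talk to the proxy exactly as they would talk to the underlying server, just on the proxy's port. A minimal sketch, assuming the script above is running:

```python
from python_a2a.mcp import MCPClient

# Point the client at the proxy (port 8001) instead of the original server (port 8000)
client = MCPClient("http://localhost:8001")

result = client.call_tool("add", {"a": 2, "b": 3})
print(f"2 + 3 = {result}")
```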
Create an MCP agent that uses an LLM to decide which tools to use:

```python
from python_a2a.mcp import MCPAgent
from python_a2a.client.llm.openai import OpenAILLMClient
from python_a2a import run_server
import os

# Create an agent that uses OpenAI to decide which tools to use
agent = MCPAgent(
    name="Smart Assistant",
    description="An assistant that uses AI to decide which tools to use",
    mcp_server_url="http://localhost:8000",
    llm_client=OpenAILLMClient(
        api_key=os.environ.get("OPENAI_API_KEY"),
        model="gpt-4"
    )
)

if __name__ == "__main__":
    run_server(agent, port=5000)
```
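To exercise the LLM-backed agent, send it a natural-language request and let it choose the tool. A minimal sketch reusing the HTTPClient pattern from the web assistant example; the exact reply text depends on the model and the tools available:

```python
from python_a2a import HTTPClient

# Assumes the Smart Assistant above is running on port 5000
client = HTTPClient("http://localhost:5000")

# The LLM decides which MCP tool (if any) to call for this request
response = client.send_message("What is 12 divided by 4?")
print(response.content)
```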
Keep the following best practices in mind when building MCP integrations; a sketch that applies several of them follows the list.
- Tool Design:
  - Keep tools simple and focused on a single task
  - Provide clear descriptions and parameter information
  - Use appropriate return types
  - Handle errors gracefully
- Tool Organization:
  - Group related tools in the same MCP server
  - Use namespaces for clarity
  - Provide discovery metadata
- Performance:
  - Keep tool execution time reasonable
  - Implement caching where appropriate
  - Consider asynchronous execution for long-running operations
- Security:
  - Validate inputs carefully
  - Implement appropriate authentication
  - Limit access to sensitive operations
  - Sanitize outputs before returning them
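As a rough illustration of the tool design, performance, and security points above, here is a minimal sketch of a single tool that validates its input, fails with a clear error, and caches repeated lookups. The Weather MCP server, the _fetch_temperature helper, and the specific checks are hypothetical; only the FastMCP decorator pattern comes from the examples above.

```python
from functools import lru_cache
from python_a2a.mcp import FastMCP

weather = FastMCP(name="Weather MCP")

@lru_cache(maxsize=128)
def _fetch_temperature(location: str) -> float:
    """Placeholder for a real (and possibly slow) weather API call."""
    return 75.0

@weather.tool()
def get_temperature(location: str) -> dict:
    """Get the current temperature for a location."""
    # Validate inputs carefully before doing any work
    if not location or not location.strip():
        raise ValueError("location must be a non-empty string")
    if len(location) > 100:
        raise ValueError("location is too long")

    # Cached lookup keeps repeated calls for the same location fast
    temperature = _fetch_temperature(location.strip())

    # Return a small, well-defined structure
    return {"location": location.strip(), "temperature": temperature}
```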
By following these guidelines, you can create robust and secure MCP integrations that enhance your A2A agents with powerful external capabilities.