Commit b60efe7 (parent: 2631d9b)

feature: Native MCP support

24 files changed: +4821 / -238 lines

README.md

Lines changed: 1 addition & 1 deletion
@@ -39,7 +39,7 @@ We built Pydantic AI with one simple aim: to bring that FastAPI feeling to GenAI

 [Pydantic Validation](https://docs.pydantic.dev/latest/) is the validation layer of the OpenAI SDK, the Google ADK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more. _Why use the derivative when you can go straight to the source?_ :smiley:

 2. **Model-agnostic**:
-   Supports virtually every [model](https://ai.pydantic.dev/models/overview) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel, Nebius. If your favorite model or provider is not listed, you can easily implement a [custom model](https://ai.pydantic.dev/models/overview#custom-models).
+   Supports virtually every [model](https://ai.pydantic.dev/models/overview) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel. If your favorite model or provider is not listed, you can easily implement a [custom model](https://ai.pydantic.dev/models/overview#custom-models).

 3. **Seamless Observability**:
 Tightly [integrates](https://ai.pydantic.dev/logfire) with [Pydantic Logfire](https://pydantic.dev/logfire), our general-purpose OpenTelemetry observability platform, for real-time debugging, evals-based performance monitoring, and behavior, tracing, and cost tracking. If you already have an observability platform that supports OTel, you can [use that too](https://ai.pydantic.dev/logfire#alternative-observability-backends).

docs/api/providers.md

Lines changed: 0 additions & 2 deletions
@@ -41,5 +41,3 @@

 ::: pydantic_ai.providers.ollama.OllamaProvider

 ::: pydantic_ai.providers.litellm.LiteLLMProvider
-
-::: pydantic_ai.providers.nebius.NebiusProvider

docs/builtin-tools.md

Lines changed: 116 additions & 0 deletions
@@ -11,6 +11,7 @@ Pydantic AI supports the following built-in tools:

 - **[`ImageGenerationTool`][pydantic_ai.builtin_tools.ImageGenerationTool]**: Enables agents to generate images
 - **[`UrlContextTool`][pydantic_ai.builtin_tools.UrlContextTool]**: Enables agents to pull URL contents into their context
 - **[`MemoryTool`][pydantic_ai.builtin_tools.MemoryTool]**: Enables agents to use memory
+- **[`MCPServerTool`][pydantic_ai.builtin_tools.MCPServerTool]**: Enables agents to pass MCP server configuration in context

 These tools are passed to the agent via the `builtin_tools` parameter and are executed by the model provider's infrastructure.

@@ -419,6 +420,121 @@ print(result.output)

_(This example is complete, it can be run "as is")_

## MCP Server Tool

The [`MCPServerTool`][pydantic_ai.builtin_tools.MCPServerTool] allows your agent to pass MCP server configurations in context, so that the agent can offload MCP calls and response parsing to the provider.

This tool is useful for models that support taking MCP servers as tool parameters, letting the model handle calls to remote servers by itself.

However, the vast majority of models do not support this feature, in which case you can use Pydantic AI's agent-side [MCP support](mcp/client.md).
### Provider Support

| Provider                | Supported | Notes                |
|-------------------------|-----------|----------------------|
| OpenAI Responses        | ✅        | Full feature support |
| Anthropic               | ✅        | Full feature support |
| Google                  | ❌        | Not supported        |
| Groq                    | ❌        | Not supported        |
| OpenAI Chat Completions | ❌        | Not supported        |
| Bedrock                 | ❌        | Not supported        |
| Mistral                 | ❌        | Not supported        |
| Cohere                  | ❌        | Not supported        |
| HuggingFace             | ❌        | Not supported        |
### Usage

```py {title="mcp_server_anthropic.py"}
import os

from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'anthropic:claude-sonnet-4-0',
    builtin_tools=[
        MCPServerTool(
            id='your-mcp-server',
            url='https://api.githubcopilot.com/mcp/',
            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),
            allowed_tools=['search_repositories', 'list_commits'],
        )
    ],
)

result = agent.run_sync('Give me some examples of my products.')
print(result.output)
#> Here are some examples of my data: Pen, Paper, Pencil.
```

_(This example is complete, it can be run "as is")_
With OpenAI, you must use the Responses API to access the MCP server tool.

```py {title="mcp_server_openai.py"}
import os

from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'openai-responses:gpt-4o',
    builtin_tools=[
        MCPServerTool(
            id='your-mcp-server',
            url='https://api.githubcopilot.com/mcp/',
            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),
            allowed_tools=['search_repositories', 'list_commits'],
        )
    ],
)

result = agent.run_sync('Give me some examples of my products.')
print(result.output)
#> Here are some examples of my data: Pen, Paper, Pencil.
```

_(This example is complete, it can be run "as is")_
### Configuration Options

The `MCPServerTool` supports several configuration parameters:

```py {title="mcp_server_configured.py"}
import os

from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'openai-responses:gpt-4o',
    builtin_tools=[
        MCPServerTool(
            id='your-mcp-server',
            url='https://api.githubcopilot.com/mcp/',
            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),
            allowed_tools=['search_repositories', 'list_commits'],
            description='Your MCP Server',
            headers={'X-CUSTOM-HEADER': 'custom-value'},
            provider_metadata={'connector_id': 'connector_googlecalendar'},
        )
    ],
)

result = agent.run_sync('Give me some examples of my products.')
print(result.output)
#> Here are some examples of my data: Pen, Paper, Pencil.
```

_(This example is complete, it can be run "as is")_
#### Provider Support

| Parameter           | OpenAI | Anthropic | Notes |
|---------------------|--------|-----------|-------|
| `url`               | ✅     | ✅        | Optional for OpenAI (can use either `url` or `provider_metadata.connector_id`); required for Anthropic |
| `allowed_tools`     | ✅     | ✅        |       |
| `provider_metadata` | ✅     | ❌        | Optional for OpenAI (can use either `url` or `provider_metadata.connector_id`); not supported for Anthropic |
| `headers`           | ✅     | ❌        |       |

## API Reference

For complete API documentation, see the [API Reference](api/builtin_tools.md).

docs/index.md

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ We built Pydantic AI with one simple aim: to bring that FastAPI feeling to GenAI

 [Pydantic Validation](https://docs.pydantic.dev/latest/) is the validation layer of the OpenAI SDK, the Google ADK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more. _Why use the derivative when you can go straight to the source?_ :smiley:

 2. **Model-agnostic**:
-   Supports virtually every [model](models/overview.md) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel, Nebius. If your favorite model or provider is not listed, you can easily implement a [custom model](models/overview.md#custom-models).
+   Supports virtually every [model](models/overview.md) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel. If your favorite model or provider is not listed, you can easily implement a [custom model](models/overview.md#custom-models).

 3. **Seamless Observability**:
 Tightly [integrates](logfire.md) with [Pydantic Logfire](https://pydantic.dev/logfire), our general-purpose OpenTelemetry observability platform, for real-time debugging, evals-based performance monitoring, and behavior, tracing, and cost tracking. If you already have an observability platform that supports OTel, you can [use that too](logfire.md#alternative-observability-backends).

docs/models/openai.md

Lines changed: 0 additions & 32 deletions
@@ -608,35 +608,3 @@ print(result.output)
 #> The capital of France is Paris.
 ...
-
-### Nebius AI Studio
-
-Go to [Nebius AI Studio](https://studio.nebius.com/) and create an API key.
-
-Once you've set the `NEBIUS_API_KEY` environment variable, you can run the following:
-
-```python
-from pydantic_ai import Agent
-
-agent = Agent('nebius:Qwen/Qwen3-32B-fast')
-result = agent.run_sync('What is the capital of France?')
-print(result.output)
-#> The capital of France is Paris.
-```
-
-If you need to configure the provider, you can use the [`NebiusProvider`][pydantic_ai.providers.nebius.NebiusProvider] class:
-
-```python
-from pydantic_ai import Agent
-from pydantic_ai.models.openai import OpenAIChatModel
-from pydantic_ai.providers.nebius import NebiusProvider
-
-model = OpenAIChatModel(
-    'Qwen/Qwen3-32B-fast',
-    provider=NebiusProvider(api_key='your-nebius-api-key'),
-)
-agent = Agent(model)
-result = agent.run_sync('What is the capital of France?')
-print(result.output)
-#> The capital of France is Paris.
-```

docs/models/overview.md

Lines changed: 0 additions & 1 deletion
@@ -28,7 +28,6 @@ In addition, many providers are compatible with the OpenAI API, and can be used

 - [GitHub Models](openai.md#github-models)
 - [Cerebras](openai.md#cerebras)
 - [LiteLLM](openai.md#litellm)
-- [Nebius AI Studio](openai.md#nebius-ai-studio)

 Pydantic AI also comes with [`TestModel`](../api/models/test.md) and [`FunctionModel`](../api/models/function.md)
 for testing and development.

pydantic_ai_slim/pydantic_ai/__init__.py

Lines changed: 2 additions & 0 deletions
@@ -12,6 +12,7 @@

 from .builtin_tools import (
     CodeExecutionTool,
     ImageGenerationTool,
+    MCPServerTool,
     MemoryTool,
     UrlContextTool,
     WebSearchTool,

@@ -211,6 +212,7 @@

     'CodeExecutionTool',
     'ImageGenerationTool',
     'MemoryTool',
+    'MCPServerTool',
     # output
     'ToolOutput',
     'NativeOutput',

pydantic_ai_slim/pydantic_ai/_parts_manager.py

Lines changed: 0 additions & 3 deletions
@@ -312,7 +312,6 @@ def handle_tool_call_part(
         tool_name: str,
         args: str | dict[str, Any] | None,
         tool_call_id: str | None = None,
-        id: str | None = None,
     ) -> ModelResponseStreamEvent:
         """Immediately create or fully-overwrite a ToolCallPart with the given information.

@@ -324,7 +323,6 @@ def handle_tool_call_part(
             tool_name: The name of the tool being invoked.
             args: The arguments for the tool call, either as a string, a dictionary, or None.
             tool_call_id: An optional string identifier for this tool call.
-            id: An optional identifier for this tool call part.

         Returns:
             ModelResponseStreamEvent: A `PartStartEvent` indicating that a new tool call part

@@ -334,7 +332,6 @@ def handle_tool_call_part(
             tool_name=tool_name,
             args=args,
             tool_call_id=tool_call_id or _generate_tool_call_id(),
-            id=id,
         )
         if vendor_part_id is None:
             # vendor_part_id is None, so we unconditionally append a new ToolCallPart to the end of the list

pydantic_ai_slim/pydantic_ai/builtin_tools.py

Lines changed: 62 additions & 1 deletion
@@ -2,7 +2,7 @@

 from abc import ABC
 from dataclasses import dataclass
-from typing import TYPE_CHECKING, Literal
+from typing import TYPE_CHECKING, Any, Literal

 from typing_extensions import TypedDict

@@ -17,6 +17,7 @@

     'UrlContextTool',
     'ImageGenerationTool',
     'MemoryTool',
+    'MCPServerTool',
 )

@@ -237,3 +238,63 @@ class MemoryTool(AbstractBuiltinTool):

     kind: str = 'memory'
     """The kind of tool."""
+
+
+@dataclass(kw_only=True)
+class MCPServerTool(AbstractBuiltinTool):
+    """A builtin tool that allows your agent to use MCP servers.
+
+    Supported by:
+
+    * OpenAI Responses
+    * Anthropic
+    """
+
+    LIST_TOOLS_KIND: str = 'mcp_list_tools'
+    CALL_KIND: str = 'mcp_call'
+
+    kind: str = 'mcp_server'
+
+    id: str
+    """The id of the MCP server to use."""
+
+    authorization_token: str
+    """Authorization header to use when making requests to the MCP server."""
+
+    url: str | None = None
+    """The URL of the MCP server to use.
+
+    For OpenAI Responses, one of `url` or `connector_id` must be provided.
+    """
+
+    description: str | None = None
+    """A description of the MCP server."""
+
+    allowed_tools: list[str] | None = None
+    """A list of tools on the MCP server that the model is allowed to call.
+
+    Supported by:
+
+    * OpenAI Responses
+    * Anthropic
+    """
+
+    headers: dict[str, str] | None = None
+    """Optional HTTP headers to send to the MCP server.
+
+    Use for authentication or other purposes.
+
+    Supported by:
+
+    * OpenAI Responses
+    """
+
+    provider_metadata: dict[str, Any] | None = None
+    """Extra data to send to the model.
+
+    Supported by:
+
+    * OpenAI Responses
+    """

pydantic_ai_slim/pydantic_ai/messages.py

Lines changed: 0 additions & 7 deletions
@@ -1052,13 +1052,6 @@ class BaseToolCallPart:

     In case the tool call id is not provided by the model, Pydantic AI will generate a random one.
     """

-    _: KW_ONLY
-
-    id: str | None = None
-    """An optional identifier of the tool call part, separate from the tool call ID.
-
-    This is used by some APIs like OpenAI Responses."""

     def args_as_dict(self) -> dict[str, Any]:
         """Return the arguments as a Python dictionary.