
Commit d4a85e6

Support OpenAI and Anthropic native MCP support via MCPServerTool builtin tool (#3101)
Co-authored-by: Douwe Maan <[email protected]>
Co-authored-by: Param <[email protected]>
1 parent ecd9085 commit d4a85e6

15 files changed: +3716, -25 lines

docs/builtin-tools.md

Lines changed: 145 additions & 1 deletion
@@ -11,6 +11,7 @@ Pydantic AI supports the following built-in tools:
- **[`ImageGenerationTool`][pydantic_ai.builtin_tools.ImageGenerationTool]**: Enables agents to generate images
- **[`UrlContextTool`][pydantic_ai.builtin_tools.UrlContextTool]**: Enables agents to pull URL contents into their context
- **[`MemoryTool`][pydantic_ai.builtin_tools.MemoryTool]**: Enables agents to use memory
+- **[`MCPServerTool`][pydantic_ai.builtin_tools.MCPServerTool]**: Enables agents to use remote MCP servers with communication handled by the model provider

These tools are passed to the agent via the `builtin_tools` parameter and are executed by the model provider's infrastructure.

@@ -52,7 +53,7 @@ print(result.output)

_(This example is complete, it can be run "as is")_

-With OpenAI, you must use their responses API to access the web search tool.
+With OpenAI, you must use their Responses API to access the web search tool.

```py {title="web_search_openai.py"}
from pydantic_ai import Agent, WebSearchTool

@@ -419,6 +420,149 @@ print(result.output)

_(This example is complete, it can be run "as is")_

## MCP Server Tool

The [`MCPServerTool`][pydantic_ai.builtin_tools.MCPServerTool] allows your agent to use remote MCP servers, with communication handled by the model provider.

This requires the MCP server to be reachable by the provider at a public URL, and it does not support many of the advanced features of Pydantic AI's agent-side [MCP support](mcp/client.md). In return, it can make better use of context and caching, and respond faster because no round trip back to Pydantic AI is needed.

### Provider Support

| Provider | Supported | Notes |
|----------|-----------|-------|
| OpenAI Responses | ✅ | Full feature support. [Connectors](https://platform.openai.com/docs/guides/tools-connectors-mcp#connectors) can be used by specifying a special `x-openai-connector:<connector_id>` URL. |
| Anthropic | ✅ | Full feature support |
| Google | ❌ | Not supported |
| Groq | ❌ | Not supported |
| OpenAI Chat Completions | ❌ | Not supported |
| Bedrock | ❌ | Not supported |
| Mistral | ❌ | Not supported |
| Cohere | ❌ | Not supported |
| HuggingFace | ❌ | Not supported |

### Usage

```py {title="mcp_server_anthropic.py"}
from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'anthropic:claude-sonnet-4-5',
    builtin_tools=[
        MCPServerTool(
            id='deepwiki',
            url='https://mcp.deepwiki.com/mcp',  # (1)
        )
    ]
)

result = agent.run_sync('Tell me about the pydantic/pydantic-ai repo.')
print(result.output)
"""
The pydantic/pydantic-ai repo is a Python agent framework for building Generative AI applications.
"""
```

1. The [DeepWiki MCP server](https://docs.devin.ai/work-with-devin/deepwiki-mcp) does not require authorization.

_(This example is complete, it can be run "as is")_

With OpenAI, you must use their Responses API to access the MCP server tool:

```py {title="mcp_server_openai.py"}
from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'openai-responses:gpt-5',
    builtin_tools=[
        MCPServerTool(
            id='deepwiki',
            url='https://mcp.deepwiki.com/mcp',  # (1)
        )
    ]
)

result = agent.run_sync('Tell me about the pydantic/pydantic-ai repo.')
print(result.output)
"""
The pydantic/pydantic-ai repo is a Python agent framework for building Generative AI applications.
"""
```

1. The [DeepWiki MCP server](https://docs.devin.ai/work-with-devin/deepwiki-mcp) does not require authorization.

_(This example is complete, it can be run "as is")_

### Configuration Options

The `MCPServerTool` supports several configuration parameters for custom MCP servers:

```py {title="mcp_server_configured_url.py"}
import os

from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'openai-responses:gpt-5',
    builtin_tools=[
        MCPServerTool(
            id='github',
            url='https://api.githubcopilot.com/mcp/',
            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),  # (1)
            allowed_tools=['search_repositories', 'list_commits'],
            description='GitHub MCP server',
            headers={'X-Custom-Header': 'custom-value'},
        )
    ]
)

result = agent.run_sync('Tell me about the pydantic/pydantic-ai repo.')
print(result.output)
"""
The pydantic/pydantic-ai repo is a Python agent framework for building Generative AI applications.
"""
```

1. The [GitHub MCP server](https://github.com/github/github-mcp-server) requires an authorization token.

_(This example is complete, it can be run "as is")_

For OpenAI Responses, you can use a [connector](https://platform.openai.com/docs/guides/tools-connectors-mcp#connectors) by specifying a special `x-openai-connector:` URL:

```py {title="mcp_server_configured_connector_id.py"}
import os

from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'openai-responses:gpt-5',
    builtin_tools=[
        MCPServerTool(
            id='google-calendar',
            url='x-openai-connector:connector_googlecalendar',
            authorization_token=os.getenv('GOOGLE_API_KEY', 'mock-api-key'),  # (1)
        )
    ]
)

result = agent.run_sync('What do I have on my calendar today?')
print(result.output)
#> You're going to spend all day playing with Pydantic AI.
```

1. OpenAI's Google Calendar connector requires an [authorization token](https://platform.openai.com/docs/guides/tools-connectors-mcp#authorizing-a-connector).

_(This example is complete, it can be run "as is")_

#### Provider Support

| Parameter | OpenAI | Anthropic |
|-----------------------|--------|-----------|
| `authorization_token` | ✅ | ✅ |
| `allowed_tools` | ✅ | ✅ |
| `description` | ✅ | ❌ |
| `headers` | ✅ | ❌ |
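
When targeting a specific provider, it can help to stick to the parameters that provider supports. A rough sketch of an Anthropic-only configuration, reusing the GitHub server and placeholder token from the example above:

```python
import os

from pydantic_ai import Agent, MCPServerTool

agent = Agent(
    'anthropic:claude-sonnet-4-5',
    builtin_tools=[
        MCPServerTool(
            id='github',
            url='https://api.githubcopilot.com/mcp/',
            # Per the table above, Anthropic supports `authorization_token` and
            # `allowed_tools`, but not `description` or `headers`.
            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),
            allowed_tools=['search_repositories', 'list_commits'],
        )
    ]
)
```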

## API Reference

For complete API documentation, see the [API Reference](api/builtin_tools.md).

docs/mcp/overview.md

Lines changed: 3 additions & 1 deletion
@@ -2,7 +2,9 @@
Pydantic AI supports [Model Context Protocol (MCP)](https://modelcontextprotocol.io) in two ways:

-1. [Agents](../agents.md) act as an MCP Client, connecting to MCP servers to use their tools, [learn more …](client.md)
+1. [Agents](../agents.md) can connect to MCP servers and use their tools:
+    1. Pydantic AI can act as an MCP client and connect directly to local and remote MCP servers, [learn more …](client.md)
+    2. Some model providers can themselves connect to remote MCP servers, [learn more …](../builtin-tools.md#mcp-server-tool)
2. Agents can be used within MCP servers, [learn more …](server.md)
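
A minimal sketch of the difference between the two approaches, reusing the DeepWiki server from the built-in tools docs; the `MCPServerStreamableHTTP` class and `toolsets` parameter come from the client docs and are treated here as illustrative:

```python
from pydantic_ai import Agent, MCPServerTool
from pydantic_ai.mcp import MCPServerStreamableHTTP

# 1. Pydantic AI acts as the MCP client: it connects to the server itself and
#    relays tool calls between the model and the server.
client_side_agent = Agent(
    'anthropic:claude-sonnet-4-5',
    toolsets=[MCPServerStreamableHTTP('https://mcp.deepwiki.com/mcp')],
)

# 2. The model provider connects to the remote server directly via the
#    MCPServerTool built-in tool, with no round trip back to Pydantic AI.
provider_side_agent = Agent(
    'anthropic:claude-sonnet-4-5',
    builtin_tools=[MCPServerTool(id='deepwiki', url='https://mcp.deepwiki.com/mcp')],
)
```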

## What is MCP?

docs/models/google.md

Lines changed: 1 addition & 1 deletion
@@ -140,7 +140,7 @@ agent = Agent(model)

#### Customizing Location or Project

-You can specify the location and/or projectwhen using Vertex AI:
+You can specify the location and/or project when using Vertex AI:

```python {title="google_model_location.py" test="skip"}
from pydantic_ai import Agent

pydantic_ai_slim/pydantic_ai/__init__.py

Lines changed: 2 additions & 0 deletions
@@ -12,6 +12,7 @@
from .builtin_tools import (
    CodeExecutionTool,
    ImageGenerationTool,
+    MCPServerTool,
    MemoryTool,
    UrlContextTool,
    WebSearchTool,
@@ -213,6 +214,7 @@
    'CodeExecutionTool',
    'ImageGenerationTool',
    'MemoryTool',
+    'MCPServerTool',
    # output
    'ToolOutput',
    'NativeOutput',

pydantic_ai_slim/pydantic_ai/builtin_tools.py

Lines changed: 59 additions & 0 deletions
@@ -16,6 +16,7 @@
    'UrlContextTool',
    'ImageGenerationTool',
    'MemoryTool',
+    'MCPServerTool',
)

_BUILTIN_TOOL_TYPES: dict[str, type[AbstractBuiltinTool]] = {}
@@ -263,6 +264,64 @@ class MemoryTool(AbstractBuiltinTool):
    """The kind of tool."""


@dataclass(kw_only=True)
class MCPServerTool(AbstractBuiltinTool):
    """A builtin tool that allows your agent to use MCP servers.

    Supported by:

    * OpenAI Responses
    * Anthropic
    """

    id: str
    """The ID of the MCP server."""

    url: str
    """The URL of the MCP server to use.

    For OpenAI Responses, a connector can be used by providing its `connector_id` as `x-openai-connector:<connector_id>`.
    """

    authorization_token: str | None = None
    """Authorization header to use when making requests to the MCP server.

    Supported by:

    * OpenAI Responses
    * Anthropic
    """

    description: str | None = None
    """A description of the MCP server.

    Supported by:

    * OpenAI Responses
    """

    allowed_tools: list[str] | None = None
    """The tools from the MCP server that the model is allowed to use.

    Supported by:

    * OpenAI Responses
    * Anthropic
    """

    headers: dict[str, str] | None = None
    """Optional HTTP headers to send to the MCP server.

    Use for authentication or other purposes.

    Supported by:

    * OpenAI Responses
    """

    kind: str = 'mcp_server'


def _tool_discriminator(tool_data: dict[str, Any] | AbstractBuiltinTool) -> str:
    if isinstance(tool_data, dict):
        return tool_data.get('kind', AbstractBuiltinTool.kind)
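
As the `_tool_discriminator` context above suggests, built-in tools are discriminated by their `kind` field when they are handled as plain dicts. A quick, illustrative sketch (the exact deserialization path is not shown in this diff):

```python
from dataclasses import asdict

from pydantic_ai import MCPServerTool

tool = MCPServerTool(id='deepwiki', url='https://mcp.deepwiki.com/mcp')
data = asdict(tool)
print(data['kind'])
#> mcp_server
# _tool_discriminator(data) returns 'mcp_server', the key under which
# MCPServerTool is presumably registered in _BUILTIN_TOOL_TYPES.
```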
