Commit 2674863

Merge branch 'main' into patch-3323
2 parents 8267b71 + 5544e9f commit 2674863

File tree: 101 files changed, +6354 −1158 lines changed


.github/workflows/ci.yml

Lines changed: 4 additions & 0 deletions
@@ -157,6 +157,9 @@ jobs:
       env:
         CI: true
         COVERAGE_PROCESS_START: ./pyproject.toml
+        # We only run the llama_cpp tests on the latest Python as they have been regularly failing in CI with `Fatal Python error: Illegal instruction`:
+        # https://github.com/pydantic/pydantic-ai/actions/runs/19547773220/job/55970947389
+        RUN_LLAMA_CPP_TESTS: ${{ matrix.python-version == '3.13' && matrix.install.name == 'all-extras' }}
       steps:
         - uses: actions/checkout@v4

@@ -207,6 +210,7 @@ jobs:
       env:
         CI: true
         COVERAGE_PROCESS_START: ./pyproject.toml
+        RUN_LLAMA_CPP_TESTS: false
       steps:
         - uses: actions/checkout@v4
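GitHub Actions renders the `${{ … }}` expression above to the string `'true'` or `'false'` before the job runs, so the test suite sees a string, not a boolean. A minimal sketch of how a conftest might honor such a flag — the helper name and the exact gating logic are assumptions, not taken from this commit:

```python
def llama_cpp_tests_enabled(environ: dict) -> bool:
    # The CI expression arrives as the string 'true' or 'false';
    # default to disabled when the variable is unset (e.g. locally).
    return environ.get('RUN_LLAMA_CPP_TESTS', 'false').lower() == 'true'
```

In pytest this would typically feed a `pytest.mark.skipif(...)` decorator on the llama_cpp test module.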

Makefile

Lines changed: 1 addition & 0 deletions
@@ -43,6 +43,7 @@ typecheck-pyright:
 .PHONY: typecheck-mypy
 typecheck-mypy:
 	uv run mypy
+	uv run mypy typings/ --strict

 .PHONY: typecheck
 typecheck: typecheck-pyright ## Run static type checking

docs/mcp/client.md

Lines changed: 61 additions & 0 deletions
@@ -365,6 +365,67 @@ async def main():
 
 MCP tools can include metadata that provides additional information about the tool's characteristics, which can be useful when [filtering tools][pydantic_ai.toolsets.FilteredToolset]. The `meta`, `annotations`, and `output_schema` fields can be found on the `metadata` dict on the [`ToolDefinition`][pydantic_ai.tools.ToolDefinition] object that's passed to filter functions.
 
+## Resources
+
+MCP servers can provide [resources](https://modelcontextprotocol.io/docs/concepts/resources) - files, data, or content that can be accessed by the client. Resources in MCP are application-driven, with host applications determining how to incorporate context manually, based on their needs. This means they will _not_ be exposed to the LLM automatically (unless a tool returns a `ResourceLink` or `EmbeddedResource`).
+
+Pydantic AI provides methods to discover and read resources from MCP servers:
+
+- [`list_resources()`][pydantic_ai.mcp.MCPServer.list_resources] - List all available resources on the server
+- [`list_resource_templates()`][pydantic_ai.mcp.MCPServer.list_resource_templates] - List resource templates with parameter placeholders
+- [`read_resource(uri)`][pydantic_ai.mcp.MCPServer.read_resource] - Read the contents of a specific resource by URI
+
+Resources are automatically converted: text content is returned as `str`, and binary content is returned as [`BinaryContent`][pydantic_ai.messages.BinaryContent].
+
+Before consuming resources, we need to run a server that exposes some:
+
+```python {title="mcp_resource_server.py"}
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP('Pydantic AI MCP Server')
+log_level = 'unset'
+
+
+@mcp.resource('resource://user_name.txt', mime_type='text/plain')
+async def user_name_resource() -> str:
+    return 'Alice'
+
+
+if __name__ == '__main__':
+    mcp.run()
+```
+
+Then we can create the client:
+
+```python {title="mcp_resources.py", requires="mcp_resource_server.py"}
+import asyncio
+
+from pydantic_ai.mcp import MCPServerStdio
+
+
+async def main():
+    server = MCPServerStdio('python', args=['-m', 'mcp_resource_server'])
+
+    async with server:
+        # List all available resources
+        resources = await server.list_resources()
+        for resource in resources:
+            print(f' - {resource.name}: {resource.uri} ({resource.mime_type})')
+            #> - user_name_resource: resource://user_name.txt (text/plain)
+
+        # Read a text resource
+        user_name = await server.read_resource('resource://user_name.txt')
+        print(f'Text content: {user_name}')
+        #> Text content: Alice
+
+
+if __name__ == '__main__':
+    asyncio.run(main())
+```
+
+_(This example is complete, it can be run "as is")_
+
 ## Custom TLS / SSL configuration
 
 In some environments you need to tweak how HTTPS connections are established –
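The text-versus-binary conversion the new docs describe follows from the MCP content types: resource contents carry either a `text` field or a base64-encoded `blob` field. A sketch of that dispatch rule (not pydantic-ai's actual code, which operates on typed MCP objects rather than plain dicts):

```python
import base64


def convert_resource_contents(contents: dict):
    # Text resources surface as str; binary resources are base64 in
    # MCP's wire format and are decoded back to raw bytes.
    if 'text' in contents:
        return contents['text']
    return base64.b64decode(contents['blob'])
```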

docs/models/openai.md

Lines changed: 1 addition & 29 deletions
@@ -233,7 +233,7 @@ agent = Agent(model)
 ```
 
 Various providers also have their own provider classes so that you don't need to specify the base URL yourself and you can use the standard `<PROVIDER>_API_KEY` environment variable to set the API key.
-When a provider has its own provider class, you can use the `Agent("<provider>:<model>")` shorthand, e.g. `Agent("deepseek:deepseek-chat")` or `Agent("openrouter:google/gemini-2.5-pro-preview")`, instead of building the `OpenAIChatModel` explicitly. Similarly, you can pass the provider name as a string to the `provider` argument on `OpenAIChatModel` instead of instantiating the provider class explicitly.
+When a provider has its own provider class, you can use the `Agent("<provider>:<model>")` shorthand, e.g. `Agent("deepseek:deepseek-chat")` or `Agent("moonshotai:kimi-k2-0711-preview")`, instead of building the `OpenAIChatModel` explicitly. Similarly, you can pass the provider name as a string to the `provider` argument on `OpenAIChatModel` instead of instantiating the provider class explicitly.
 
 #### Model Profile

@@ -385,34 +385,6 @@ agent = Agent(model)
 ...
 ```
 
-### OpenRouter
-
-To use [OpenRouter](https://openrouter.ai), first create an API key at [openrouter.ai/keys](https://openrouter.ai/keys).
-
-You can set the `OPENROUTER_API_KEY` environment variable and use [`OpenRouterProvider`][pydantic_ai.providers.openrouter.OpenRouterProvider] by name:
-
-```python
-from pydantic_ai import Agent
-
-agent = Agent('openrouter:anthropic/claude-3.5-sonnet')
-...
-```
-
-Or initialise the model and provider directly:
-
-```python
-from pydantic_ai import Agent
-from pydantic_ai.models.openai import OpenAIChatModel
-from pydantic_ai.providers.openrouter import OpenRouterProvider
-
-model = OpenAIChatModel(
-    'anthropic/claude-3.5-sonnet',
-    provider=OpenRouterProvider(api_key='your-openrouter-api-key'),
-)
-agent = Agent(model)
-...
-```
-
 ### Vercel AI Gateway
 
 To use [Vercel's AI Gateway](https://vercel.com/docs/ai-gateway), first follow the [documentation](https://vercel.com/docs/ai-gateway) instructions on obtaining an API key or OIDC token.

docs/models/openrouter.md

Lines changed: 54 additions & 0 deletions
@@ -0,0 +1,54 @@
+# OpenRouter
+
+## Install
+
+To use `OpenRouterModel`, you need to either install `pydantic-ai`, or install `pydantic-ai-slim` with the `openrouter` optional group:
+
+```bash
+pip/uv-add "pydantic-ai-slim[openrouter]"
+```
+
+## Configuration
+
+To use [OpenRouter](https://openrouter.ai), first create an API key at [openrouter.ai/keys](https://openrouter.ai/keys).
+
+You can set the `OPENROUTER_API_KEY` environment variable and use [`OpenRouterProvider`][pydantic_ai.providers.openrouter.OpenRouterProvider] by name:
+
+```python
+from pydantic_ai import Agent
+
+agent = Agent('openrouter:anthropic/claude-3.5-sonnet')
+...
+```
+
+Or initialise the model and provider directly:
+
+```python
+from pydantic_ai import Agent
+from pydantic_ai.models.openrouter import OpenRouterModel
+from pydantic_ai.providers.openrouter import OpenRouterProvider
+
+model = OpenRouterModel(
+    'anthropic/claude-3.5-sonnet',
+    provider=OpenRouterProvider(api_key='your-openrouter-api-key'),
+)
+agent = Agent(model)
+...
+```
+
+## App Attribution
+
+OpenRouter has an [app attribution](https://openrouter.ai/docs/app-attribution) feature to track your application in their public ranking and analytics.
+
+You can pass in an `app_url` and `app_title` when initializing the provider to enable app attribution.
+
+```python
+from pydantic_ai.providers.openrouter import OpenRouterProvider
+
+provider = OpenRouterProvider(
+    api_key='your-openrouter-api-key',
+    app_url='https://your-app.com',
+    app_title='Your App',
+)
+...
+```
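For context on what `app_url` and `app_title` become on the wire: OpenRouter's app-attribution docs describe attribution being sent as the `HTTP-Referer` and `X-Title` request headers. How `OpenRouterProvider` wires these internally is not shown in this commit, so the helper below is a hedged sketch of the mapping only:

```python
def attribution_headers(app_url: str, app_title: str) -> dict:
    # Per OpenRouter's app-attribution docs: the app URL travels in
    # HTTP-Referer and the display name in X-Title.
    return {'HTTP-Referer': app_url, 'X-Title': app_title}
```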

docs/models/overview.md

Lines changed: 17 additions & 16 deletions
@@ -5,32 +5,32 @@ Pydantic AI is model-agnostic and has built-in support for multiple model provid
 * [OpenAI](openai.md)
 * [Anthropic](anthropic.md)
 * [Gemini](google.md) (via two different APIs: Generative Language API and VertexAI API)
-* [Groq](groq.md)
-* [Mistral](mistral.md)
-* [Cohere](cohere.md)
 * [Bedrock](bedrock.md)
+* [Cohere](cohere.md)
+* [Groq](groq.md)
 * [Hugging Face](huggingface.md)
+* [Mistral](mistral.md)
+* [OpenRouter](openrouter.md)
 * [Outlines](outlines.md)
 
 ## OpenAI-compatible Providers
 
 In addition, many providers are compatible with the OpenAI API, and can be used with `OpenAIChatModel` in Pydantic AI:
 
+- [Azure AI Foundry](openai.md#azure-ai-foundry)
+- [Cerebras](openai.md#cerebras)
 - [DeepSeek](openai.md#deepseek)
-- [Grok (xAI)](openai.md#grok-xai)
-- [Ollama](openai.md#ollama)
-- [OpenRouter](openai.md#openrouter)
-- [Vercel AI Gateway](openai.md#vercel-ai-gateway)
-- [Perplexity](openai.md#perplexity)
 - [Fireworks AI](openai.md#fireworks-ai)
-- [Together AI](openai.md#together-ai)
-- [Azure AI Foundry](openai.md#azure-ai-foundry)
-- [Heroku](openai.md#heroku-ai)
 - [GitHub Models](openai.md#github-models)
-- [Cerebras](openai.md#cerebras)
+- [Grok (xAI)](openai.md#grok-xai)
+- [Heroku](openai.md#heroku-ai)
 - [LiteLLM](openai.md#litellm)
 - [Nebius AI Studio](openai.md#nebius-ai-studio)
+- [Ollama](openai.md#ollama)
 - [OVHcloud AI Endpoints](openai.md#ovhcloud-ai-endpoints)
+- [Perplexity](openai.md#perplexity)
+- [Together AI](openai.md#together-ai)
+- [Vercel AI Gateway](openai.md#vercel-ai-gateway)
 
 Pydantic AI also comes with [`TestModel`](../api/models/test.md) and [`FunctionModel`](../api/models/function.md)
 for testing and development.
@@ -180,7 +180,7 @@ contains all the exceptions encountered during the `run` execution.
 === "Python >=3.11"
 
     ```python {title="fallback_model_failure.py" py="3.11"}
-    from pydantic_ai import Agent, ModelHTTPError
+    from pydantic_ai import Agent, ModelAPIError
     from pydantic_ai.models.anthropic import AnthropicModel
     from pydantic_ai.models.fallback import FallbackModel
     from pydantic_ai.models.openai import OpenAIChatModel

@@ -192,7 +192,7 @@ contains all the exceptions encountered during the `run` execution.
     agent = Agent(fallback_model)
     try:
         response = agent.run_sync('What is the capital of France?')
-    except* ModelHTTPError as exc_group:
+    except* ModelAPIError as exc_group:
         for exc in exc_group.exceptions:
             print(exc)
     ```

@@ -206,7 +206,7 @@ contains all the exceptions encountered during the `run` execution.
     ```python {title="fallback_model_failure.py" noqa="F821" test="skip"}
     from exceptiongroup import catch
 
-    from pydantic_ai import Agent, ModelHTTPError
+    from pydantic_ai import Agent, ModelAPIError
     from pydantic_ai.models.anthropic import AnthropicModel
     from pydantic_ai.models.fallback import FallbackModel
     from pydantic_ai.models.openai import OpenAIChatModel

@@ -222,10 +222,11 @@ contains all the exceptions encountered during the `run` execution.
     fallback_model = FallbackModel(openai_model, anthropic_model)
 
     agent = Agent(fallback_model)
-    with catch({ModelHTTPError: model_status_error_handler}):
+    with catch({ModelAPIError: model_status_error_handler}):
         response = agent.run_sync('What is the capital of France?')
     ```
 
 By default, the `FallbackModel` only moves on to the next model if the current model raises a
+[`ModelAPIError`][pydantic_ai.exceptions.ModelAPIError], which includes
 [`ModelHTTPError`][pydantic_ai.exceptions.ModelHTTPError]. You can customize this behavior by
 passing a custom `fallback_on` argument to the `FallbackModel` constructor.
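The docs change above hinges on `ModelHTTPError` being a subclass of `ModelAPIError`, so catching the broader class also covers HTTP failures. A standalone illustration with stand-in classes (not pydantic-ai's real exception types, whose constructors take more arguments):

```python
class ModelAPIError(Exception):
    """Stand-in for pydantic_ai.exceptions.ModelAPIError."""


class ModelHTTPError(ModelAPIError):
    """Stand-in: HTTP errors are a subclass of API errors."""


def default_fallback_on(exc: Exception) -> bool:
    # Mirrors the documented default: FallbackModel moves on whenever
    # the current model raises a ModelAPIError, which by subclassing
    # includes ModelHTTPError.
    return isinstance(exc, ModelAPIError)
```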

docs/ui/ag-ui.md

Lines changed: 2 additions & 0 deletions
@@ -178,6 +178,8 @@ validate state contained in [`RunAgentInput.state`](https://docs.ag-ui.com/sdk/j
 
 If the `state` field's type is a Pydantic `BaseModel` subclass, the raw state dictionary on the request is automatically validated. If not, you can validate the raw value yourself in your dependencies dataclass's `__post_init__` method.
 
+If AG-UI state is provided but your dependencies do not implement [`StateHandler`][pydantic_ai.ag_ui.StateHandler], Pydantic AI will emit a warning and ignore the state. Use [`StateDeps`][pydantic_ai.ag_ui.StateDeps] or a custom [`StateHandler`][pydantic_ai.ag_ui.StateHandler] implementation to receive and validate the incoming state.
+
 
 ```python {title="ag_ui_state.py"}
 from pydantic import BaseModel
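The warn-and-ignore behavior added in this hunk can be sketched in plain Python. The classes and function below are illustrations only: the real `StateHandler` is a protocol in `pydantic_ai.ag_ui`, and the actual delivery logic lives inside the AG-UI adapter.

```python
import warnings


class StateHandler:
    """Stand-in for the pydantic_ai.ag_ui.StateHandler protocol."""

    state: dict = {}


def deliver_state(deps: object, state: dict) -> bool:
    # Sketch of the documented rule: state is only delivered when deps
    # implement StateHandler; otherwise emit a warning and ignore it.
    if isinstance(deps, StateHandler):
        deps.state = state
        return True
    warnings.warn('AG-UI state was provided but deps do not implement StateHandler')
    return False
```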

mkdocs.yml

Lines changed: 2 additions & 1 deletion
@@ -32,8 +32,9 @@ nav:
         - models/bedrock.md
         - models/cohere.md
         - models/groq.md
-        - models/mistral.md
         - models/huggingface.md
+        - models/mistral.md
+        - models/openrouter.md
         - models/outlines.md
     - Tools & Toolsets:
         - tools.md

pydantic_ai_slim/pydantic_ai/__init__.py

Lines changed: 2 additions & 0 deletions
@@ -24,6 +24,7 @@
     CallDeferred,
     FallbackExceptionGroup,
     IncompleteToolCall,
+    ModelAPIError,
     ModelHTTPError,
     ModelRetry,
     UnexpectedModelBehavior,

@@ -126,6 +127,7 @@
     'CallDeferred',
     'ApprovalRequired',
     'ModelRetry',
+    'ModelAPIError',
     'ModelHTTPError',
     'FallbackExceptionGroup',
     'IncompleteToolCall',
