
Commit 61b077b

Remove 'only on 3.10+' callouts (#2733)
1 parent 7b55b8b commit 61b077b

File tree

9 files changed: +52 -56 lines changed


docs/ag-ui.md

Lines changed: 3 additions & 3 deletions
@@ -119,7 +119,7 @@ This will expose the agent as an AG-UI server, and your frontend can start sendi

This example uses [`Agent.to_ag_ui()`][pydantic_ai.agent.AbstractAgent.to_ag_ui] to turn the agent into a stand-alone ASGI application:

-```py {title="agent_to_ag_ui.py" py="3.10" hl_lines="4"}
+```py {title="agent_to_ag_ui.py" hl_lines="4"}
from pydantic_ai import Agent

agent = Agent('openai:gpt-4.1', instructions='Be fun!')
@@ -171,7 +171,7 @@ validate state contained in [`RunAgentInput.state`](https://docs.ag-ui.com/sdk/j
If the `state` field's type is a Pydantic `BaseModel` subclass, the raw state dictionary on the request is automatically validated. If not, you can validate the raw value yourself in your dependencies dataclass's `__post_init__` method.


-```python {title="ag_ui_state.py" py="3.10"}
+```python {title="ag_ui_state.py"}
from pydantic import BaseModel

from pydantic_ai import Agent
@@ -211,7 +211,7 @@ which returns a (subclass of)
[`BaseEvent`](https://docs.ag-ui.com/sdk/python/core/events#baseevent), which allows
for custom events and state updates.

-```python {title="ag_ui_tool_events.py" py="3.10"}
+```python {title="ag_ui_tool_events.py"}
from ag_ui.core import CustomEvent, EventType, StateSnapshotEvent
from pydantic import BaseModel
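
For reference, the `agent_to_ag_ui.py` example these hunks touch reduces to roughly the following sketch; the `app = agent.to_ag_ui()` line and the uvicorn invocation are inferred from the surrounding prose rather than shown in this diff:

```python
from pydantic_ai import Agent

agent = Agent('openai:gpt-4.1', instructions='Be fun!')
app = agent.to_ag_ui()  # stand-alone ASGI app; serve with e.g. `uvicorn agent_to_ag_ui:app`
```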

docs/examples/weather-agent.md

Lines changed: 0 additions & 1 deletion
@@ -35,7 +35,6 @@ Here's what the UI looks like for the weather agent:

{{ video('c549d8d8827ded15f326f998e428e6c3', 6) }}

-Note, to run the UI, you'll need Python 3.10+.

```bash
pip install gradio>=5.9.0

docs/graph.md

Lines changed: 19 additions & 19 deletions
@@ -122,7 +122,7 @@ class MyNode(BaseNode[MyState, None, int]): # (1)!

Here's an example of a simple graph:

-```py {title="graph_example.py" py="3.10"}
+```py {title="graph_example.py"}
from __future__ import annotations

from dataclasses import dataclass
@@ -163,11 +163,11 @@ print(result.output)
3. The graph is created with a sequence of nodes.
4. The graph is run synchronously with [`run_sync`][pydantic_graph.graph.Graph.run_sync]. The initial node is `DivisibleBy5(4)`. Because the graph doesn't use external state or deps, we don't pass `state` or `deps`.

-_(This example is complete, it can be run "as is" with Python 3.10+)_
+_(This example is complete, it can be run "as is")_

A [mermaid diagram](#mermaid-diagrams) for this graph can be generated with the following code:

-```py {title="graph_example_diagram.py" py="3.10" requires="graph_example.py"}
+```py {title="graph_example_diagram.py" requires="graph_example.py"}
from graph_example import DivisibleBy5, fives_graph

fives_graph.mermaid_code(start_node=DivisibleBy5)
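
For orientation, `graph_example.py` (referenced by both hunks above) defines a two-node graph roughly like the sketch below; the `Increment` node name and the exact `run` bodies are assumptions, since only the first lines appear in this diff:

```python
from __future__ import annotations

from dataclasses import dataclass

from pydantic_graph import BaseNode, End, Graph, GraphRunContext


@dataclass
class DivisibleBy5(BaseNode[None, None, int]):
    foo: int

    async def run(self, ctx: GraphRunContext) -> Increment | End[int]:
        if self.foo % 5 == 0:
            return End(self.foo)
        return Increment(self.foo)


@dataclass
class Increment(BaseNode):
    foo: int

    async def run(self, ctx: GraphRunContext) -> DivisibleBy5:
        return DivisibleBy5(self.foo + 1)


fives_graph = Graph(nodes=[DivisibleBy5, Increment])
result = fives_graph.run_sync(DivisibleBy5(4))
print(result.output)  # first multiple of 5 reached, i.e. 5
```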
@@ -201,7 +201,7 @@ The "state" concept in `pydantic-graph` provides an optional way to access and m

Here's an example of a graph which represents a vending machine where the user may insert coins and select a product to purchase.

-```python {title="vending_machine.py" py="3.10"}
+```python {title="vending_machine.py"}
from __future__ import annotations

from dataclasses import dataclass
@@ -304,11 +304,11 @@ async def main():
17. The return type of `CoinsInserted`'s [`run`][pydantic_graph.nodes.BaseNode.run] method is a union, meaning multiple outgoing edges are possible.
18. Unlike other nodes, `Purchase` can end the run, so the [`RunEndT`][pydantic_graph.nodes.RunEndT] generic parameter must be set. In this case it's `None` since the graph run return type is `None`.

-_(This example is complete, it can be run "as is" with Python 3.10+ — you'll need to add `asyncio.run(main())` to run `main`)_
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_

A [mermaid diagram](#mermaid-diagrams) for this graph can be generated with the following code:

-```py {title="vending_machine_diagram.py" py="3.10" requires="vending_machine.py"}
+```py {title="vending_machine_diagram.py" requires="vending_machine.py"}
from vending_machine import InsertCoin, vending_machine_graph

vending_machine_graph.mermaid_code(start_node=InsertCoin)
@@ -352,7 +352,7 @@ stateDiagram-v2
Feedback --> [*]
```

-```python {title="genai_email_feedback.py" py="3.10"}
+```python {title="genai_email_feedback.py"}
from __future__ import annotations as _annotations

from dataclasses import dataclass, field
@@ -466,7 +466,7 @@ async def main():
"""
```

-_(This example is complete, it can be run "as is" with Python 3.10+ — you'll need to add `asyncio.run(main())` to run `main`)_
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_

## Iterating Over a Graph

@@ -476,7 +476,7 @@ Sometimes you want direct control or insight into each node as the graph execute

Here's an example:

-```python {title="count_down.py" noqa="I001" py="3.10"}
+```python {title="count_down.py" noqa="I001"}
from __future__ import annotations as _annotations

from dataclasses import dataclass
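
The `count_down.py` example trimmed by this hunk is driven with `Graph.iter`; a minimal sketch of that pattern is below (the `CountDownState.counter` field and the starting value are assumptions — only the imports appear in this diff):

```python
from pydantic_graph import FullStatePersistence

from count_down import CountDown, CountDownState, count_down_graph


async def main():
    state = CountDownState(counter=3)  # field name assumed
    persistence = FullStatePersistence()
    async with count_down_graph.iter(
        CountDown(), state=state, persistence=persistence
    ) as run:
        async for node in run:  # each executed node is yielded in turn
            print('node:', node)
    print('final output:', run.result.output)
```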
@@ -524,7 +524,7 @@ Alternatively, you can drive iteration manually with the [`GraphRun.next`][pydan

Below is a contrived example that stops whenever the counter is at 2, ignoring any node runs beyond that:

-```python {title="count_down_next.py" noqa="I001" py="3.10" requires="count_down.py"}
+```python {title="count_down_next.py" noqa="I001" requires="count_down.py"}
from pydantic_graph import End, FullStatePersistence
from count_down import CountDown, CountDownState, count_down_graph

@@ -593,7 +593,7 @@ We can run the `count_down_graph` from [above](#iterating-over-a-graph), using [

As you can see in this code, `run_node` requires no external application state (apart from state persistence) to be run, meaning graphs can easily be executed by distributed execution and queueing systems.

-```python {title="count_down_from_persistence.py" noqa="I001" py="3.10" requires="count_down.py"}
+```python {title="count_down_from_persistence.py" noqa="I001" requires="count_down.py"}
from pathlib import Path

from pydantic_graph import End
@@ -637,7 +637,7 @@ async def run_node(run_id: str) -> bool: # (3)!
5. [`graph.run()`][pydantic_graph.graph.Graph.run] will return either a [node][pydantic_graph.nodes.BaseNode] or an [`End`][pydantic_graph.nodes.End] object.
6. Check if the node is an [`End`][pydantic_graph.nodes.End] object, if it is, the graph run is complete.

-_(This example is complete, it can be run "as is" with Python 3.10+ — you'll need to add `asyncio.run(main())` to run `main`)_
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_

### Example: Human in the loop.

@@ -648,7 +648,7 @@ In this example, an AI asks the user a question, the user provides an answer, th
Instead of running the entire graph in a single process invocation, we run the graph by running the process repeatedly, optionally providing an answer to the question as a command line argument.

??? example "`ai_q_and_a_graph.py` `question_graph` definition"
-```python {title="ai_q_and_a_graph.py" noqa="I001" py="3.10"}
+```python {title="ai_q_and_a_graph.py" noqa="I001"}
from __future__ import annotations as _annotations

from typing import Annotated
@@ -748,9 +748,9 @@ question_graph = Graph(
)
```

-_(This example is complete, it can be run "as is" with Python 3.10+)_
+_(This example is complete, it can be run "as is")_

-```python {title="ai_q_and_a_run.py" noqa="I001" py="3.10" requires="ai_q_and_a_graph.py"}
+```python {title="ai_q_and_a_run.py" noqa="I001" requires="ai_q_and_a_graph.py"}
import sys
from pathlib import Path

@@ -799,7 +799,7 @@ async def main():
8. To demonstrate the state persistence, we call [`load_all`][pydantic_graph.persistence.BaseStatePersistence.load_all] to get all the snapshots from the persistence instance. This will return a list of [`Snapshot`][pydantic_graph.persistence.Snapshot] objects.
9. If the node is an `Answer` object, we print the question and break out of the loop to end the process and wait for user input.

-_(This example is complete, it can be run "as is" with Python 3.10+ — you'll need to add `asyncio.run(main())` to run `main`)_
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_

For a complete example of this graph, see the [question graph example](examples/question-graph.md).

@@ -809,7 +809,7 @@ As with Pydantic AI, `pydantic-graph` supports dependency injection via a generi

As an example of dependency injection, let's modify the `DivisibleBy5` example [above](#graph) to use a [`ProcessPoolExecutor`][concurrent.futures.ProcessPoolExecutor] to run the compute load in a separate process (this is a contrived example, `ProcessPoolExecutor` wouldn't actually improve performance in this example):

-```py {title="deps_example.py" py="3.10" test="skip" hl_lines="4 10-12 35-37 48-49"}
+```py {title="deps_example.py" test="skip" hl_lines="4 10-12 35-37 48-49"}
from __future__ import annotations

import asyncio
@@ -877,7 +877,7 @@ async def main():
"""
```

-_(This example is complete, it can be run "as is" with Python 3.10+ — you'll need to add `asyncio.run(main())` to run `main`)_
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_

## Mermaid Diagrams

@@ -1025,7 +1025,7 @@ You can specify the direction of the state diagram using one of the following va

Here is an example of how to do this using 'Left to Right' (LR) instead of the default 'Top to Bottom' (TB):

-```py {title="vending_machine_diagram.py" py="3.10" requires="vending_machine.py"}
+```py {title="vending_machine_diagram.py" requires="vending_machine.py"}
from vending_machine import InsertCoin, vending_machine_graph

vending_machine_graph.mermaid_code(start_node=InsertCoin, direction='LR')

docs/mcp/client.md

Lines changed: 15 additions & 18 deletions
@@ -11,9 +11,6 @@ You need to either install [`pydantic-ai`](../install.md), or[`pydantic-ai-slim`
pip/uv-add "pydantic-ai-slim[mcp]"
```

-!!! note
-    MCP integration requires Python 3.10 or higher.
-
## Usage

Pydantic AI comes with two ways to connect to MCP servers:
@@ -38,7 +35,7 @@ You can use the [`async with agent`][pydantic_ai.Agent.__aenter__] context manag

Before creating the Streamable HTTP client, we need to run a server that supports the Streamable HTTP transport.

-```python {title="streamable_http_server.py" py="3.10" dunder_name="not_main"}
+```python {title="streamable_http_server.py" dunder_name="not_main"}
from mcp.server.fastmcp import FastMCP

app = FastMCP()
@@ -53,7 +50,7 @@ if __name__ == '__main__':

Then we can create the client:

-```python {title="mcp_streamable_http_client.py" py="3.10"}
+```python {title="mcp_streamable_http_client.py"}
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

@@ -71,7 +68,7 @@ async def main():
2. Create an agent with the MCP server attached.
3. Create a client session to connect to the server.

-_(This example is complete, it can be run "as is" with Python 3.10+ — you'll need to add `asyncio.run(main())` to run `main`)_
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_

**What's happening here?**

@@ -113,7 +110,7 @@ deno run \
jsr:@pydantic/mcp-run-python sse
```

-```python {title="mcp_sse_client.py" py="3.10"}
+```python {title="mcp_sse_client.py"}
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerSSE

@@ -132,13 +129,13 @@ async def main():
2. Create an agent with the MCP server attached.
3. Create a client session to connect to the server.

-_(This example is complete, it can be run "as is" with Python 3.10+ — you'll need to add `asyncio.run(main())` to run `main`)_
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_

### MCP "stdio" Server

The other transport offered by MCP is the [stdio transport](https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#stdio) where the server is run as a subprocess and communicates with the client over `stdin` and `stdout`. In this case, you'd use the [`MCPServerStdio`][pydantic_ai.mcp.MCPServerStdio] class.

-```python {title="mcp_stdio_client.py" py="3.10"}
+```python {title="mcp_stdio_client.py"}
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

@@ -174,7 +171,7 @@ the customisation of tool call requests and their responses.
A common use case for this is to inject metadata to the requests which the server
call needs.

-```python {title="mcp_process_tool_call.py" py="3.10"}
+```python {title="mcp_process_tool_call.py"}
from typing import Any

from pydantic_ai import Agent
@@ -214,7 +211,7 @@ When connecting to multiple MCP servers that might provide tools with the same n

This allows you to use multiple servers that might have overlapping tool names without conflicts:

-```python {title="mcp_tool_prefix_http_client.py" py="3.10"}
+```python {title="mcp_tool_prefix_http_client.py"}
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerSSE

@@ -247,7 +244,7 @@ All HTTP-based MCP client classes
parameter that lets you pass your own pre-configured
[`httpx.AsyncClient`](https://www.python-httpx.org/async/).

-```python {title="mcp_custom_tls_client.py" py="3.10"}
+```python {title="mcp_custom_tls_client.py"}
import httpx
import ssl

@@ -324,7 +321,7 @@ Let's say we have an MCP server that wants to use sampling (in this case to gene

??? example "Sampling MCP Server"

-```python {title="generate_svg.py" py="3.10"}
+```python {title="generate_svg.py"}
import re
from pathlib import Path

@@ -362,7 +359,7 @@ Let's say we have an MCP server that wants to use sampling (in this case to gene

Using this server with an `Agent` will automatically allow sampling:

-```python {title="sampling_mcp_client.py" py="3.10" requires="generate_svg.py"}
+```python {title="sampling_mcp_client.py" requires="generate_svg.py"}
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

@@ -378,11 +375,11 @@ async def main():
#> Image file written to robot_punk.svg.
```

-_(This example is complete, it can be run "as is" with Python 3.10+)_
+_(This example is complete, it can be run "as is")_

You can disallow sampling by setting [`allow_sampling=False`][pydantic_ai.mcp.MCPServer.allow_sampling] when creating the server reference, e.g.:

-```python {title="sampling_disallowed.py" hl_lines="6" py="3.10"}
+```python {title="sampling_disallowed.py" hl_lines="6"}
from pydantic_ai.mcp import MCPServerStdio

server = MCPServerStdio(
@@ -418,7 +415,7 @@ This allows for a more interactive and user-friendly experience, especially for

To enable elicitation, provide an [`elicitation_callback`][pydantic_ai.mcp.MCPServer.elicitation_callback] function when creating your MCP server instance:

-```python {title="restaurant_server.py" py="3.10"}
+```python {title="restaurant_server.py"}
from mcp.server.fastmcp import Context, FastMCP
from pydantic import BaseModel, Field

@@ -454,7 +451,7 @@ if __name__ == '__main__':

This server demonstrates elicitation by requesting structured booking details from the client when the `book_table` tool is called. Here's how to create a client that handles these elicitation requests:

-```python {title="client_example.py" py="3.10" requires="restaurant_server.py" test="skip"}
+```python {title="client_example.py" requires="restaurant_server.py" test="skip"}
import asyncio
from typing import Any
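
Across these hunks the client-side pattern is the same; a minimal sketch, assuming the current `toolsets=` parameter on `Agent` and a local server URL (neither shown in this diff):

```python
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

server = MCPServerStreamableHTTP('http://localhost:8000/mcp')  # URL assumed
agent = Agent('openai:gpt-4.1', toolsets=[server])  # `toolsets=` assumed


async def main():
    async with agent:  # opens the MCP connection for the run
        result = await agent.run('What tools do you have available?')
    print(result.output)
```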

docs/mcp/run-python.md

Lines changed: 2 additions & 2 deletions
@@ -52,7 +52,7 @@ Usage of `jsr:@pydantic/mcp-run-python` with Pydantic AI is described in the [cl

As well as using this server with Pydantic AI, it can be connected to other MCP clients. For clarity, in this example we connect directly using the [Python MCP client](https://github.com/modelcontextprotocol/python-sdk).

-```python {title="mcp_run_python.py" py="3.10"}
+```python {title="mcp_run_python.py"}
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

@@ -127,7 +127,7 @@ As introduced in PEP 723, explained [here](https://packaging.python.org/en/lates

This allows use of dependencies that aren't imported in the code, and is more explicit.

-```py {title="inline_script_metadata.py" py="3.10" requires="mcp_run_python.py"}
+```py {title="inline_script_metadata.py" requires="mcp_run_python.py"}
from mcp import ClientSession
from mcp.client.stdio import stdio_client
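
For reference, connecting to `mcp-run-python` directly with the Python MCP SDK follows this shape; the deno flags and the `run_python_code` tool name are assumptions not shown in this diff:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Deno flags are assumed; check the mcp-run-python docs for the exact permissions.
server_params = StdioServerParameters(
    command='deno',
    args=['run', '-A', 'jsr:@pydantic/mcp-run-python', 'stdio'],
)


async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                'run_python_code',  # tool name assumed
                {'python_code': 'print(1 + 2)'},
            )
            print(result.content)


asyncio.run(main())
```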

docs/mcp/server.md

Lines changed: 5 additions & 5 deletions
@@ -6,7 +6,7 @@ Pydantic AI models can also be used within MCP Servers.

Here's a simple example of a [Python MCP server](https://github.com/modelcontextprotocol/python-sdk) using Pydantic AI within a tool call:

-```py {title="mcp_server.py" py="3.10"}
+```py {title="mcp_server.py"}
from mcp.server.fastmcp import FastMCP

from pydantic_ai import Agent
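
The trimmed `mcp_server.py` example is roughly the following sketch; the server name, model, tool name and prompt are assumptions, since only the imports appear in this diff:

```python
from mcp.server.fastmcp import FastMCP

from pydantic_ai import Agent

server = FastMCP('Pydantic AI Server')  # name assumed
server_agent = Agent('openai:gpt-4.1', instructions='Reply in rhyme.')  # model/prompt assumed


@server.tool()
async def poet(theme: str) -> str:
    """Write a poem about the given theme."""
    result = await server_agent.run(f'Write a poem about {theme}.')
    return result.output


if __name__ == '__main__':
    server.run()  # defaults to the stdio transport
```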
@@ -32,7 +32,7 @@ if __name__ == '__main__':

This server can be queried with any MCP client. Here is an example using the Python SDK directly:

-```py {title="mcp_client.py" py="3.10" requires="mcp_server.py" dunder_name="not_main"}
+```py {title="mcp_client.py" requires="mcp_server.py" dunder_name="not_main"}
import asyncio
import os

@@ -70,7 +70,7 @@ When Pydantic AI agents are used within MCP servers, they can use sampling via [

We can extend the above example to use sampling so instead of connecting directly to the LLM, the agent calls back through the MCP client to make LLM calls.

-```py {title="mcp_server_sampling.py" py="3.10"}
+```py {title="mcp_server_sampling.py"}
from mcp.server.fastmcp import Context, FastMCP

from pydantic_ai import Agent
@@ -95,7 +95,7 @@ The [above](#simple-client) client does not support sampling, so if you tried to

The simplest way to support sampling in an MCP client is to [use](./client.md#mcp-sampling) a Pydantic AI agent as the client, but if you wanted to support sampling with the vanilla MCP SDK, you could do so like this:

-```py {title="mcp_client_sampling.py" py="3.10" requires="mcp_server_sampling.py"}
+```py {title="mcp_client_sampling.py" requires="mcp_server_sampling.py"}
import asyncio
from typing import Any

@@ -150,4 +150,4 @@ if __name__ == '__main__':
asyncio.run(client())
```

-_(This example is complete, it can be run "as is" with Python 3.10+)_
+_(This example is complete, it can be run "as is")_

pydantic_graph/README.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ Full documentation is available at [ai.pydantic.dev/graph](https://ai.pydantic.d

Here's a basic example:

-```python {noqa="I001" py="3.10"}
+```python {noqa="I001"}
from __future__ import annotations

from dataclasses import dataclass
