
Commit 3b7f299

Move mcp-run-python (#2776)

1 parent 7e8ebec commit 3b7f299

28 files changed: +68 -2078 lines

.github/workflows/ci.yml

Lines changed: 2 additions & 32 deletions
@@ -29,10 +29,6 @@ jobs:
       - name: Install dependencies
         run: uv sync --all-extras --all-packages --group lint

-      - uses: denoland/setup-deno@v2
-        with:
-          deno-version: v2.x
-
       - uses: pre-commit/[email protected]
         with:
           extra_args: --all-files --verbose
@@ -162,6 +158,7 @@ jobs:

       - run: mkdir .coverage

+      - run: uv run mcp-run-python example --deps=numpy
       - run: uv sync --only-dev
       - run: uv run ${{ matrix.install.command }} coverage run -m pytest --durations=100 -n auto --dist=loadgroup
         env:
@@ -200,6 +197,7 @@ jobs:
       - run: mkdir .coverage

       - run: uv sync --group dev
+      - run: uv run mcp-run-python example --deps=numpy

       - run: unset UV_FROZEN

@@ -232,10 +230,6 @@ jobs:
         with:
           enable-cache: true

-      - uses: denoland/setup-deno@v2
-        with:
-          deno-version: v2.x
-
       - run: uv run --all-extras python tests/import_examples.py

   coverage:
@@ -272,29 +266,6 @@ jobs:
           path: htmlcov
           include-hidden-files: true

-  test-mcp-run-python:
-    runs-on: ubuntu-latest
-    timeout-minutes: 5
-    env:
-      UV_PYTHON: "3.12"
-    steps:
-      - uses: actions/checkout@v4
-
-      - uses: astral-sh/setup-uv@v5
-        with:
-          enable-cache: true
-
-      - uses: denoland/setup-deno@v2
-        with:
-          deno-version: v2.x
-
-      - run: make lint-js
-
-      - run: uv run --package mcp-run-python pytest mcp-run-python -v --durations=100
-
-      - run: deno task dev warmup
-        working-directory: mcp-run-python
-
   # https://github.com/marketplace/actions/alls-green#why used for branch protection checks
   check:
     if: always()
@@ -307,7 +278,6 @@ jobs:
       - test-lowest-versions
      - test-examples
      - coverage
-      - test-mcp-run-python
    runs-on: ubuntu-latest

    steps:
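The step added in both test jobs, `uv run mcp-run-python example --deps=numpy`, runs before the pytest invocation. Below is a minimal sketch of the same invocation driven from Python, e.g. as a local warm-up script; the file name, function name, and the assumption that the step exists to pre-fetch the package's runtime dependencies are illustrative guesses, not something the workflow states.

```python
# warm_up.py: hypothetical local equivalent of the new CI step.
# Only the command itself is taken from the diff above; everything else is illustrative.
import subprocess


def warm_mcp_run_python(dep: str = 'numpy') -> None:
    """Invoke `uv run mcp-run-python example --deps=<dep>` and fail loudly if it errors."""
    subprocess.run(['uv', 'run', 'mcp-run-python', 'example', f'--deps={dep}'], check=True)


if __name__ == '__main__':
    warm_mcp_run_python()
```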

.pre-commit-config.yaml

Lines changed: 0 additions & 8 deletions
@@ -44,14 +44,6 @@ repos:
        types: [python]
        language: system
        pass_filenames: false
-      - id: lint-js
-        name: Lint JS
-        entry: make
-        args: [lint-js]
-        language: system
-        types_or: [javascript, ts, json]
-        files: "^mcp-run-python/"
-        pass_filenames: false
      - id: clai-help
        name: clai help output
        entry: uv

CLAUDE.md

Lines changed: 0 additions & 1 deletion
@@ -83,7 +83,6 @@ This is a uv workspace with multiple packages:
 - **`pydantic_graph/`**: Graph execution engine
 - **`examples/`**: Example applications
 - **`clai/`**: CLI tool
-- **`mcp-run-python/`**: MCP server implementation (Deno/TypeScript)

 ## Testing Strategy


Makefile

Lines changed: 1 addition & 14 deletions
@@ -8,12 +8,8 @@
 .pre-commit: ## Check that pre-commit is installed
 	@pre-commit -V || echo 'Please install pre-commit: https://pre-commit.com/'

-.PHONY: .deno
-.deno: ## Check that deno is installed
-	@deno --version > /dev/null 2>&1 || (printf "\033[0;31m✖ Error: deno is not installed, but is needed for mcp-run-python\033[0m\n Please install deno: https://docs.deno.com/runtime/getting_started/installation/\n" && exit 1)
-
 .PHONY: install
-install: .uv .pre-commit .deno ## Install the package, dependencies, and pre-commit for local development
+install: .uv .pre-commit ## Install the package, dependencies, and pre-commit for local development
 	uv sync --frozen --all-extras --all-packages --group lint --group docs
 	pre-commit install --install-hooks

@@ -38,10 +34,6 @@ lint: ## Lint the code
 	uv run ruff format --check
 	uv run ruff check

-.PHONY: lint-js
-lint-js: ## Lint JS and TS code
-	cd mcp-run-python && deno task lint-format
-
 .PHONY: typecheck-pyright
 typecheck-pyright:
 	@# PYRIGHT_PYTHON_IGNORE_WARNINGS avoids the overhead of making a request to github on every invocation
@@ -77,11 +69,6 @@ testcov: test ## Run tests and generate an HTML coverage report
 	@echo "building coverage html"
 	@uv run coverage html

-.PHONY: test-mrp
-test-mrp: ## Build and tests of mcp-run-python
-	cd mcp-run-python && deno task build
-	uv run --package mcp-run-python pytest mcp-run-python -v
-
 .PHONY: update-examples
 update-examples: ## Update documentation examples
 	uv run -m pytest --update-examples tests/test_examples.py

docs/mcp/client.md

Lines changed: 25 additions & 25 deletions
@@ -19,7 +19,7 @@ Pydantic AI comes with two ways to connect to MCP servers:
 - [`MCPServerSSE`][pydantic_ai.mcp.MCPServerSSE] which connects to an MCP server using the [HTTP SSE](https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#http-with-sse) transport
 - [`MCPServerStdio`][pydantic_ai.mcp.MCPServerStdio] which runs the server as a subprocess and connects to it using the [stdio](https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#stdio) transport

-Examples of all three are shown below; [mcp-run-python](run-python.md) is used as the MCP server in all examples.
+Examples of all three are shown below.

 Each MCP server instance is a [toolset](../toolsets.md) and can be registered with an [`Agent`][pydantic_ai.Agent] using the `toolsets` argument.

@@ -59,9 +59,9 @@ agent = Agent('openai:gpt-4o', toolsets=[server]) # (2)!

 async def main():
     async with agent:  # (3)!
-        result = await agent.run('How many days between 2000-01-01 and 2025-03-18?')
+        result = await agent.run('What is 7 plus 5?')
         print(result.output)
-        #> There are 9,208 days between January 1, 2000, and March 18, 2025.
+        #> The answer is 12.
 ```

 1. Define the MCP server with the URL used to connect.
@@ -97,19 +97,26 @@ Will display as follows:
 [`MCPServerSSE`][pydantic_ai.mcp.MCPServerSSE] connects over HTTP using the [HTTP + Server Sent Events transport](https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#http-with-sse) to a server.

 !!! note
-    [`MCPServerSSE`][pydantic_ai.mcp.MCPServerSSE] requires an MCP server to be running and accepting HTTP connections before running the agent. Running the server is not managed by Pydantic AI.
+    The SSE transport in MCP is deprecated, you should use Streamable HTTP instead.
+
+Before creating the SSE client, we need to run a server that supports the SSE transport.
+

-    The name "HTTP" is used since this implementation will be adapted in future to use the new
-    [Streamable HTTP](https://github.com/modelcontextprotocol/specification/pull/206) currently in development.
+```python {title="sse_server.py" dunder_name="not_main"}
+from mcp.server.fastmcp import FastMCP
+
+app = FastMCP()

-Before creating the SSE client, we need to run the server (docs [here](run-python.md)):
+@app.tool()
+def add(a: int, b: int) -> int:
+    return a + b

-```bash {title="terminal (run sse server)"}
-deno run \
-  -N -R=node_modules -W=node_modules --node-modules-dir=auto \
-  jsr:@pydantic/mcp-run-python sse
+if __name__ == '__main__':
+    app.run(transport='sse')
 ```

+Then we can create the client:
+
 ```python {title="mcp_sse_client.py"}
 from pydantic_ai import Agent
 from pydantic_ai.mcp import MCPServerSSE
@@ -120,9 +127,9 @@ agent = Agent('openai:gpt-4o', toolsets=[server]) # (2)!

 async def main():
     async with agent:  # (3)!
-        result = await agent.run('How many days between 2000-01-01 and 2025-03-18?')
+        result = await agent.run('What is 7 plus 5?')
         print(result.output)
-        #> There are 9,208 days between January 1, 2000, and March 18, 2025.
+        #> The answer is 12.
 ```

 1. Define the MCP server with the URL used to connect.
@@ -133,23 +140,16 @@ _(This example is complete, it can be run "as is" — you'll need to add `asynci

 ### MCP "stdio" Server

-The other transport offered by MCP is the [stdio transport](https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#stdio) where the server is run as a subprocess and communicates with the client over `stdin` and `stdout`. In this case, you'd use the [`MCPServerStdio`][pydantic_ai.mcp.MCPServerStdio] class.
+MCP also offers [stdio transport](https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#stdio) where the server is run as a subprocess and communicates with the client over `stdin` and `stdout`. In this case, you'd use the [`MCPServerStdio`][pydantic_ai.mcp.MCPServerStdio] class.
+
+In this example [mcp-run-python](https://github.com/pydantic/mcp-run-python) is used as the MCP server.

 ```python {title="mcp_stdio_client.py"}
 from pydantic_ai import Agent
 from pydantic_ai.mcp import MCPServerStdio

 server = MCPServerStdio(  # (1)!
-    'deno',
-    args=[
-        'run',
-        '-N',
-        '-R=node_modules',
-        '-W=node_modules',
-        '--node-modules-dir=auto',
-        'jsr:@pydantic/mcp-run-python',
-        'stdio',
-    ]
+    'uv', args=['run', 'mcp-run-python', 'stdio'], timeout=10
 )
 agent = Agent('openai:gpt-4o', toolsets=[server])

@@ -161,7 +161,7 @@ async def main():
     #> There are 9,208 days between January 1, 2000, and March 18, 2025.
 ```

-1. See [MCP Run Python](run-python.md) for more information.
+1. See [MCP Run Python](https://github.com/pydantic/mcp-run-python) for more information.

 ## Tool call customisation

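For reference, a sketch of the complete `mcp_stdio_client.py` example as it reads after this change, assembled from the context and changed lines in the hunks above. The `asyncio` entry point is added here only to make the sketch runnable as-is; running it assumes an OpenAI API key is configured and that `uv` can resolve the `mcp-run-python` package.

```python
# Sketch of mcp_stdio_client.py after this commit, assembled from the diff above.
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

# The server subprocess is now `uv run mcp-run-python stdio` rather than the old
# `deno run ... jsr:@pydantic/mcp-run-python stdio` invocation.
server = MCPServerStdio('uv', args=['run', 'mcp-run-python', 'stdio'], timeout=10)
agent = Agent('openai:gpt-4o', toolsets=[server])


async def main():
    async with agent:
        result = await agent.run('How many days between 2000-01-01 and 2025-03-18?')
        print(result.output)
        #> There are 9,208 days between January 1, 2000, and March 18, 2025.


if __name__ == '__main__':
    asyncio.run(main())
```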
docs/mcp/overview.md

Lines changed: 2 additions & 11 deletions
@@ -1,10 +1,9 @@
 # Model Context Protocol (MCP)

-Pydantic AI supports [Model Context Protocol (MCP)](https://modelcontextprotocol.io) in three ways:
+Pydantic AI supports [Model Context Protocol (MCP)](https://modelcontextprotocol.io) in two ways:

 1. [Agents](../agents.md) act as an MCP Client, connecting to MCP servers to use their tools, [learn more …](client.md)
 2. Agents can be used within MCP servers, [learn more …](server.md)
-3. As part of Pydantic AI, we're building a number of MCP servers, [see below](#mcp-servers)

 ## What is MCP?

@@ -18,12 +17,4 @@ Some examples of what this means:

 - Pydantic AI could use a web search service implemented as an MCP server to implement a deep research agent
 - Cursor could connect to the [Pydantic Logfire](https://github.com/pydantic/logfire-mcp) MCP server to search logs, traces and metrics to gain context while fixing a bug
-- Pydantic AI, or any other MCP client could connect to our [Run Python](run-python.md) MCP server to run arbitrary Python code in a sandboxed environment
-
-## MCP Servers
-
-To add functionality to Pydantic AI while making it as widely usable as possible, we're implementing some functionality as MCP servers.
-
-So far, we've only implemented one MCP server as part of Pydantic AI:
-
-- [Run Python](run-python.md): A sandboxed Python interpreter that can run arbitrary code, with a focus on security and safety.
+- Pydantic AI, or any other MCP client could connect to our [Run Python](https://github.com/pydantic/mcp-run-python) MCP server to run arbitrary Python code in a sandboxed environment
