Commit a1bdeec

Merge pull request #1938 from jc2409/main

MCP AI Agent Learning Path

2 parents 358373c + 0050ee6 commit a1bdeec

6 files changed: +350 -0 lines changed
Lines changed: 50 additions & 0 deletions

@@ -0,0 +1,50 @@

---
title: Deploy an MCP server on a Raspberry Pi 5 and interact with it using an AI agent

minutes_to_complete: 30

who_is_this_for: This Learning Path targets LLM and IoT developers who already know their way around Large Language Model (LLM) concepts and networking. It walks you through deploying a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5 and shows you how to interact with it via the OpenAI Agents SDK.

learning_objectives:
    - Deploy a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5
    - Design and register custom tools for the AI agent
    - Create custom endpoints
    - Learn uv, a fast, efficient Python package manager

prerequisites:
    - A Raspberry Pi 5
    - Basic understanding of Python and prompt engineering
    - Understanding of LLM and AI agent fundamentals

author: Andrew Choi

skilllevels: Introductory
subjects: ML
armips:
    - Cortex-A76
tools_software_languages:
    - Python
    - IoT
    - AI
    - MCP

operatingsystems:
    - Linux

further_reading:
    - resource:
        title: fastmcp
        link: https://github.com/jlowin/fastmcp
        type: documentation
    - resource:
        title: OpenAI Agents SDK
        link: https://openai.github.io/openai-agents-python/
        type: documentation

### FIXED, DO NOT MODIFY
# ================================================================================
weight: 1                       # _index.md always has weight of 1 to order correctly
layout: "learningpathall"       # All files under learning paths have this same wrapper
learning_path_main_page: "yes"  # This should be surfaced when looking for related content. Only set for _index.md of learning path content.
---
Lines changed: 8 additions & 0 deletions

@@ -0,0 +1,8 @@

---
# ================================================================================
# FIXED, DO NOT MODIFY THIS FILE
# ================================================================================
weight: 21                  # Set to always be larger than the content in this path to be at the end of the navigation.
title: "Next Steps"         # Always the same, html page title.
layout: "learningpathall"   # All files under learning paths have this same wrapper for Hugo processing.
---
Lines changed: 82 additions & 0 deletions

@@ -0,0 +1,82 @@

---
title: Introduction to Model Context Protocol and uv
weight: 2

### FIXED, DO NOT MODIFY
layout: learningpathall
---

## Model Context Protocol (MCP)

The **Model Context Protocol (MCP)** is an open specification for wiring Large Language Model (LLM) agents to the *context* they need — whether that context is a database, a local sensor, or a SaaS API.
Think of it as USB-C for AI: once a tool or data source speaks MCP, any compliant LLM client can “plug in” and start using it immediately.

### Why use MCP?
- **Plug-and-play integrations:** A growing catalog of pre-built MCP servers (filesystem, shell, vector stores, web scraping, and more) gives your agent instant superpowers with zero custom glue code.

- **Model/vendor agnostic:** Because the protocol lives outside the model, you can swap GPT-4, Claude, or your own fine-tuned model without touching the integration layer.

- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data never leaves the perimeter unless you choose.

- **Cross-ecosystem momentum:** Recent roll-outs—from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support—show the spec is gathering real-world traction.

### High-level architecture
![mcp server](./mcp.png)
- **MCP Host:** the LLM-powered application (Claude Desktop, an IDE plugin, the OpenAI Agents SDK, and so on).
- **MCP Client:** the runtime shim that keeps a 1-to-1 connection with each server.
- **MCP Server:** a lightweight process that advertises tools (functions) over MCP.
- **Local data sources:** files, databases, or sensors your server can read directly.
- **Remote services:** external APIs the server can call on the host’s behalf.
31+
{{% notice Note %}}
32+
Learn more about AI Agents in the [AI Agent on CPU learning path](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
33+
{{% /notice %}}
34+
35+
## UV: The Fast, All-in-One Python Package Manager
36+
37+
**uv** is a next-generation, Rust-based package manager that unifies pip, virtualenv, Poetry, and more—offering 10×–100× faster installs, built-in virtual environment handling, robust lockfiles, and full compatibility with the Python ecosystem.
38+
39+
### Install uv
40+
- macOS / Linux
41+
```bash
42+
curl -LsSf https://astral.sh/uv/install.sh | sh
43+
```
44+
- Windows
45+
```bash
46+
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
47+
```
48+
49+
### Initialize a Project
50+
1. Create & enter your project folder:
51+
```bash
52+
mkdir my-project && cd my-project
53+
```
54+
2. Run
55+
```bash
56+
uv init
57+
```
58+
59+
This scaffolds:
60+
- .venv/ (auto-created virtual environment)
61+
- pyproject.toml (project metadata & dependencies)
62+
- .python-version (pinned interpreter)
63+
- README.md, .gitignore, and a sample main.py
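
For reference, a freshly scaffolded pyproject.toml looks roughly like the sketch below; the exact fields and the pinned Python version vary with your uv release and interpreter.

```toml
[project]
name = "my-project"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []
```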

### Install Dependencies
- Add one or more packages to your project:
```bash
uv add requests numpy pandas
```
> Updates both pyproject.toml and the lockfile (uv.lock)

- Remove a package (and its unused sub-dependencies):
```bash
uv remove numpy
```

- Install from an existing requirements.txt (for example, when migrating):
```bash
uv pip install -r requirements.txt
```

All installs happen inside your project’s .venv, and uv’s lockfile guarantees repeatable environments.
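
Because uv.lock pins exact versions, a collaborator (or your future self on a fresh machine) can reproduce the environment. A minimal sketch, assuming the project lives in a git repository (`<YOUR_PROJECT_REPO>` is a placeholder):

```bash
# recreate the locked environment on another machine
git clone <YOUR_PROJECT_REPO> && cd <YOUR_PROJECT_REPO>
uv sync          # creates .venv and installs the exact locked versions
uv run main.py   # run the sample script inside that environment
```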
Lines changed: 95 additions & 0 deletions

@@ -0,0 +1,95 @@

---
title: Build & Run an AI Agent on Your Workstation
weight: 4

### FIXED, DO NOT MODIFY
layout: learningpathall
---

### Create an AI Agent and point it at your Pi's MCP Server
1. Bootstrap the agent project:
```bash
# create & enter folder
mkdir mcp-agent && cd mcp-agent
```
2. Scaffold with **uv**:
```bash
uv init
```
3. Install the **OpenAI Agents SDK** and **python-dotenv**:
```bash
uv add openai-agents python-dotenv
```
4. Create a `.env` file with your OpenAI key:
```bash
echo -n "OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>" > .env
```

### Write the Agent Client (main.py)
```python
import asyncio, os
from dotenv import load_dotenv

# disable Agents SDK tracing for cleaner output
os.environ["OPENAI_AGENTS_DISABLE_TRACING"] = "1"
load_dotenv()

from agents import Agent, Runner, set_default_openai_key
from agents.mcp import MCPServer, MCPServerSse
from agents.model_settings import ModelSettings

async def run(mcp_servers: list[MCPServer]):
    set_default_openai_key(os.getenv("OPENAI_API_KEY"))

    agent = Agent(
        model="gpt-4.1-2025-04-14",
        name="Assistant",
        instructions="Use the tools to answer the user's query",
        mcp_servers=mcp_servers,
        model_settings=ModelSettings(tool_choice="required"),
    )

    for message in ["What is the CPU temperature?", "How is the weather in Cambridge?"]:
        print(f"Running: {message}")
        result = await Runner.run(starting_agent=agent, input=message)
        print(f"Response: {result.final_output}")

async def main():
    # replace URL with your ngrok-generated endpoint
    url = "<YOUR_NGROK_URL>/sse"

    async with MCPServerSse(
        name="RPI5 MCP Server",
        params={"url": url},
        client_session_timeout_seconds=60,
    ) as server1:
        await run([server1])

if __name__ == "__main__":
    asyncio.run(main())
```

### Execute the Agent
```bash
uv run main.py
```
You should see output like:
```output
Running: What is the CPU temperature?
Response: The current CPU temperature is 48.8°C.
Running: How is the weather in Cambridge?
Response: The weather in Cambridge is currently partly cloudy with a temperature of around 10°C. The wind is blowing at approximately 17 km/h. Visibility is good at 10 km, and there is no precipitation expected at the moment. The weather should be pleasant throughout the day with temperatures rising to about 15°C by midday.
```

Congratulations! Your local AI agent just called the MCP server on your Raspberry Pi and fetched the CPU temperature and the weather information.

This lightweight protocol isn’t just a game-changer for LLM developers—it also empowers IoT engineers to transform real-world data streams and give AI direct, reliable control over any connected device.

### Next Steps
- **Expand Your Toolset**
  - Write additional `@mcp.tool()` functions for Pi peripherals (GPIO pins, camera, I²C sensors, etc.); see the sketch after this list
  - Combine multiple MCP servers (e.g. filesystem, web-scraper, vector-store memory) for richer context

- **Integrate with IoT Platforms**
  - Hook into Home Assistant or Node-RED via MCP
  - Trigger real-world actions (turn on LEDs, read environmental sensors, control relays)
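
As a starting point for the peripherals idea above, here is a hedged sketch of one extra tool you could append to the `server.py` from the previous section. It assumes the gpiozero library (`uv add gpiozero`) and an LED wired to GPIO 17; both the pin and the tool name are illustrative, not part of the original code.

```python
# Illustrative addition to server.py; assumes `uv add gpiozero` and an
# LED wired to GPIO pin 17 (adjust to your own hardware).
from gpiozero import LED

led = LED(17)

@mcp.tool()
def set_led(on: bool) -> str:
    """
    Description: Turn the LED on GPIO 17 on or off
    Args:
        on: True to switch the LED on, False to switch it off
    Return: The resulting LED state
    """
    print(f"[debug-server] set_led({on})")
    if on:
        led.on()
    else:
        led.off()
    return "LED is on" if on else "LED is off"
```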
Lines changed: 115 additions & 0 deletions

@@ -0,0 +1,115 @@

---
title: Set Up an MCP Server on Your Raspberry Pi
weight: 3

### FIXED, DO NOT MODIFY
layout: learningpathall
---

## Expose Raspberry Pi MCP Server via ngrok

This guide shows you how to:

1. Install **uv** (the Rust-powered Python manager)
2. Bootstrap a simple **MCP** server on your Raspberry Pi that reads the CPU temperature and fetches weather data
3. Expose it to the internet with **ngrok**

### Prerequisites

- A **Raspberry Pi 5** (or other Armv8 Pi) running Raspberry Pi OS (64-bit)
- Basic familiarity with Python and the terminal


#### 1. Install uv
In the Raspberry Pi terminal:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

{{% notice Note %}}
After the script finishes, restart your terminal so that the uv command is on your PATH.
{{% /notice %}}

#### 2. Bootstrap the MCP Project
1. Create a project directory and enter it:
```bash
mkdir mcp
cd mcp
```
2. Initialize with uv (this creates pyproject.toml, .venv/, etc.):
```bash
uv init
```
3. Install the dependencies:
```bash
uv pip install fastmcp==2.2.10
uv add requests
```

#### 3. Write Your MCP Server (server.py)
1. Create the server file:
```bash
touch server.py
```
2. Edit `server.py` with the following contents:
```python
import subprocess, re
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("RaspberryPi MCP Server")

@mcp.tool()
def read_temp():
    """
    Description: Raspberry Pi's CPU core temperature
    Return: Temperature in °C (float)
    """
    print("[debug-server] read_temp()")

    out = subprocess.check_output(["vcgencmd", "measure_temp"]).decode()
    temp_c = float(re.search(r"[-\d.]+", out).group())
    return temp_c

@mcp.tool()
def get_current_weather(city: str) -> str:
    """
    Description: Get current weather data for the {city}
    Args:
        city: Name of the city
    Return: Current weather data for the city
    """
    print(f"[debug-server] get_current_weather({city})")

    endpoint = "https://wttr.in"
    response = requests.get(f"{endpoint}/{city}")
    return response.text

if __name__ == "__main__":
    mcp.run(transport="sse")
```

#### 4. Run the MCP Server
```bash
uv run server.py
```
By default, FastMCP will listen on port **8000** and serve your tools via **Server-Sent Events (SSE)**.
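
Before tunneling the server to the internet, you can sanity-check it locally. This is a quick smoke test, assuming the defaults above and the `/sse` path that the agent client uses later:

```bash
# From a second terminal on the Pi: a healthy server holds the connection
# open and streams SSE events instead of returning immediately.
curl -N http://localhost:8000/sse
```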

#### 5. Install & Configure ngrok
1. Add ngrok’s APT repo and install:
```bash
curl -sSL https://ngrok-agent.s3.amazonaws.com/ngrok.asc \
  | sudo tee /etc/apt/trusted.gpg.d/ngrok.asc >/dev/null \
  && echo "deb https://ngrok-agent.s3.amazonaws.com buster main" \
  | sudo tee /etc/apt/sources.list.d/ngrok.list \
  && sudo apt update \
  && sudo apt install ngrok
```
2. Authenticate your account:
```bash
ngrok config add-authtoken <YOUR_NGROK_AUTHTOKEN>
```
3. Expose port 8000:
```bash
ngrok http 8000
```
4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`)—you’ll use this as your MCP endpoint.
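
The same smoke test works end-to-end through the tunnel; this sketch reuses the example URL from step 4, so substitute the URL ngrok actually printed for you:

```bash
# verify the public endpoint (replace with your own ngrok URL)
curl -N https://abcd1234.ngrok-free.app/sse
```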