Commit fb10a84

Merge pull request #1984 from madeline-underwood/mcp
Mcp_JA to review
2 parents 4b3e867 + c48f693 commit fb10a84

4 files changed: +101 additions, -66 deletions


content/learning-paths/cross-platform/mcp-ai-agent/_index.md

Lines changed: 11 additions & 12 deletions
@@ -1,24 +1,23 @@
 ---
-title: Deploy an MCP server on a Raspberry Pi 5 and interact with it using an AI agent
+title: Deploy an MCP Server on Raspberry Pi 5 for AI Agent Interaction using OpenAI SDK
+
 
-draft: true
-cascade:
-  draft: true
 
 minutes_to_complete: 30
 
-who_is_this_for: This Learning Path targets LLM and IoT developers who are familiar with Large Language Model (LLM) concepts and networking. You will learn how to deploy a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5 and interact with it via the OpenAI-Agent SDK.
+who_is_this_for: This Learning Path is for LLM and IoT developers who want to run and interact with AI agents on edge devices like the Raspberry Pi 5. You'll learn how to deploy a lightweight Model Context Protocol (MCP) server and use the OpenAI Agent SDK to create and register tools for intelligent local inference.
 
 learning_objectives:
-- Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5
-- Design and register custom tools for the AI Agent
-- Create custom endpoints
-- Learn about uv — a fast, efficient Python package manager
+- Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5 for local AI agent execution.
+- Use the OpenAI Agent SDK to interact with a local AI agent.
+- Design and register custom tools for the agent tasks.
+- Learn about uv — a fast, efficient Python package manager for efficient local deployment.
 
 prerequisites:
-- A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/)
-- Basic understanding of Python and prompt engineering.
-- Understanding of LLM and AI Agent fundamentals
+- A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/) with a Linux-based OS installed.
+- Familiarity with Python programming and prompt engineering techniques.
+- Basic understanding of Large Language Models (LLMs) and how they are used in local inference.
+- Understanding of AI agents and the OpenAI Agent SDK (or similar frameworks).
 
 author: Andrew Choi

Lines changed: 39 additions & 16 deletions
@@ -1,5 +1,5 @@
 ---
-title: Introduction to Model Context Protocol and uv
+title: Introduction to Model Context Protocol (MCP) and Python uv package for local AI agents
 weight: 2
 
 ### FIXED, DO NOT MODIFY
@@ -8,27 +8,50 @@ layout: learningpathall
 
 ## Model Context Protocol (MCP)
 
-The **Model Context Protocol (MCP)** is an open specification for wiring Large-Language-Model (LLM) agents to the *context* they need — whether that context is a database, a local sensor, or a SaaS API.
-Think of it as USB-C for AI: once a tool or data source speaks MCP, any compliant LLM client can “plug in” and start using it immediately.
+The Model Context Protocol (MCP) is an open specification designed to connect Large Language Model (LLM) agents to the context they need — including local sensors, databases, and SaaS APIs. It enables on-device AI agents to interact with real-world data through a plug-and-play protocol that works with any LLM framework, including the OpenAI Agent SDK.
 
 ### Why use MCP?
-- **Plug-and-play integrations:** A growing catalog of pre-built MCP servers (filesystem, shell, vector stores, web-scraping, etc.) gives your agent instant super-powers with zero custom glue code.
+- **Plug-and-play integrations:** a growing catalog of pre-built MCP servers (such as filesystem, shell, vector stores, and web-scraping) gives your agent instant superpowers - no custom integration or glue code required.
 
-- **Model/vendor agnostic:** Because the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.
+- **Model/vendor agnostic:** as the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.
 
-- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data never leaves the perimeter unless you choose.
+- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data stays within your infrastructure unless explicitly shared.
 
-- **Cross-ecosystem momentum:** Recent roll-outs — from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support — show the MCP spec is gathering real-world traction.
+- **Cross-ecosystem momentum:** recent roll-outs from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support show the MCP spec is gathering real-world traction.
 
-### High-level architecture
-![mcp server](./mcp.png)
-- **MCP Host:** the LLM-powered application (Claude Desktop, an IDE plugin, OpenAI Agents SDK, etc.).
-- **MCP Client:** the runtime shim that keeps a 1-to-1 connection with each server.
-- **MCP Server:** a lightweight process that advertises tools (functions) over MCP.
-- **Local data sources:** files, databases, or sensors your server can read directly.
-- **Remote services:** external APIs the server can call on the host’s behalf.
+## What is uv?
 
-{{% notice Note %}}
-Learn more about AI Agents in the [AI Agent on CPU learning path](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
+`uv` is a fast, Rust-built Python package manager that simplifies dependency management. It's designed for speed and reliability, making it ideal for setting up local AI agent environments on constrained or embedded devices like the Raspberry Pi 5.
+
+Some key features:
+- Built in Rust for performance.
+- Resolves dependencies and installs packages in one step.
+- Optimized for local LLM workloads, embedded AI systems, and containerized Python environments.
+
+For further information on `uv`, see: [https://github.com/astral-sh/uv](https://github.com/astral-sh/uv).
+
+### A high-level view of the architecture
+
+![Diagram of Model Context Protocol (MCP) architecture showing the interaction between MCP Host (LLM-powered app), MCP Client (runtime shim), and MCP Server, which connects to local data sources (files, sensors, databases) and remote APIs for AI agent context retrieval.](./mcp.png)
+
+*Figure: High-level view of the architecture of the Model Context Protocol (MCP) for local AI agent integration with real-world data sources.*
+
+Each component in the diagram plays a distinct role in enabling AI agents to interact with real-world context:
+
+- The **MCP Host** is the LLM-powered application (such as Claude Desktop, an IDE plugin, or an application built with the OpenAI Agents SDK).
+- The **MCP Client** is the runtime shim that keeps a 1-to-1 connection with each server.
+- The **MCP Server** is a lightweight process that advertises tools (functions) over MCP.
+- The **Local data sources** are files, databases, or sensors your server can read directly.
+- The **Remote services** are external APIs the server can call on the host’s behalf.
+
+{{% notice Learning Tip %}}
+Learn more about AI Agents in the Learning Path [Deploy an AI Agent on Arm with llama.cpp and llama-cpp-agent using KleidiAI](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
 {{% /notice %}}
 
+### Section summary
+
+This page introduces MCP and `uv` as foundational tools for building fast, secure, and modular AI agents that run efficiently on edge devices like the Raspberry Pi 5.
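Editor's note: the host/client/server roles described in this page reduce to a simple pattern: a server registers named functions as tools, and a client lists them and dispatches calls. The stdlib-only toy below sketches that pattern. It is illustrative only; it models the idea, not the real MCP wire protocol (which runs over JSON-RPC) nor the FastMCP API, and names like `list_tools` and `call_tool` are invented here.

```python
import inspect

# Toy model of what an MCP server fundamentally does: advertise named
# tools (functions) that a client can discover and invoke.
TOOLS = {}

def tool(func):
    """Register a function as a tool, loosely mimicking FastMCP's @mcp.tool()."""
    TOOLS[func.__name__] = func
    return func

@tool
def cpu_temperature() -> str:
    """On a real Pi this would shell out to vcgencmd; here it returns a stub."""
    return "48.8'C"

def list_tools():
    """The 'client' side of discovery: tool names with their call signatures."""
    return {name: str(inspect.signature(fn)) for name, fn in TOOLS.items()}

def call_tool(name: str, *args, **kwargs):
    """The 'client' side of invocation: dispatch a request to the named tool."""
    return TOOLS[name](*args, **kwargs)
```

Because the registry lives outside any particular model, any "host" that can list and call these tools can use them — which is the vendor-agnostic property the page describes.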

content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md

Lines changed: 25 additions & 17 deletions
@@ -1,14 +1,14 @@
 ---
-title: Build & Run an AI Agent on your development machine
+title: Build and run an AI agent on your development machine
 weight: 4
 
 ### FIXED, DO NOT MODIFY
 layout: learningpathall
 ---
 
-In this section you will learn how to setup an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.
+In this section, you'll learn how to set up an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.
 
-These commands were tested on an Linux Arm development machine.
+These commands were tested on a Linux Arm development machine.
 
 ### Create an AI Agent and point it at your Pi's MCP Server
 1. Install `uv` on your development machine:
@@ -20,22 +20,23 @@ curl -LsSf https://astral.sh/uv/install.sh | sh
 ```bash
 mkdir mcp-agent && cd mcp-agent
 ```
-3. Setup the directory to use `uv`:
+3. Set up the directory to use `uv`:
 ```bash
 uv init
 ```
 
 This command adds:
-- .venv/ (auto-created virtual environment)
-- pyproject.toml (project metadata & dependencies)
-- .python-version (pinned interpreter)
-- README.md, .gitignore, and a sample main.py
+- .venv/ (auto-created virtual environment).
+- pyproject.toml (project metadata and dependencies).
+- .python-version (pinned interpreter).
+- README.md, .gitignore, and a sample main.py.
 
-4. Install **OpenAI Agents SDK** + **dotenv**
+4. Install **OpenAI Agents SDK** + **dotenv**:
 ```bash
 uv add openai-agents python-dotenv
 ```
-5. Create a `.env` file with your OpenAI key:
+5. Create a `.env` file to securely store your OpenAI API key:
+
 ```bash
 echo -n "OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>" > .env
 ```
@@ -89,13 +90,16 @@ if __name__ == "__main__":
 
 ### Execute the Agent
 
-You are now ready to the run the agent and test it with your running MCP server:
+You’re now ready to run the AI Agent and test its connection to your running MCP server on the Raspberry Pi 5.
+
+Run the `main.py` Python script:
 
-Run the `main.py` python script:
 ```bash
 uv run main.py
 ```
-The output should look like:
+
+The output should look something like this:
+
 ```output
 Running: What is the CPU temperature?
 Response: The current CPU temperature is 48.8°C.
@@ -109,9 +113,13 @@ This lightweight protocol isn’t just a game-changer for LLM developers—it al
 
 ### Next Steps
 - **Expand Your Toolset**
-  - Write additional `@mcp.tool()` functions for Pi peripherals (GPIO pins, camera, I²C sensors, etc.)
-  - Combine multiple MCP servers (e.g. filesystem, web-scraper, vector-store memory) for richer context
+  - Write additional `@mcp.tool()` functions for Pi peripherals (such as GPIO pins, camera, and I²C sensors).
+  - Combine multiple MCP servers (for example, filesystem, web-scraper, and vector-store memory) for richer context.
 
 - **Integrate with IoT Platforms**
-  - Hook into Home Assistant or Node-RED via MCP
-  - Trigger real-world actions (turn on LEDs, read environmental sensors, control relays)
+  - Hook into Home Assistant or Node-RED through MCP.
+  - Trigger real-world actions (for example, turn on LEDs, read environmental sensors, and control relays).
+
+### Section summary
+You’ve now built and run an AI agent on your development machine that connects to an MCP server on your Raspberry Pi 5. Your agent can now interact with real-world data sources in real time — a complete edge-to-cloud loop powered by OpenAI’s Agent SDK and the MCP protocol.
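Editor's note: step 5 of this page writes `OPENAI_API_KEY` into a `.env` file, which `python-dotenv` loads before the agent runs. As a rough stdlib-only stand-in for what that load amounts to (simplified: real `python-dotenv` also handles quoting, comments, and variable interpolation; the key value below is a placeholder, not a real key):

```python
import os

def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines like the .env file created in step 5."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blanks, comments, and lines without an assignment.
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# The agent needs OPENAI_API_KEY in its environment before the SDK is used;
# load_dotenv() effectively does the equivalent of the two lines below.
parsed = parse_env("OPENAI_API_KEY=sk-example-not-a-real-key")
os.environ.update(parsed)
```

Keeping the key in `.env` (and out of `main.py` and version control) is the point of this step.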

content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md

Lines changed: 26 additions & 21 deletions
@@ -1,62 +1,62 @@
 ---
-title: Set Up an MCP Server on Your Raspberry Pi
+title: Set up an MCP server on Raspberry Pi 5
 weight: 3
 
 ### FIXED, DO NOT MODIFY
 layout: learningpathall
 ---
 
-## Setup an MCP Server on Raspberry Pi 5
+## Set up a FastMCP server on Raspberry Pi 5 with uv and ngrok
 
 In this section you will learn how to:
 
-1. Install uv (the Rust-powered Python package manager)
-2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data
-3. Expose the MCP server to the internet with **ngrok**
+1. Install uv (the Rust-powered Python package manager).
+2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data.
+3. Expose the local MCP server to the internet using ngrok (HTTPS tunneling service).
 
-You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit)
+You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit).
 
 #### 1. Install uv
-On Raspberry Pi Terminal, install `uv`:
+In your Raspberry Pi Terminal, install `uv`:
 ```bash
 curl -LsSf https://astral.sh/uv/install.sh | sh
 ```
 
-**uv** is a next-generation, Rust-based package manager that unifies pip, virtualenv, Poetry, and more—offering 10×–100× faster installs, built-in virtual environment handling, robust lockfiles, and full compatibility with the Python ecosystem.
+`uv` is a Rust-based, next-generation Python package manager that replaces tools like `pip`, `virtualenv`, and Poetry. It delivers 10×–100× faster installs along with built-in virtual environments, lockfile support, and full Python ecosystem compatibility.
 
 {{% notice Note %}}
 After the script finishes, restart your terminal so that the uv command is on your PATH.
 {{% /notice %}}
 
 #### 2. Bootstrap the MCP Project
-1. Create a project directory and enter it:
+1. Create a project directory and navigate to it:
 ```bash
 mkdir mcp
 cd mcp
 ```
-2. Initialize with `uv`:
+2. Initialize `uv`:
 ```bash
 uv init
 ```
 This command adds:
-- .venv/ (auto-created virtual environment)
-- pyproject.toml (project metadata & dependencies)
-- .python-version (pinned interpreter)
+- .venv/ (auto-created virtual environment).
+- pyproject.toml (project metadata and dependencies).
+- .python-version (pinned interpreter).
 - README.md, .gitignore, and a sample main.py
 
-3. Install the dependencies:
+3. Install the dependencies (learn more about [FastMCP](https://github.com/jlowin/fastmcp)):
+
 ```bash
 uv pip install fastmcp==2.2.10
 uv add requests
 ```
 
 #### 3. Build your MCP Server
-1. Create a python file for your MCP server named `server.py`:
+1. Create a Python file for your MCP server named `server.py`:
 ```bash
 touch server.py
 ```
-2. Use a file editor of your choice and copy the following content into `server.py`:
+2. Open server.py in your preferred text editor and paste in the following code:
 ```bash
 import subprocess, re
 from mcp.server.fastmcp import FastMCP
@@ -95,12 +95,12 @@ if __name__ == "__main__":
 
 #### 4. Run the MCP Server
 
-Run the python script to deploy the MCP server:
+Run the Python script to deploy the MCP server:
 
 ```python
 uv run server.py
 ```
-By default, FastMCP will listen on port 8000 and serve your tools via Server-Sent Events (SSE).
+By default, FastMCP listens on port 8000 and exposes your registered tools over HTTP using Server-Sent Events (SSE).
 
 The output should look like:
 
@@ -111,7 +111,7 @@ INFO: Application startup complete.
 INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
 ```
 
-#### 5. Install & Configure ngrok
+#### 5. Install and configure ngrok
 
 You will now use ngrok to expose your locally running MCP server to the public internet over HTTPS.
 
@@ -136,4 +136,9 @@ Replace `YOUR_NGROK_AUTHTOKEN` with your token from the ngrok dashboard.
 ```bash
 ngrok http 8000
 ```
-4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`)—you’ll use this as your MCP endpoint.
+4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`). You’ll use this endpoint to connect external tools or agents to your MCP server. Keep this URL available for the next steps in your workflow.
+
+### Section summary
+
+You now have a working FastMCP server on your Raspberry Pi 5. It includes tools for reading CPU temperature and retrieving weather data, and it's accessible over the internet via a public HTTPS endpoint using ngrok. This sets the stage for integration with LLM agents or other external tools.
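Editor's note: the diff truncates `server.py` after its first two lines (`import subprocess, re` and the FastMCP import), but those imports plus the agent's "48.8°C" output suggest the CPU-temperature tool shells out to `vcgencmd measure_temp` and parses its `temp=48.8'C` output with a regex. A hedged sketch of that helper (function names and exact approach are this editor's guess, not the committed code):

```python
import re
import subprocess

def parse_cpu_temp(raw: str) -> float:
    """Extract degrees Celsius from vcgencmd output such as "temp=48.8'C"."""
    match = re.search(r"temp=([\d.]+)", raw)
    if match is None:
        raise ValueError(f"unexpected vcgencmd output: {raw!r}")
    return float(match.group(1))

def read_cpu_temp() -> float:
    """Run vcgencmd and return the CPU temperature (works on a Raspberry Pi only)."""
    raw = subprocess.run(
        ["vcgencmd", "measure_temp"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_cpu_temp(raw)

# In server.py, a function like read_cpu_temp() would be decorated with
# @mcp.tool() so the FastMCP server advertises it to connected agents.
```

Separating the parsing from the subprocess call keeps the tool easy to unit-test off-device.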
