
Commit a780fda

Merge pull request #1991 from ArmDeveloperEcosystem/main
production update
2 parents 67145ce + c8b69f9 commit a780fda

File tree: 9 files changed (+223 additions, −84 deletions)


content/learning-paths/cross-platform/mcp-ai-agent/_index.md

Lines changed: 12 additions & 15 deletions
@@ -1,35 +1,32 @@
 ---
-title: Deploy an MCP server on a Raspberry Pi 5 and interact with it using an AI agent
+title: Deploy an MCP Server on Raspberry Pi 5 for AI Agent Interaction using OpenAI SDK

-draft: true
-cascade:
-  draft: true
-
 minutes_to_complete: 30

-who_is_this_for: This Learning Path targets LLM and IoT developers who are familiar with Large Language Model (LLM) concepts and networking. You will learn how to deploy a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5 and interact with it via the OpenAI-Agent SDK.
+who_is_this_for: This Learning Path is for LLM and IoT developers who want to run and interact with AI agents on edge devices like the Raspberry Pi 5. You'll learn how to deploy a lightweight Model Context Protocol (MCP) server and use the OpenAI Agent SDK to create and register tools for intelligent local inference.

 learning_objectives:
-- Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5
-- Design and register custom tools for the AI Agent
-- Create custom endpoints
-- Learn about uv — a fast, efficient Python package manager
+- Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5 for local AI agent execution.
+- Use the OpenAI Agent SDK to interact with a local AI agent.
+- Design and register custom tools for the agent tasks.
+- Learn about uv — a fast, efficient Python package manager for efficient local deployment.

 prerequisites:
-- A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/)
-- Basic understanding of Python and prompt engineering.
-- Understanding of LLM and AI Agent fundamentals
+- A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/) with a Linux-based OS installed.
+- Familiarity with Python programming and prompt engineering techniques.
+- Basic understanding of Large Language Models (LLMs) and how they are used in local inference.
+- Understanding of AI agents and the OpenAI Agent SDK (or similar frameworks).

 author: Andrew Choi

 skilllevels: Introductory
 subjects: ML
 armips:
-- Cortex-A76
+- Cortex-A
 tools_software_languages:
 - Python
-- IoT
 - AI
+- Raspberry Pi
 - MCP

 operatingsystems:
Lines changed: 39 additions & 16 deletions
@@ -1,5 +1,5 @@
 ---
-title: Introduction to Model Context Protocol and uv
+title: Introduction to Model Context Protocol (MCP) and Python uv package for local AI agents
 weight: 2

 ### FIXED, DO NOT MODIFY
@@ -8,27 +8,50 @@ layout: learningpathall

 ## Model Context Protocol (MCP)

-The **Model Context Protocol (MCP)** is an open specification for wiring Large-Language-Model (LLM) agents to the *context* they need — whether that context is a database, a local sensor, or a SaaS API.
-Think of it as USB-C for AI: once a tool or data source speaks MCP, any compliant LLM client can “plug in” and start using it immediately.
+The Model Context Protocol (MCP) is an open specification designed to connect Large Language Model (LLM) agents to the context they need — including local sensors, databases, and SaaS APIs. It enables on-device AI agents to interact with real-world data through a plug-and-play protocol that works with any LLM framework, including the OpenAI Agent SDK.

 ### Why use MCP?
-- **Plug-and-play integrations:** A growing catalog of pre-built MCP servers (filesystem, shell, vector stores, web-scraping, etc.) gives your agent instant super-powers with zero custom glue code.
+- **Plug-and-play integrations:** a growing catalog of pre-built MCP servers (such as filesystem, shell, vector stores, and web-scraping) gives your agent instant superpowers - no custom integration or glue code required.

-- **Model/vendor agnostic:** Because the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.
+- **Model/vendor agnostic:** as the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.

-- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data never leaves the perimeter unless you choose.
+- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data stays within your infrastructure unless explicitly shared.

-- **Cross-ecosystem momentum:** Recent roll-outs, from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support, show the MCP spec is gathering real-world traction.
+- **Cross-ecosystem momentum:** recent roll-outs from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support show the MCP spec is gathering real-world traction.

-### High-level architecture
-![mcp server](./mcp.png)
-- **MCP Host:** the LLM-powered application (Claude Desktop, an IDE plugin, OpenAI Agents SDK, etc.).
-- **MCP Client:** the runtime shim that keeps a 1-to-1 connection with each server.
-- **MCP Server:** a lightweight process that advertises tools (functions) over MCP.
-- **Local data sources:** files, databases, or sensors your server can read directly.
-- **Remote services:** external APIs the server can call on the host’s behalf.
+## What is uv?

-{{% notice Note %}}
-Learn more about AI Agents in the [AI Agent on CPU learning path](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
+`uv` is a fast, Rust-built Python package manager that simplifies dependency management. It's designed for speed and reliability, making it ideal for setting up local AI agent environments on constrained or embedded devices like the Raspberry Pi 5.
+
+Some key features:
+- Built in Rust for performance.
+- Resolves dependencies and installs packages in one step.
+- Optimized for local LLM workloads, embedded AI systems, and containerized Python environments.
+
+For further information on `uv`, see: [https://github.com/astral-sh/uv](https://github.com/astral-sh/uv).
+
+## A high-level view of the architecture
+
+![Diagram of Model Context Protocol (MCP) architecture showing the interaction between MCP Host (LLM-powered app), MCP Client (runtime shim), and MCP Server, which connects to local data sources (files, sensors, databases) and remote APIs for AI agent context retrieval.](./mcp.png)
+
+*Figure: High-level view of the architecture of the Model Context Protocol (MCP) for local AI agent integration with real-world data sources.*
+
+Each component in the diagram plays a distinct role in enabling AI agents to interact with real-world context:
+
+- The **MCP Host** is the LLM-powered application (such as Claude Desktop, an IDE plugin, or an application built with the OpenAI Agents SDK).
+- The **MCP Client** is the runtime shim that keeps a 1-to-1 connection with each server.
+- The **MCP Server** is a lightweight process that advertises tools (functions) over MCP.
+- The **Local data sources** are files, databases, or sensors your server can read directly.
+- The **Remote services** are external APIs the server can call on the host’s behalf.
+
+{{% notice Learning Tip %}}
+Learn more about AI Agents in the Learning Path [Deploy an AI Agent on Arm with llama.cpp and llama-cpp-agent using KleidiAI](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
 {{% /notice %}}

+## Section summary
+
+This page introduces MCP and `uv` as foundational tools for building fast, secure, and modular AI agents that run efficiently on edge devices like the Raspberry Pi 5.
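MCP messages travel as JSON-RPC 2.0 under the hood. As a rough illustration of what "advertising tools" means in practice, here is a hedged sketch of a `tools/list` response for a CPU-temperature tool like the one built later in this commit's Learning Path. The tool name and schema fields are illustrative assumptions, not copied from the MCP specification.

```python
import json

# Hedged sketch: MCP is built on JSON-RPC 2.0. A server advertising a single
# tool might answer a "tools/list" request with a payload shaped roughly like
# this. The tool name and input schema below are illustrative assumptions.
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_cpu_temperature",
                "description": "Read the CPU temperature of the Raspberry Pi 5",
                "inputSchema": {"type": "object", "properties": {}},
            }
        ]
    },
}

# Serialize the response as it would travel over the transport.
wire = json.dumps(tools_list_response)
print(wire)
```

Any compliant MCP client can read this advertisement and expose the tool to its LLM without custom glue code, which is the "USB-C for AI" idea in concrete form.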

content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md

Lines changed: 32 additions & 21 deletions
@@ -1,16 +1,17 @@
 ---
-title: Build & Run an AI Agent on your development machine
+title: Build and run an AI agent on your development machine
 weight: 4

 ### FIXED, DO NOT MODIFY
 layout: learningpathall
 ---

-In this section you will learn how to setup an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.
+In this section, you'll learn how to set up an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.

-These commands were tested on an Linux Arm development machine.
+These commands were tested on a Linux Arm development machine.
+
+## Create an AI Agent and point it at your Pi's MCP Server

-### Create an AI Agent and point it at your Pi's MCP Server
 1. Install `uv` on your development machine:

 ```bash
@@ -20,27 +21,28 @@ curl -LsSf https://astral.sh/uv/install.sh | sh
 ```bash
 mkdir mcp-agent && cd mcp-agent
 ```
-3. Setup the directory to use `uv`:
+3. Set up the directory to use `uv`:
 ```bash
 uv init
 ```

 This command adds:
-- .venv/ (auto-created virtual environment)
-- pyproject.toml (project metadata & dependencies)
-- .python-version (pinned interpreter)
-- README.md, .gitignore, and a sample main.py
+- .venv/ (auto-created virtual environment).
+- pyproject.toml (project metadata and dependencies).
+- .python-version (pinned interpreter).
+- README.md, .gitignore, and a sample main.py.

-4. Install **OpenAI Agents SDK** + **dotenv**
+4. Install **OpenAI Agents SDK** + **dotenv**:
 ```bash
 uv add openai-agents python-dotenv
 ```
-5. Create a `.env` file with your OpenAI key:
+5. Create a `.env` file to securely store your OpenAI API key:
+
 ```bash
 echo -n "OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>" > .env
 ```

-### Write the Python script for the Agent Client
+## Write the Python script for the Agent Client

 Use a file editor of your choice and replace the content of the sample `main.py` with the content shown below:

@@ -87,15 +89,18 @@ if __name__ == "__main__":
 asyncio.run(main())
 ```

-### Execute the Agent
+## Execute the Agent
+
+You’re now ready to run the AI Agent and test its connection to your running MCP server on the Raspberry Pi 5.

-You are now ready to the run the agent and test it with your running MCP server:
+Run the `main.py` Python script:

-Run the `main.py` python script:
 ```bash
 uv run main.py
 ```
-The output should look like:
+
+The output should look something like this:
+
 ```output
 Running: What is the CPU temperature?
 Response: The current CPU temperature is 48.8°C.
@@ -107,11 +112,17 @@ Congratulations! Your local AI Agent just called the MCP server on your Raspberry Pi 5.

 This lightweight protocol isn’t just a game-changer for LLM developers—it also empowers IoT engineers to transform real-world data streams and give AI direct, reliable control over any connected device.

-### Next Steps
+## Next Steps
+
 - **Expand Your Toolset**
-  - Write additional `@mcp.tool()` functions for Pi peripherals (GPIO pins, camera, I²C sensors, etc.)
-  - Combine multiple MCP servers (e.g. filesystem, web-scraper, vector-store memory) for richer context
+  - Write additional `@mcp.tool()` functions for Pi peripherals (such as GPIO pins, camera, and I²C sensors).
+  - Combine multiple MCP servers (for example, filesystem, web-scraper, and vector-store memory) for richer context.

 - **Integrate with IoT Platforms**
-  - Hook into Home Assistant or Node-RED via MCP
-  - Trigger real-world actions (turn on LEDs, read environmental sensors, control relays)
+  - Hook into Home Assistant or Node-RED through MCP.
+  - Trigger real-world actions (for example, turn on LEDs, read environmental sensors, and control relays).
+
+## Section summary
+
+You’ve now built and run an AI agent on your development machine that connects to an MCP server on your Raspberry Pi 5. Your agent can now interact with real-world data sources in real time — a complete edge-to-cloud loop powered by OpenAI’s Agent SDK and the MCP protocol.
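Under the hood, when the agent decided to answer the CPU-temperature question, the MCP client issued a tool invocation to the Pi. As a hedged sketch only: a JSON-RPC `tools/call` request of roughly this shape goes over the wire (the field names and tool name here are illustrative assumptions, not a normative MCP schema).

```python
import json

# Hedged sketch: when the agent picks a tool, the MCP client sends a
# JSON-RPC "tools/call" request over the transport. The field names and the
# tool name below are illustrative assumptions, not a normative schema.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get_cpu_temperature", "arguments": {}},
}

# Serialize the request as it would travel to the MCP server on the Pi.
wire = json.dumps(call_request)
print(wire)
```

The Agents SDK and FastMCP generate and handle these frames for you; the sketch is only to make the round trip behind the console output tangible.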

content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md

Lines changed: 26 additions & 21 deletions
@@ -1,62 +1,62 @@
 ---
-title: Set Up an MCP Server on Your Raspberry Pi
+title: Set up an MCP server on Raspberry Pi 5
 weight: 3

 ### FIXED, DO NOT MODIFY
 layout: learningpathall
 ---

-## Setup an MCP Server on Raspberry Pi 5
+## Set up a FastMCP server on Raspberry Pi 5 with uv and ngrok

 In this section you will learn how to:

-1. Install uv (the Rust-powered Python package manager)
-2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data
-3. Expose the MCP server to the internet with **ngrok**
+1. Install uv (the Rust-powered Python package manager).
+2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data.
+3. Expose the local MCP server to the internet using ngrok (HTTPS tunneling service).

-You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit)
+You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit).

 #### 1. Install uv
-On Raspberry Pi Terminal, install `uv`:
+In your Raspberry Pi Terminal, install `uv`:
 ```bash
 curl -LsSf https://astral.sh/uv/install.sh | sh
 ```

-**uv** is a next-generation, Rust-based package manager that unifies pip, virtualenv, Poetry, and more—offering 10×–100× faster installs, built-in virtual environment handling, robust lockfiles, and full compatibility with the Python ecosystem.
+`uv` is a Rust-based, next-generation Python package manager that replaces tools like `pip`, `virtualenv`, and Poetry. It delivers 10×–100× faster installs along with built-in virtual environments, lockfile support, and full Python ecosystem compatibility.

 {{% notice Note %}}
 After the script finishes, restart your terminal so that the uv command is on your PATH.
 {{% /notice %}}

 #### 2. Bootstrap the MCP Project
-1. Create a project directory and enter it:
+1. Create a project directory and navigate to it:
 ```bash
 mkdir mcp
 cd mcp
 ```
-2. Initialize with `uv`:
+2. Initialize `uv`:
 ```bash
 uv init
 ```
 This command adds:
-- .venv/ (auto-created virtual environment)
-- pyproject.toml (project metadata & dependencies)
-- .python-version (pinned interpreter)
+- .venv/ (auto-created virtual environment).
+- pyproject.toml (project metadata and dependencies).
+- .python-version (pinned interpreter).
 - README.md, .gitignore, and a sample main.py

-3. Install the dependencies:
+3. Install the dependencies (learn more about [FastMCP](https://github.com/jlowin/fastmcp)):
+
 ```bash
 uv pip install fastmcp==2.2.10
 uv add requests
 ```

 #### 3. Build your MCP Server
-1. Create a python file for your MCP server named `server.py`:
+1. Create a Python file for your MCP server named `server.py`:
 ```bash
 touch server.py
 ```
-2. Use a file editor of your choice and copy the following content into `server.py`:
+2. Open server.py in your preferred text editor and paste in the following code:
 ```bash
 import subprocess, re
 from mcp.server.fastmcp import FastMCP
@@ -95,12 +95,12 @@ if __name__ == "__main__":

 #### 4. Run the MCP Server

-Run the python script to deploy the MCP server:
+Run the Python script to deploy the MCP server:

 ```python
 uv run server.py
 ```
-By default, FastMCP will listen on port 8000 and serve your tools via Server-Sent Events (SSE).
+By default, FastMCP listens on port 8000 and exposes your registered tools over HTTP using Server-Sent Events (SSE).

 The output should look like:
@@ -111,7 +111,7 @@ INFO: Application startup complete.
 INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
 ```
-#### 5. Install & Configure ngrok
+#### 5. Install and configure ngrok

 You will now use ngrok to expose your locally running MCP server to the public internet over HTTPS.
@@ -136,4 +136,9 @@ Replace `YOUR_NGROK_AUTHTOKEN` with your token from the ngrok dashboard.
 ```bash
 ngrok http 8000
 ```
-4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`)—you’ll use this as your MCP endpoint.
+4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`). You’ll use this endpoint to connect external tools or agents to your MCP server. Keep this URL available for the next steps in your workflow.
+
+## Section summary
+
+You now have a working FastMCP server on your Raspberry Pi 5. It includes tools for reading CPU temperature and retrieving weather data, and it's accessible over the internet via a public HTTPS endpoint using ngrok. This sets the stage for integration with LLM agents or other external tools.
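The CPU-temperature tool in `server.py` has to turn raw command output into a number. On Raspberry Pi OS the usual source is `vcgencmd measure_temp`, which prints a line like `temp=48.8'C`. A hedged, stand-alone sketch of that parsing step: the function name is illustrative (the real `server.py` in this commit may differ), and the command itself is not run here.

```python
import re

# Hedged sketch: parse the output format of Raspberry Pi's
# `vcgencmd measure_temp`, which prints a line like "temp=48.8'C".
# The helper name is illustrative; the committed server.py may differ.
def parse_cpu_temp(raw: str) -> float:
    """Extract the Celsius reading from vcgencmd-style output."""
    match = re.search(r"temp=([\d.]+)'C", raw)
    if match is None:
        raise ValueError(f"unexpected vcgencmd output: {raw!r}")
    return float(match.group(1))

print(parse_cpu_temp("temp=48.8'C"))
```

Inside the server, a function like this would be wrapped with `@mcp.tool()` so the agent receives a clean numeric value rather than raw shell output.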

content/learning-paths/embedded-and-microcontrollers/_index.md

Lines changed: 1 addition & 2 deletions
@@ -61,7 +61,6 @@ tools_software_languages_filter:
 - GitHub: 3
 - GitLab: 1
 - Himax SDK: 1
-- IoT: 1
 - IP Explorer: 4
 - Jupyter Notebook: 1
 - K3s: 1
@@ -80,7 +79,7 @@ tools_software_languages_filter:
 - Python: 6
 - PyTorch: 2
 - QEMU: 1
-- Raspberry Pi: 5
+- Raspberry Pi: 6
 - Remote.It: 1
 - RTX: 2
 - Runbook: 4

content/learning-paths/iot/_index.md

Lines changed: 1 addition & 2 deletions
@@ -32,11 +32,10 @@ tools_software_languages_filter:
 - Docker: 2
 - Fixed Virtual Platform: 1
 - GitHub: 3
-- IoT: 1
 - Matter: 1
 - MCP: 1
 - Python: 2
-- Raspberry Pi: 2
+- Raspberry Pi: 3
 - Remote.It: 1
 - VS Code: 1
 ---

data/stats_current_test_info.yml

Lines changed: 2 additions & 3 deletions
@@ -1,5 +1,5 @@
 summary:
-  content_total: 366
+  content_total: 369
   content_with_all_tests_passing: 0
   content_with_tests_enabled: 61
 sw_categories:
@@ -63,8 +63,7 @@ sw_categories:
     tests_and_status: []
   aws-q-cli:
     readable_title: Amazon Q Developer CLI
-    tests_and_status:
-    - ubuntu:latest: passed
+    tests_and_status: []
   azure-cli:
     readable_title: Azure CLI
     tests_and_status: []
