content/learning-paths/cross-platform/mcp-ai-agent/_index.md (11 additions, 12 deletions)

@@ -1,24 +1,23 @@
---
- title: Deploy an MCP server on a Raspberry Pi 5 and interact with it using an AI agent
+ title: Deploy an MCP Server on Raspberry Pi 5 for AI Agent Interaction using OpenAI SDK

- draft: true
- cascade:
-   draft: true

minutes_to_complete: 30

- who_is_this_for: This Learning Path targets LLM and IoT developers who are familiar with Large Language Model (LLM) concepts and networking. You will learn how to deploy a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5 and interact with it via the OpenAI-Agent SDK.
+ who_is_this_for: This Learning Path is for LLM and IoT developers who want to run and interact with AI agents on edge devices like the Raspberry Pi 5. You'll learn how to deploy a lightweight Model Context Protocol (MCP) server and use the OpenAI Agent SDK to create and register tools for intelligent local inference.

learning_objectives:
- - Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5
- - Design and register custom tools for the AI Agent
- - Create custom endpoints
- - Learn about uv — a fast, efficient Python package manager
+ - Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5 for local AI agent execution.
+ - Use the OpenAI Agent SDK to interact with a local AI agent.
+ - Design and register custom tools for agent tasks.
+ - Learn about uv, a fast, efficient Python package manager for local deployment.

prerequisites:
- - A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/)
- - Basic understanding of Python and prompt engineering.
- - Understanding of LLM and AI Agent fundamentals
+ - A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/) with a Linux-based OS installed.
+ - Familiarity with Python programming and prompt engineering techniques.
+ - Basic understanding of Large Language Models (LLMs) and how they are used in local inference.
+ - Understanding of AI agents and the OpenAI Agent SDK (or similar frameworks).

- title: Introduction to Model Context Protocol and uv
+ title: Introduction to Model Context Protocol (MCP) and the Python uv package for local AI agents
weight: 2

### FIXED, DO NOT MODIFY
@@ -8,27 +8,50 @@ layout: learningpathall
## Model Context Protocol (MCP)

- The **Model Context Protocol (MCP)** is an open specification for wiring Large-Language-Model (LLM) agents to the *context* they need — whether that context is a database, a local sensor, or a SaaS API.
- Think of it as USB-C for AI: once a tool or data source speaks MCP, any compliant LLM client can “plug in” and start using it immediately.
+ The Model Context Protocol (MCP) is an open specification designed to connect Large Language Model (LLM) agents to the context they need, including local sensors, databases, and SaaS APIs. It enables on-device AI agents to interact with real-world data through a plug-and-play protocol that works with any LLM framework, including the OpenAI Agent SDK.

### Why use MCP?

- - **Plug-and-play integrations:** A growing catalog of pre-built MCP servers (filesystem, shell, vector stores, web-scraping, etc.) gives your agent instant super-powers with zero custom glue code.
+ - **Plug-and-play integrations:** a growing catalog of pre-built MCP servers (such as filesystem, shell, vector stores, and web-scraping) gives your agent instant superpowers, with no custom integration or glue code required.

- - **Model/vendor agnostic:** Because the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.
+ - **Model/vendor agnostic:** as the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.

- - **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data never leaves the perimeter unless you choose.
+ - **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data stays within your perimeter unless explicitly shared.

- - **Cross-ecosystem momentum:** Recent roll-outs—from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support—show the MCP spec is gathering real-world traction.
+ - **Cross-ecosystem momentum:** recent roll-outs, from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support, show the MCP spec is gathering real-world traction.

- ### High-level architecture
- 
- - **MCP Host:** the LLM-powered application (Claude Desktop, an IDE plugin, OpenAI Agents SDK, etc.).
- - **MCP Client:** the runtime shim that keeps a 1-to-1 connection with each server.
- - **MCP Server:** a lightweight process that advertises tools (functions) over MCP.
- - **Local data sources:** files, databases, or sensors your server can read directly.
- - **Remote services:** external APIs the server can call on the host’s behalf.
+ ## What is uv?

- {{% notice Note %}}
- Learn more about AI Agents in the [AI Agent on CPU learning path](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
+ `uv` is a fast, Rust-built Python package manager that simplifies dependency management. It's designed for speed and reliability, making it ideal for setting up local AI agent environments on constrained or embedded devices like the Raspberry Pi 5.
+
+ Some key features:
+ - Built in Rust for performance.
+ - Resolves dependencies and installs packages in one step.
+ - Optimized for local LLM workloads, embedded AI systems, and containerized Python environments.
+
+ For further information on `uv`, see: [https://github.com/astral-sh/uv](https://github.com/astral-sh/uv).
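
To see what that one-step workflow looks like in practice, here is a typical `uv` session (an illustrative sketch only; the project and package names are examples, not the ones used later in this Learning Path):

```bash
# Create a new project with a pyproject.toml and a managed virtual environment
uv init demo-agent
cd demo-agent

# Resolve and install a dependency in one step (the lockfile is updated automatically)
uv add requests

# Run a command inside the project's environment without activating it manually
uv run python -c "import requests; print(requests.__version__)"
```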

+ ### A high-level view of the architecture
+
+ 
+
+ *Figure: High-level view of the architecture of the Model Context Protocol (MCP) for local AI agent integration with real-world data sources.*
+
+ Each component in the diagram plays a distinct role in enabling AI agents to interact with real-world context:
+
+ - The **MCP Host** is the LLM-powered application (such as Claude Desktop, an IDE plugin, or an application built with the OpenAI Agents SDK).
+ - The **MCP Client** is the runtime shim that keeps a 1-to-1 connection with each server.
+ - The **MCP Server** is a lightweight process that advertises tools (functions) over MCP.
+ - The **Local data sources** are files, databases, or sensors your server can read directly.
+ - The **Remote services** are external APIs the server can call on the host’s behalf.
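
To make "advertises tools" concrete, the sketch below shows an MCP client connecting to a server over SSE and listing the tools it exposes. This is an illustration only: it assumes the official `mcp` Python SDK (`ClientSession`, `sse_client`), and the URL is a placeholder for wherever your server is running.

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Placeholder endpoint: point this at your MCP server's SSE URL
    async with sse_client("http://localhost:8000/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()           # MCP handshake between client and server
            tools = await session.list_tools()   # ask the server which tools it advertises
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


asyncio.run(main())
```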

+ {{% notice Learning Tip %}}
+ Learn more about AI Agents in the Learning Path [Deploy an AI Agent on Arm with llama.cpp and llama-cpp-agent using KleidiAI](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
{{% /notice %}}

+ ### Section summary
+
+ This page introduces MCP and `uv` as foundational tools for building fast, secure, and modular AI agents that run efficiently on edge devices like the Raspberry Pi 5.

content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md (25 additions, 17 deletions)

@@ -1,14 +1,14 @@
---
- title: Build & Run an AI Agent on your development machine
+ title: Build and run an AI agent on your development machine
weight: 4

### FIXED, DO NOT MODIFY
layout: learningpathall
---

- In this section you will learn how to setup an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.
+ In this section, you'll learn how to set up an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.

- These commands were tested on an Linux Arm development machine.
+ These commands were tested on a Linux Arm development machine.

### Create an AI Agent and point it at your Pi's MCP Server

1. Install `uv` on your development machine:
@@ -20,22 +20,23 @@ curl -LsSf https://astral.sh/uv/install.sh | sh
+ - Write additional `@mcp.tool()` functions for Pi peripherals (such as GPIO pins, camera, and I²C sensors); see the sketch after this list.
+ - Combine multiple MCP servers (for example, filesystem, web-scraper, and vector-store memory) for richer context.

- **Integrate with IoT Platforms**
- - Hook into Home Assistant or Node-RED via MCP
- - Trigger real-world actions (turn on LEDs, read environmental sensors, control relays)
+ - Hook into Home Assistant or Node-RED through MCP.
+ - Trigger real-world actions (for example, turn on LEDs, read environmental sensors, and control relays).
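
As an illustration of the first item above, a GPIO tool could look like the sketch below. It is a hypothetical example, not part of this Learning Path: it assumes the `gpiozero` library, an LED wired to GPIO 17, and FastMCP from the official `mcp` Python SDK.

```python
from gpiozero import LED                   # Raspberry Pi GPIO helper library
from mcp.server.fastmcp import FastMCP     # FastMCP from the official MCP Python SDK

mcp = FastMCP("pi-peripherals")            # hypothetical server name
status_led = LED(17)                       # assumes an LED wired to GPIO 17

@mcp.tool()
def set_status_led(on: bool) -> str:
    """Turn the status LED on or off and report its new state."""
    if on:
        status_led.on()
    else:
        status_led.off()
    return f"LED is now {'on' if on else 'off'}"

if __name__ == "__main__":
    mcp.run(transport="sse")               # expose the tool over SSE, like the main server
```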

+ ### Section summary
+
+ You’ve now built and run an AI agent on your development machine that connects to an MCP server on your Raspberry Pi 5. Your agent can now interact with real-world data sources in real time — a complete edge-to-cloud loop powered by OpenAI’s Agent SDK and the MCP protocol.
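
To make that loop concrete, here is a minimal sketch of what such an agent script can look like. Treat it as an illustration under assumptions: it assumes the `openai-agents` package exposes `Agent`, `Runner`, and `MCPServerSse` roughly as shown, the ngrok URL is a placeholder, and the exact code used in this Learning Path may differ; check the OpenAI Agents SDK documentation for the current API.

```python
import asyncio

from agents import Agent, Runner          # OpenAI Agents SDK (assumed import path)
from agents.mcp import MCPServerSse       # MCP-over-SSE wrapper (assumed class name)


async def main() -> None:
    # Placeholder: the public ngrok URL of the MCP server running on your Pi
    async with MCPServerSse(params={"url": "https://abcd1234.ngrok-free.app/sse"}) as pi_server:
        agent = Agent(
            name="Pi assistant",
            instructions="Answer questions using the tools exposed by the Raspberry Pi MCP server.",
            mcp_servers=[pi_server],
        )
        result = await Runner.run(agent, "What is the CPU temperature of the Pi right now?")
        print(result.final_output)


asyncio.run(main())
```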

content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md (26 additions, 21 deletions)

@@ -1,62 +1,62 @@
---
- title: Set Up an MCP Server on Your Raspberry Pi
+ title: Set up an MCP server on Raspberry Pi 5
weight: 3

### FIXED, DO NOT MODIFY
layout: learningpathall
---

- ## Setup an MCP Server on Raspberry Pi 5
+ ## Set up a FastMCP server on Raspberry Pi 5 with uv and ngrok

In this section you will learn how to:

- 1. Install uv (the Rust-powered Python package manager)
- 2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data
- 3. Expose the MCP server to the internet with **ngrok**
+ 1. Install uv (the Rust-powered Python package manager).
+ 2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and retrieves weather data (see the sketch after this list).
+ 3. Expose the local MCP server to the internet using ngrok (HTTPS tunneling service).
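
The CPU-temperature half of step 2 can be as small as the sketch below. This is an illustration only: it assumes FastMCP from the official `mcp` Python SDK and the standard Raspberry Pi OS thermal sysfs path; the actual server file built in this Learning Path (and its weather tool) may differ.

```python
from mcp.server.fastmcp import FastMCP    # FastMCP from the official MCP Python SDK

mcp = FastMCP("raspberry-pi")             # hypothetical server name

@mcp.tool()
def get_cpu_temperature() -> float:
    """Return the Pi's CPU temperature in degrees Celsius."""
    # Raspberry Pi OS reports the SoC temperature in millidegrees Celsius here
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0

if __name__ == "__main__":
    mcp.run(transport="sse")              # serve the tool over SSE (port 8000 by default)
```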

- You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit)
+ You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit).

#### 1. Install uv

- On Raspberry Pi Terminal, install `uv`:
+ In your Raspberry Pi Terminal, install `uv`:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

- **uv** is a next-generation, Rust-based package manager that unifies pip, virtualenv, Poetry, and more—offering 10×–100× faster installs, built-in virtual environment handling, robust lockfiles, and full compatibility with the Python ecosystem.
+ `uv` is a Rust-based, next-generation Python package manager that replaces tools like `pip`, `virtualenv`, and Poetry. It delivers 10×–100× faster installs along with built-in virtual environments, lockfile support, and full Python ecosystem compatibility.

{{% notice Note %}}
After the script finishes, restart your terminal so that the uv command is on your PATH.

```
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
```

- #### 5. Install & Configure ngrok
+ #### 5. Install and configure ngrok

You will now use ngrok to expose your locally running MCP server to the public internet over HTTPS.
@@ -136,4 +136,9 @@ Replace `YOUR_NGROK_AUTHTOKEN` with your token from the ngrok dashboard.

```bash
ngrok http 8000
```

- 4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`)—you’ll use this as your MCP endpoint.
+ 4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`). You’ll use this endpoint to connect external tools or agents to your MCP server. Keep this URL available for the next steps in your workflow.

+ ### Section summary
+
+ You now have a working FastMCP server on your Raspberry Pi 5. It includes tools for reading CPU temperature and retrieving weather data, and it's accessible over the internet via a public HTTPS endpoint using ngrok. This sets the stage for integration with LLM agents or other external tools.