
Commit 2cc3ef8

Merge branch 'release/unit3' into add-unit3-hfprbot
2 parents 015579e + 1e6a195 commit 2cc3ef8

File tree

9 files changed: +535 -662 lines changed

units/en/_toctree.yml

Lines changed: 3 additions & 1 deletion
@@ -25,6 +25,8 @@
     title: MCP Clients
   - local: unit1/gradio-mcp
     title: Gradio MCP Integration
+  - local: unit1/unit1-recap
+    title: Unit 1 Recap
   - local: unit1/certificate
     title: Get your certificate!
 
@@ -39,7 +41,7 @@
   - local: unit2/gradio-client
     title: Building an MCP Client with Gradio
   - local: unit2/tiny-agents
-    title: Building a Tiny Agent with TypeScript
+    title: Building Tiny Agents with MCP and the Hugging Face Hub
 
 - title: "3.1. Use Case: Build a Pull Request Agent on the Hub"
   sections:

units/en/unit1/mcp-clients.mdx

Lines changed: 97 additions & 82 deletions
@@ -12,6 +12,14 @@ In this section, you will:
 * Discover how to use Hugging Face's MCP Client implementation
 * See practical examples of MCP Client usage
 
+<Tip>
+
+On this page we show examples of how to set up MCP Clients in a few different ways using JSON notation. For now, we use placeholder paths like `path/to/server.py` to represent the path to an MCP Server.
+
+Focus on understanding the MCP Client notation here; we'll implement the real MCP Servers in the next unit.
+
+</Tip>
+
 ## Understanding MCP Clients
 
 MCP Clients are crucial components that act as the bridge between AI applications (Hosts) and external capabilities provided by MCP Servers. Think of the Host as your main application (like an AI assistant or IDE) and the Client as a specialized module within that Host responsible for handling MCP communications.
@@ -52,6 +60,8 @@ Fortunately, the configuration files are very simple, easy to understand, and co
 
 The standard configuration file for MCP is named `mcp.json`. Here's the basic structure:
 
+This `mcp.json` structure can be passed to applications such as Claude Desktop, Cursor, or VS Code.
+
 ```json
 {
   "servers": [
@@ -80,7 +90,7 @@ For local servers using stdio transport, the configuration includes the command
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/file_explorer_server.py"]
+        "args": ["/path/to/file_explorer_server.py"] // This is an example; we'll use a real server in the next unit
       }
     }
   ]
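To make the stdio entry above concrete, here is a minimal sketch of what a client does with such a configuration. It assumes the official `mcp` Python SDK is installed (`pip install mcp`); the server path is still the placeholder from the config, not a real server.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirrors the stdio entry above; the path is still a placeholder.
server_params = StdioServerParameters(
    command="python",
    args=["/path/to/file_explorer_server.py"],
)


async def main() -> None:
    # Launch the server as a subprocess and speak MCP over stdin/stdout.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # protocol handshake
            tools = await session.list_tools()  # dynamic discovery
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(main())
```

Applications like Claude Desktop or Cursor perform essentially these steps for every server listed in `mcp.json`.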
@@ -162,7 +172,7 @@ The corresponding configuration in `mcp.json` would look like this:
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/github_server.py"],
+        "args": ["/path/to/github_server.py"], // This is an example; we'll use a real server in the next unit
         "env": {
           "GITHUB_TOKEN": "your_github_token"
         }
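The `env` block maps directly onto the client side. A small sketch, assuming the `mcp` SDK's `StdioServerParameters` and a token exported in the parent shell (the path is still a placeholder):

```python
import os

from mcp import StdioServerParameters

# The "env" block above becomes the env= argument on the client side,
# so the token is read from the parent environment instead of being hard-coded.
server_params = StdioServerParameters(
    command="python",
    args=["/path/to/github_server.py"],  # placeholder path, as in the config
    env={"GITHUB_TOKEN": os.environ.get("GITHUB_TOKEN", "")},
)
```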
@@ -188,7 +198,7 @@ In this scenario, we have a local server that is a Python script which could be
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/file_explorer_server.py"]
+        "args": ["/path/to/file_explorer_server.py"] // This is an example; we'll use a real server in the next unit
       }
     }
   ]
@@ -206,7 +216,7 @@ In this scenario, we have a remote server that is a weather API.
       "name": "Weather API",
       "transport": {
         "type": "sse",
-        "url": "https://example.com/mcp-server"
+        "url": "https://example.com/mcp-server" // This is an example; we'll use a real server in the next unit
       }
     }
   ]
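For the SSE variant, a hedged sketch of the client side, again assuming the official `mcp` Python SDK (its `mcp.client.sse` helper) and reusing the same placeholder URL:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Placeholder URL taken from the configuration above.
    async with sse_client("https://example.com/mcp-server") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```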
@@ -217,120 +227,125 @@ Proper configuration is essential for successfully deploying MCP integrations. B
 
 In the next section, we'll explore the ecosystem of MCP servers available on Hugging Face Hub and how to publish your own servers there.
 
-## Code Clients
+## Tiny Agents Clients
 
-You can also use the MCP Client in within code so that the tools are available to the LLM. Let's explore some examples in `smolagents`.
+Now, let's explore how to use MCP Clients within code.
 
-First, let's explore our weather server from the previous page. In `smolagents`, we can use the `ToolCollection` class to automatically discover and register tools from an MCP server. This is done by passing the `StdioServerParameters` or `SSEServerParameters` to the `ToolCollection.from_mcp` method. We can then print the tools to the console.
+You can also use tiny agents as MCP Clients to connect directly to MCP servers from your code. Tiny agents provide a simple way to create AI agents that can use tools from MCP servers.
 
-```python
-from smolagents import ToolCollection
-from mcp.client.stdio import StdioServerParameters
+Tiny Agents can run MCP servers from a command line environment. To do this, we will need to install `npm` and run servers with `npx`. **We'll need these for both Python and JavaScript.**
 
-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
+Let's install `npx` with `npm`. If you don't have `npm` installed, check out the [npm documentation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
 
-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tools:
-    print("\n".join(f"{tool.name}: {tool.description}" for tool in tools.tools))
+### Setup
 
+First, we will need to install `npx` if you don't have it installed. You can do this with the following command:
+
+```bash
+# install npx
+npm install -g npx
 ```
 
-<details>
-<summary>
-Output
-</summary>
+Then, we will need to install the `huggingface_hub` package with MCP support. This will allow us to run MCP servers and clients.
+
+```bash
+pip install "huggingface_hub[mcp]>=0.32.0"
+```
 
-```sh
-get_weather: Get the current weather for a specified location.
+Then, we will need to log in to the Hugging Face Hub to access the MCP servers. You can do this with the `huggingface-cli` command line tool. You will need a [login token](https://huggingface.co/docs/huggingface_hub/v0.32.3/en/quick-start#authentication) to do this.
 
+```bash
+huggingface-cli login
 ```
 
-</details>
+<hfoptions id="language">
+<hfoption id="python">
 
-We can also connect to an MCP server that is hosted on a remote machine. In this case, we need to pass the `SSEServerParameters` to the `MCPClient` class.
+### Connecting to MCP Servers
 
-```python
-from smolagents.mcp_client import MCPClient
+Now, let's create an agent configuration file `agent.json`.
 
-with MCPClient(
-    {"url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"}
-) as tools:
-    # Tools from the remote server are available
-    print("\n".join(f"{t.name}: {t.description}" for t in tools))
+```json
+{
+  "model": "Qwen/Qwen2.5-72B-Instruct",
+  "provider": "nebius",
+  "servers": [
+    {
+      "type": "stdio",
+      "config": {
+        "command": "npx",
+        "args": ["@playwright/mcp@latest"]
+      }
+    }
+  ]
+}
 ```
 
-<details>
-<summary>
-Output
-</summary>
+In this configuration, we are using the `@playwright/mcp` MCP server. This is an MCP server that can control a browser with Playwright.
+
+Now you can run the agent:
 
-```sh
-prime_factors: Compute the prime factorization of a positive integer.
-generate_cheetah_image: Generate a cheetah image.
-image_orientation: Returns whether image is portrait or landscape.
-sepia: Apply a sepia filter to the input image.
+```bash
+tiny-agents run agent.json
 ```
+</hfoption>
+<hfoption id="javascript">
 
-</details>
+First, install the tiny agents package with [npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
 
-Now, let's see how we can use the MCP Client in a code agent.
+```bash
+npm install @huggingface/tiny-agents
+```
 
-```python
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from mcp.client.stdio import StdioServerParameters
+### Connecting to MCP Servers
 
-model = InferenceClientModel()
+Make an agent project directory and create an `agent.json` file.
 
-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
+```bash
+mkdir my-agent
+touch my-agent/agent.json
+```
 
-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], model=model)
-    agent.run("What's the weather in Tokyo?")
+Create an agent configuration file at `my-agent/agent.json`:
 
+```json
+{
+  "model": "Qwen/Qwen2.5-72B-Instruct",
+  "provider": "nebius",
+  "servers": [
+    {
+      "type": "stdio",
+      "config": {
+        "command": "npx",
+        "args": ["@playwright/mcp@latest"]
+      }
+    }
+  ]
+}
 ```
 
-<details>
-<summary>
-Output
-</summary>
+Now you can run the agent:
 
-```sh
-The weather in Tokyo is sunny with a temperature of 20 degrees Celsius.
+```bash
+npx @huggingface/tiny-agents run ./my-agent
 ```
 
-</details>
+</hfoption>
+</hfoptions>
 
-We can also connect to an MCP package. Here's an example of connecting to the `pubmedmcp` package.
+In the video below, we run the agent and ask it to open a new tab in the browser.
 
-```python
-import os
-from smolagents import ToolCollection, CodeAgent
-from mcp import StdioServerParameters
-
-server_parameters = StdioServerParameters(
-    command="uv",
-    args=["--quiet", "pubmedmcp@0.1.3"],
-    env={"UV_PYTHON": "3.12", **os.environ},
-)
-
-with ToolCollection.from_mcp(server_parameters, trust_remote_code=True) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], add_base_tools=True)
-    agent.run("Please find a remedy for hangover.")
-```
+The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider. It comes equipped with a Playwright MCP server, which lets it use a web browser! The agent config is loaded by specifying [its path in the `tiny-agents/tiny-agents`](https://huggingface.co/datasets/tiny-agents/tiny-agents/tree/main/celinah/web-browser) Hugging Face dataset.
 
-<details>
-<summary>
-Output
-</summary>
+<video controls autoplay loop>
+<source src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/python-tiny-agents/web_browser_agent.mp4" type="video/mp4">
+</video>
 
-```sh
-The remedy for hangover is to drink water.
-```
+When you run the agent, you'll see it load, listing the tools it has discovered from its connected MCP servers. Then, it's ready for your prompts!
+
+Prompt used in this demo:
 
-</details>
+> do a Web Search for HF inference providers on Brave Search and open the first result and then give me the list of the inference providers supported on Hugging Face
 
 ## Next Steps
 
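Conceptually, when the tiny agent starts it connects to the server declared in `agent.json` and lists its tools. A minimal sketch of that first step, using the official `mcp` Python SDK directly rather than the tiny-agents runtime, and the same `npx @playwright/mcp@latest` command (requires `npx` on your PATH):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# The same server agent.json points at: Playwright MCP, started via npx.
server_params = StdioServerParameters(
    command="npx",
    args=["@playwright/mcp@latest"],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Roughly the tool list the agent prints when it starts up.
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

if __name__ == "__main__":
    asyncio.run(main())
```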

units/en/unit1/sdk.mdx

Lines changed: 3 additions & 3 deletions
@@ -27,19 +27,19 @@ from mcp.server.fastmcp import FastMCP
 # Create an MCP server
 mcp = FastMCP("Weather Service")
 
-
+# Tool implementation
 @mcp.tool()
 def get_weather(location: str) -> str:
     """Get the current weather for a specified location."""
     return f"Weather in {location}: Sunny, 72°F"
 
-
+# Resource implementation
 @mcp.resource("weather://{location}")
 def weather_resource(location: str) -> str:
     """Provide weather data as a resource."""
     return f"Weather data for {location}: Sunny, 72°F"
 
-
+# Prompt implementation
 @mcp.prompt()
 def weather_report(location: str) -> str:
     """Create a weather report prompt."""

units/en/unit1/unit1-recap.mdx

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
+# Unit 1 recap
+
+## Model Context Protocol (MCP)
+
+MCP is a standardized protocol designed to connect AI models with external tools, data sources, and environments. It addresses the limitations of existing AI systems by enabling interoperability and access to real-time information.
+
+## Key Concepts
+
+### Client-Server Architecture
+MCP follows a client-server model where clients manage communication between users and servers. This architecture promotes modularity, allowing for easy addition of new servers without requiring changes to existing hosts.
+
+### Components
+#### Host
+The user-facing AI application that serves as the interface for end-users.
+
+#### Client
+A component within the host application responsible for managing communication with a specific MCP server. Clients maintain 1:1 connections with servers and handle protocol-level details.
+
+#### Server
+An external program or service that provides access to tools, data sources, or services via the MCP protocol. Servers act as lightweight wrappers around existing functionalities.
+
+### Capabilities
+#### Tools
+Executable functions that can perform actions (e.g., sending messages, querying APIs). Tools are typically model-controlled and require user approval due to their ability to perform actions with side effects.
+
+#### Resources
+Read-only data sources for context retrieval without significant computation. Resources are application-controlled and designed for data retrieval, similar to GET endpoints in REST APIs.
+
+#### Prompts
+Pre-defined templates or workflows that guide interactions between users, AI models, and available capabilities. Prompts are user-controlled and set the context for interactions.
+
+#### Sampling
+Server-initiated requests for LLM processing, enabling server-driven agentic behaviors and potentially recursive or multi-step interactions. Sampling operations typically require user approval.
+
+### Communication Protocol
+MCP uses JSON-RPC 2.0 as the message format for communication between clients and servers. Two primary transport mechanisms are supported: stdio (for local communication) and HTTP+SSE (for remote communication). Messages include requests, responses, and notifications.
+
+### Discovery Process
+MCP allows clients to dynamically discover available tools, resources, and prompts through list methods (e.g., `tools/list`). This dynamic discovery mechanism enables clients to adapt to the specific capabilities each server offers without requiring hardcoded knowledge of server functionality.
+
+### MCP SDKs
+Official SDKs are available in various programming languages for implementing MCP clients and servers. These SDKs handle protocol-level communication, capability registration, and error handling, simplifying the development process.
+
+### Gradio Integration
+Gradio allows easy creation of web interfaces that expose capabilities to the MCP protocol, making it accessible for both humans and AI models. This integration provides a human-friendly interface alongside AI-accessible tools with minimal code.
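To make the Gradio point concrete, a minimal sketch assuming a recent Gradio release with MCP support enabled via `launch(mcp_server=True)`; the `letter_counter` function here is only an illustrative example:

```python
import gradio as gr

def letter_counter(word: str, letter: str) -> int:
    """Count how many times `letter` appears in `word`."""
    return word.lower().count(letter.lower())

demo = gr.Interface(
    fn=letter_counter,
    inputs=["text", "text"],
    outputs="number",
    title="Letter Counter",
)

if __name__ == "__main__":
    # mcp_server=True exposes the function as an MCP tool alongside the web UI.
    demo.launch(mcp_server=True)
```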
