Commit 74fee9c

Merge remote-tracking branch 'origin/main' into pr/60
2 parents 64067c9 + 45465e1

File tree

4 files changed: +128 −148 lines


units/en/unit1/mcp-clients.mdx

Lines changed: 88 additions & 83 deletions
@@ -12,6 +12,14 @@ In this section, you will:
 * Discover how to use Hugging Face's MCP Client implementation
 * See practical examples of MCP Client usage
+
+<Tip>
+
+On this page, we'll show examples of how to set up MCP Clients in a few different ways using the JSON notation. For now, we'll use *examples* like `path/to/server.py` to represent the path to the MCP Server. In the next unit, we'll implement this with real MCP Servers.
+
+For now, focus on understanding the MCP Client notation. We'll implement the MCP Servers in the next unit.
+
+</Tip>
+
 ## Understanding MCP Clients

 MCP Clients are crucial components that act as the bridge between AI applications (Hosts) and external capabilities provided by MCP Servers. Think of the Host as your main application (like an AI assistant or IDE) and the Client as a specialized module within that Host responsible for handling MCP communications.
@@ -52,6 +60,8 @@ Fortunately, the configuration files are very simple, easy to understand, and co

 The standard configuration file for MCP is named `mcp.json`. Here's the basic structure:

+This basic structure of `mcp.json` can be passed to applications like Claude Desktop, Cursor, or VS Code.
+
 ```json
 {
   "servers": [
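For reference while reading this hunk, a complete `mcp.json` following the structure above might look like the sketch below. The server name and script path are placeholders, as the page itself stresses:

```json
{
  "servers": [
    {
      "name": "File Explorer",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["path/to/server.py"]
      }
    }
  ]
}
```

Each entry in `servers` names a server and describes the transport used to reach it; the later hunks in this file show `stdio` and `sse` variants of the same shape.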
@@ -162,7 +172,7 @@ The corresponding configuration in `mcp.json` would look like this:
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/github_server.py"],
+        "args": ["/path/to/github_server.py"], // This is an example; we'll use a real server in the next unit
         "env": {
           "GITHUB_TOKEN": "your_github_token"
         }
@@ -188,7 +198,7 @@ In this scenario, we have a local server that is a Python script which could be
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/file_explorer_server.py"]
+        "args": ["/path/to/file_explorer_server.py"] // This is an example; we'll use a real server in the next unit
       }
     }
   ]
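The `stdio` transport configured above means the Host launches the server as a local subprocess and exchanges JSON-RPC messages over its stdin/stdout. The stdlib-only sketch below illustrates just that message flow with a hypothetical one-shot "server"; a real MCP client library additionally handles initialization, framing, and notifications:

```python
import json
import subprocess
import sys

# A stand-in "server": reads one JSON-RPC request from stdin and
# answers it on stdout. (Hypothetical; not a real MCP server.)
server_code = (
    "import sys, json\n"
    "req = json.loads(sys.stdin.readline())\n"
    "resp = {'jsonrpc': '2.0', 'id': req['id'],\n"
    "        'result': {'tools': [{'name': 'read_file'}]}}\n"
    "print(json.dumps(resp))\n"
)

# The client side: spawn the server process (like "command": "python"
# in the config above) and send one request over stdio.
proc = subprocess.Popen(
    [sys.executable, "-c", server_code],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
out, _ = proc.communicate(json.dumps(request) + "\n")

response = json.loads(out)
print(response["result"]["tools"][0]["name"])
```

This is only the plumbing: in practice, the MCP Client inside the Host does this for you based on the `mcp.json` entry.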
@@ -206,7 +216,7 @@ In this scenario, we have a remote server that is a weather API.
       "name": "Weather API",
       "transport": {
         "type": "sse",
-        "url": "https://example.com/mcp-server"
+        "url": "https://example.com/mcp-server" // This is an example; we'll use a real server in the next unit
       }
     }
   ]
@@ -217,122 +227,117 @@ Proper configuration is essential for successfully deploying MCP integrations. B

 In the next section, we'll explore the ecosystem of MCP servers available on Hugging Face Hub and how to publish your own servers there.

-## Code Clients
+## Tiny Agents Clients

-You can also use the MCP Client within code so that the tools are available to the LLM. Let's explore some examples in `smolagents`. To run these examples you will need to add `mcp[cli]`, `smolagents[toolkit]`, and `smolagents[mcp]` to your `uv` virtual environment.
+Now, let's explore how to use MCP Clients within code.

-First, let's explore our weather server from the previous page. In `smolagents`, we can use the `ToolCollection` class to automatically discover and register tools from an MCP server. This is done by passing the `StdioServerParameters` or `SSEServerParameters` to the `ToolCollection.from_mcp` method. We can then print the tools to the console.
+You can also use tiny agents as MCP Clients to connect directly to MCP servers from your code. Tiny agents provide a simple way to create AI agents that can use tools from MCP servers.

-```python
-from smolagents import ToolCollection
-from mcp.client.stdio import StdioServerParameters
-
-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
+Tiny Agents can run MCP servers from a command-line environment. To do this, we need to install `npm` and run the server with `npx`. **We'll need these for both Python and JavaScript.**

-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tools:
-    print("\n".join(f"{tool.name}: {tool.description}" for tool in tools.tools))
+Let's install `npx` with `npm`. If you don't have `npm` installed, check out the [npm documentation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).

+```bash
+# install npx
+npm install -g npx
 ```

-<details>
-<summary>
-Output
-</summary>
+<hfoptions id="language">
+<hfoption id="python">

-```sh
-get_weather: Get the current weather for a specified location.
+First, install the tiny agents package:

+```bash
+pip install "huggingface_hub[mcp]>=0.32.0"
 ```

-</details>
-
-We can also connect to an MCP server that is hosted on a remote machine. In this case, we need to pass the `SSEServerParameters` to the `MCPClient` class.
+Next, let's login to the Hugging Face Hub. You will need a [login token](https://huggingface.co/docs/huggingface_hub/v0.32.3/en/quick-start#authentication) to do this.

-```python
-from smolagents.mcp_client import MCPClient
-
-with MCPClient(
-    {"url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"}
-) as tools:
-    # Tools from the remote server are available
-    print("\n".join(f"{t.name}: {t.description}" for t in tools))
+```bash
+huggingface-cli login
 ```

-<details>
-<summary>
-Output
-</summary>
+Now, let's create an agent configuration file, `agent.json`.

-```sh
-prime_factors: Compute the prime factorization of a positive integer.
-generate_cheetah_image: Generate a cheetah image.
-image_orientation: Returns whether image is portrait or landscape.
-sepia: Apply a sepia filter to the input image.
+```json
+{
+    "model": "Qwen/Qwen2.5-72B-Instruct",
+    "provider": "nebius",
+    "servers": [
+        {
+            "type": "stdio",
+            "config": {
+                "command": "npx",
+                "args": ["@playwright/mcp@latest"]
+            }
+        }
+    ]
+}
 ```

-</details>
+In this configuration, we are using the `@playwright/mcp` MCP server, which can control a browser with Playwright.

-Now, let's see how we can use the MCP Client in a code agent.
+Now you can run the agent:

-```python
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from mcp.client.stdio import StdioServerParameters
+```bash
+tiny-agents run agent.json
+```

-model = InferenceClientModel()
+</hfoption>
+<hfoption id="javascript">

-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
+First, install the tiny agents package with [npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).

-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], model=model)
-    agent.run("What's the weather in Tokyo?")
+```bash
+npm install @huggingface/tiny-agents
+```
+
+Make an agent project directory and create an `agent.json` file:

+```bash
+mkdir my-agent
+touch my-agent/agent.json
 ```

-<details>
-<summary>
-Output
-</summary>
+Create the agent configuration at `my-agent/agent.json`:

-```sh
-The weather in Tokyo is sunny with a temperature of 20 degrees Celsius.
+```json
+{
+    "model": "Qwen/Qwen2.5-72B-Instruct",
+    "provider": "nebius",
+    "servers": [
+        {
+            "type": "stdio",
+            "config": {
+                "command": "npx",
+                "args": ["@playwright/mcp@latest"]
+            }
+        }
+    ]
+}
 ```

-</details>
+Now you can run the agent:

-We can also connect to an MCP package. Here's an example of connecting to the `pubmedmcp` package.
+```bash
+npx @huggingface/tiny-agents run ./my-agent
+```

-```python
-import os
-from smolagents import ToolCollection, CodeAgent, InferenceClientModel
-from mcp.client.stdio import StdioServerParameters
+</hfoption>
+</hfoptions>

-model = InferenceClientModel()
+In the video below, we run the agent and ask it to open a new tab in the browser.

-server_parameters = StdioServerParameters(
-    command="uvx",
-    args=["--quiet", "[email protected]"],
-    env={"UV_PYTHON": "3.12", **os.environ},
-)
+The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider. It comes equipped with a Playwright MCP server, which lets it use a web browser! The agent config is loaded by specifying [its path in the `tiny-agents/tiny-agents`](https://huggingface.co/datasets/tiny-agents/tiny-agents/tree/main/celinah/web-browser) Hugging Face dataset.

-with ToolCollection.from_mcp(server_parameters, trust_remote_code=True) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], add_base_tools=True, model=model)
-    agent.run("Please find a remedy for hangover.")
-```
+<video controls autoplay loop>
+  <source src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/python-tiny-agents/web_browser_agent.mp4" type="video/mp4">
+</video>

-<details>
-<summary>
-Output
-</summary>
+When you run the agent, you'll see it load, listing the tools it has discovered from its connected MCP servers. Then, it's ready for your prompts!

-```sh
-The remedy for hangover is to drink water.
-```
+Prompt used in this demo:

-</details>
+> do a Web Search for HF inference providers on Brave Search and open the first result and then give me the list of the inference providers supported on Hugging Face
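Since `agent.json` is plain JSON, it is easy to sanity-check before running `tiny-agents`. The sketch below is an illustrative, stdlib-only check of the fields used in the configs above (it is not the real tiny-agents loader, and real configs may use server types other than `stdio`):

```python
import json

# The agent.json from this page, inlined as a string for illustration.
agent_json = """
{
    "model": "Qwen/Qwen2.5-72B-Instruct",
    "provider": "nebius",
    "servers": [
        {
            "type": "stdio",
            "config": {
                "command": "npx",
                "args": ["@playwright/mcp@latest"]
            }
        }
    ]
}
"""

def check_agent_config(cfg: dict) -> list:
    """Collect obvious problems; an empty list means the basic shape looks fine."""
    problems = [f"missing key: {k}" for k in ("model", "servers") if k not in cfg]
    for i, server in enumerate(cfg.get("servers", [])):
        # stdio servers are launched as subprocesses, so they need a command.
        if server.get("type") == "stdio" and "command" not in server.get("config", {}):
            problems.append(f"server {i}: stdio servers need a command")
    return problems

config = json.loads(agent_json)
problems = check_agent_config(config)
print(problems)
```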
 ## Next Steps

units/en/unit1/sdk.mdx

Lines changed: 3 additions & 3 deletions
@@ -27,19 +27,19 @@ from mcp.server.fastmcp import FastMCP
 # Create an MCP server
 mcp = FastMCP("Weather Service")

-
+# Tool implementation
 @mcp.tool()
 def get_weather(location: str) -> str:
     """Get the current weather for a specified location."""
     return f"Weather in {location}: Sunny, 72°F"

-
+# Resource implementation
 @mcp.resource("weather://{location}")
 def weather_resource(location: str) -> str:
     """Provide weather data as a resource."""
     return f"Weather data for {location}: Sunny, 72°F"

-
+# Prompt implementation
 @mcp.prompt()
 def weather_report(location: str) -> str:
     """Create a weather report prompt."""

units/en/unit2/gradio-client.mdx

Lines changed: 17 additions & 13 deletions
@@ -15,7 +15,7 @@ We'll connect to the MCP server we created in the previous section and use it to
 First, we need to install the `smolagents`, `gradio`, and `mcp` libraries, if we haven't already:

 ```bash
-pip install "smolagents[mcp]" "gradio[mcp]" mcp
+pip install "smolagents[mcp]" "gradio[mcp]" mcp fastmcp
 ```

 Now, we can import the necessary libraries and create a simple Gradio interface that uses the MCP Client to connect to the MCP Server.
@@ -24,16 +24,15 @@ Now, we can import the necessary libraries and create a simple Gradio interface
 ```python
 import gradio as gr

-from mcp.client.stdio import StdioServerParameters
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from smolagents.mcp_client import MCPClient
+from mcp import StdioServerParameters
+from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient
 ```

 Next, we'll connect to the MCP Server and get the tools that we can use to answer questions.

 ```python
 mcp_client = MCPClient(
-    {"url": "http://localhost:7860/gradio_api/mcp/sse"}
+    {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # This is the MCP Server we created in the previous section
 )
 tools = mcp_client.get_tools()
 ```
@@ -77,16 +76,13 @@ Here's the complete example of the MCP Client in Gradio:
 ```python
 import gradio as gr

-from mcp.client.stdio import StdioServerParameters
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from smolagents.mcp_client import MCPClient
+from mcp import StdioServerParameters
+from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient


 try:
     mcp_client = MCPClient(
-        ## Try this working example on the hub:
-        # {"url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"}
-        {"url": "http://localhost:7860/gradio_api/mcp/sse"}
+        {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # This is the MCP Server we created in the previous section
     )
     tools = mcp_client.get_tools()

@@ -119,13 +115,21 @@ To deploy your Gradio MCP client to Hugging Face Spaces:
    - Choose "Gradio" as the SDK
    - Name your space (e.g., "mcp-client")

-2. Create a `requirements.txt` file:
+2. Update the MCP Server URL in the code:
+
+```python
+mcp_client = MCPClient(
+    {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # This is the MCP Server we created in the previous section
+)
+```
+
+3. Create a `requirements.txt` file:
 ```txt
 gradio[mcp]
 smolagents[mcp]
 ```

-3. Push your code to the Space:
+4. Push your code to the Space:
 ```bash
 git init
 git add server.py requirements.txt
