Commit 860758e

Merge branch 'update-abidlabs-space-in-unit2' into main
2 parents 9d587a7 + 1578e8c commit 860758e

File tree

4 files changed: +167 -148 lines changed

units/en/unit1/mcp-clients.mdx

Lines changed: 95 additions & 82 deletions
@@ -12,6 +12,14 @@ In this section, you will:
 * Discover how to use Hugging Face's MCP Client implementation
 * See practical examples of MCP Client usage
 
+<Tip>
+
+On this page we're going to show examples of how to set up MCP Clients in a few different ways using the JSON notation. For now, we will use *examples* like `path/to/server.py` to represent the path to the MCP Server; we'll implement these with real MCP Servers in the next unit. For now, focus on understanding the MCP Client notation.
+
+</Tip>
+
 ## Understanding MCP Clients
 
 MCP Clients are crucial components that act as the bridge between AI applications (Hosts) and external capabilities provided by MCP Servers. Think of the Host as your main application (like an AI assistant or IDE) and the Client as a specialized module within that Host responsible for handling MCP communications.
@@ -52,6 +60,8 @@ Fortunately, the configuration files are very simple, easy to understand, and co
 
 The standard configuration file for MCP is named `mcp.json`. Here's the basic structure:
 
+This is the basic structure of the `mcp.json` file, which can be passed to applications like Claude Desktop, Cursor, or VS Code.
+
 ```json
 {
   "servers": [
@@ -80,7 +90,7 @@ For local servers using stdio transport, the configuration includes the command
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/file_explorer_server.py"]
+        "args": ["/path/to/file_explorer_server.py"] // This is an example; we'll use a real server in the next unit
       }
     }
   ]
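To make the structure above concrete, here is a small sketch of how a Host might read such a file with the standard library. The validation logic is purely our own illustration (not part of any MCP SDK), and the config string omits the `//` comments, which plain JSON parsers do not accept:

```python
import json

# Hypothetical mcp.json content mirroring the structure above
CONFIG = """
{
  "servers": [
    {
      "name": "File Explorer",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["/path/to/file_explorer_server.py"]
      }
    }
  ]
}
"""

config = json.loads(CONFIG)
for server in config["servers"]:
    transport = server["transport"]
    # stdio servers are launched locally; sse servers are reached over HTTP
    assert transport["type"] in ("stdio", "sse"), "unknown transport type"
    print(f"{server['name']}: {transport['type']} via {transport['command']}")
```

Running this prints one line per configured server, confirming the file parses cleanly before any client tries to use it.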
@@ -162,7 +172,7 @@ The corresponding configuration in `mcp.json` would look like this:
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/github_server.py"],
+        "args": ["/path/to/github_server.py"], // This is an example; we'll use a real server in the next unit
         "env": {
           "GITHUB_TOKEN": "your_github_token"
         }
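A brief sketch of what the `env` field implies when a client launches a stdio server. The merge order shown here (configured variables layered over the host's environment) is our assumption of typical client behavior, not something prescribed by the examples above:

```python
import os

# Hypothetical server entry mirroring the GitHub example above
server = {
    "command": "python",
    "args": ["/path/to/github_server.py"],
    "env": {"GITHUB_TOKEN": "your_github_token"},
}

# A client would typically merge the configured env over its own environment
# before spawning the server, e.g. subprocess.Popen(cmd, env=child_env)
child_env = {**os.environ, **server["env"]}

cmd = [server["command"], *server["args"]]
print(cmd)
print("GITHUB_TOKEN" in child_env)
```

This is why secrets like `GITHUB_TOKEN` belong in the `env` block: they reach the server process without being baked into the command line.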
@@ -188,7 +198,7 @@ In this scenario, we have a local server that is a Python script which could be
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/file_explorer_server.py"]
+        "args": ["/path/to/file_explorer_server.py"] // This is an example; we'll use a real server in the next unit
       }
     }
   ]
@@ -206,7 +216,7 @@ In this scenario, we have a remote server that is a weather API.
       "name": "Weather API",
       "transport": {
         "type": "sse",
-        "url": "https://example.com/mcp-server"
+        "url": "https://example.com/mcp-server" // This is an example; we'll use a real server in the next unit
       }
     }
   ]
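The two transport shapes can be contrasted with a small illustrative helper. This is our own sketch of the dispatch a client performs (not a real MCP client API): stdio entries carry a command to spawn, sse entries carry a URL to connect to:

```python
def connection_target(server: dict) -> str:
    """Return a human-readable description of how to reach a server."""
    transport = server["transport"]
    if transport["type"] == "stdio":
        # Local server: spawn a subprocess and talk over stdin/stdout
        return f"spawn: {transport['command']} {' '.join(transport['args'])}"
    if transport["type"] == "sse":
        # Remote server: open an HTTP connection using Server-Sent Events
        return f"connect: {transport['url']}"
    raise ValueError(f"unknown transport type: {transport['type']}")

local = {"transport": {"type": "stdio", "command": "python",
                       "args": ["/path/to/file_explorer_server.py"]}}
remote = {"transport": {"type": "sse", "url": "https://example.com/mcp-server"}}

print(connection_target(local))   # → spawn: python /path/to/file_explorer_server.py
print(connection_target(remote))  # → connect: https://example.com/mcp-server
```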
@@ -217,122 +227,125 @@ Proper configuration is essential for successfully deploying MCP integrations. B
 
 In the next section, we'll explore the ecosystem of MCP servers available on Hugging Face Hub and how to publish your own servers there.
 
-## Code Clients
+## Tiny Agents Clients
 
-You can also use the MCP Client within code so that the tools are available to the LLM. Let's explore some examples in `smolagents`. To run these examples you will need to add `mcp[cli]`, `smolagents[toolkit]`, and `smolagents[mcp]` to your `uv` virtual environment.
+Now, let's explore how to use MCP Clients within code.
 
-First, let's explore our weather server from the previous page. In `smolagents`, we can use the `ToolCollection` class to automatically discover and register tools from an MCP server. This is done by passing the `StdioServerParameters` or `SSEServerParameters` to the `ToolCollection.from_mcp` method. We can then print the tools to the console.
+You can also use Tiny Agents as MCP Clients to connect directly to MCP servers from your code. Tiny Agents provide a simple way to create AI agents that can use tools from MCP servers.
 
-```python
-from smolagents import ToolCollection
-from mcp.client.stdio import StdioServerParameters
+Tiny Agents can run MCP servers from a command-line environment. To do this, we will need to install `npm` and run the servers with `npx`. **We'll need these for both Python and JavaScript.** If you don't have `npm` installed, check out the [npm documentation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
 
-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
+### Setup
 
-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tools:
-    print("\n".join(f"{tool.name}: {tool.description}" for tool in tools.tools))
+First, install `npx` if you don't have it already:
 
+```bash
+# install npx
+npm install -g npx
 ```
 
-<details>
-<summary>
-Output
-</summary>
+Then, install the `huggingface_hub` package with MCP support. This will allow us to run MCP servers and clients.
+
+```bash
+pip install "huggingface_hub[mcp]>=0.32.0"
+```
 
-```sh
-get_weather: Get the current weather for a specified location.
+Then, log in to the Hugging Face Hub to access the MCP servers. You can do this with the `huggingface-cli` command line tool. You will need a [login token](https://huggingface.co/docs/huggingface_hub/v0.32.3/en/quick-start#authentication) to do this.
 
+```bash
+huggingface-cli login
 ```
 
-</details>
+<hfoptions id="language">
+<hfoption id="python">
 
-We can also connect to an MCP server that is hosted on a remote machine. In this case, we need to pass the `SSEServerParameters` to the `MCPClient` class.
+### Connecting to MCP Servers
 
-```python
-from smolagents.mcp_client import MCPClient
+Now, let's create an agent configuration file, `agent.json`.
 
-with MCPClient(
-    {"url": "https://abidlabs-mcp-tools2.hf.space/gradio_api/mcp/sse"}
-) as tools:
-    # Tools from the remote server are available
-    print("\n".join(f"{t.name}: {t.description}" for t in tools))
+```json
+{
+    "model": "Qwen/Qwen2.5-72B-Instruct",
+    "provider": "nebius",
+    "servers": [
+        {
+            "type": "stdio",
+            "config": {
+                "command": "npx",
+                "args": ["@playwright/mcp@latest"]
+            }
+        }
+    ]
+}
 ```
 
-<details>
-<summary>
-Output
-</summary>
+In this configuration, we are using the `@playwright/mcp` MCP server, an MCP server that can control a browser with Playwright.
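Before handing `agent.json` to the runner, a quick sanity check can catch typos. This sketch only parses the file's structure with the standard library; the required-field checks are our assumption based on the example above, not the tool's actual schema:

```python
import json

# Hypothetical agent.json mirroring the configuration above
AGENT_JSON = """
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp@latest"]
      }
    }
  ]
}
"""

agent = json.loads(AGENT_JSON)
assert agent["model"] and agent["provider"], "model and provider must be set"
for server in agent["servers"]:
    if server["type"] == "stdio":
        # stdio servers are launched as subprocesses, so a command is required
        assert server["config"]["command"], "stdio servers need a command"
print(f"{agent['model']} via {agent['provider']} with {len(agent['servers'])} server(s)")
```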
 
-```sh
-prime_factors: Compute the prime factorization of a positive integer.
-generate_cheetah_image: Generate a cheetah image.
-image_orientation: Returns whether image is portrait or landscape.
-sepia: Apply a sepia filter to the input image.
-```
-
-</details>
+Now you can run the agent:
 
-Now, let's see how we can use the MCP Client in a code agent.
+```bash
+tiny-agents run agent.json
+```
 
-```python
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from mcp.client.stdio import StdioServerParameters
-
-model = InferenceClientModel()
-
-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
-
-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], model=model)
-    agent.run("What's the weather in Tokyo?")
-```
-
-<details>
-<summary>
-Output
-</summary>
-
-```sh
-The weather in Tokyo is sunny with a temperature of 20 degrees Celsius.
-```
-
-</details>
-
-We can also connect to an MCP package. Here's an example of connecting to the `pubmedmcp` package.
-
-```python
-import os
-from smolagents import ToolCollection, CodeAgent, InferenceClientModel
-from mcp.client.stdio import StdioServerParameters
-
-model = InferenceClientModel()
-
-server_parameters = StdioServerParameters(
-    command="uvx",
-    args=["--quiet", "[email protected]"],
-    env={"UV_PYTHON": "3.12", **os.environ},
-)
-
-with ToolCollection.from_mcp(server_parameters, trust_remote_code=True) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], add_base_tools=True, model=model)
-    agent.run("Please find a remedy for hangover.")
-```
-
-<details>
-<summary>
-Output
-</summary>
-
-```sh
-The remedy for hangover is to drink water.
-```
-
-</details>
+</hfoption>
+<hfoption id="javascript">
+
+First, install the tiny agents package with [npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
+
+```bash
+npm install @huggingface/tiny-agents
+```
+
+### Connecting to MCP Servers
+
+Make an agent project directory and create an `agent.json` file:
+
+```bash
+mkdir my-agent
+touch my-agent/agent.json
+```
+
+Create the agent configuration at `my-agent/agent.json`:
+
+```json
+{
+    "model": "Qwen/Qwen2.5-72B-Instruct",
+    "provider": "nebius",
+    "servers": [
+        {
+            "type": "stdio",
+            "config": {
+                "command": "npx",
+                "args": ["@playwright/mcp@latest"]
+            }
+        }
+    ]
+}
+```
+
+Now you can run the agent:
+
+```bash
+npx @huggingface/tiny-agents run ./my-agent
+```
+
+</hfoption>
+</hfoptions>
+
+In the video below, we run the agent and ask it to open a new tab in the browser.
+
+The example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider, equipped with a Playwright MCP server that lets it use a web browser. The agent config can also be loaded by specifying [its path in the `tiny-agents/tiny-agents`](https://huggingface.co/datasets/tiny-agents/tiny-agents/tree/main/celinah/web-browser) Hugging Face dataset.
+
+<video controls autoplay loop>
+  <source src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/python-tiny-agents/web_browser_agent.mp4" type="video/mp4">
+</video>
+
+When you run the agent, you'll see it load, listing the tools it has discovered from its connected MCP servers. Then, it's ready for your prompts!
+
+Prompt used in this demo:
+
+> do a Web Search for HF inference providers on Brave Search and open the first result and then give me the list of the inference providers supported on Hugging Face
 
 ## Next Steps

units/en/unit1/sdk.mdx

Lines changed: 3 additions & 3 deletions
@@ -27,19 +27,19 @@ from mcp.server.fastmcp import FastMCP
 # Create an MCP server
 mcp = FastMCP("Weather Service")
 
-
+# Tool implementation
 @mcp.tool()
 def get_weather(location: str) -> str:
     """Get the current weather for a specified location."""
     return f"Weather in {location}: Sunny, 72°F"
 
-
+# Resource implementation
 @mcp.resource("weather://{location}")
 def weather_resource(location: str) -> str:
     """Provide weather data as a resource."""
     return f"Weather data for {location}: Sunny, 72°F"
 
-
+# Prompt implementation
 @mcp.prompt()
 def weather_report(location: str) -> str:
     """Create a weather report prompt."""

units/en/unit2/gradio-client.mdx

Lines changed: 49 additions & 14 deletions
@@ -12,10 +12,41 @@ We'll connect to the MCP server we created in the previous section and use it to
 
 ## MCP Client in Gradio
 
-First, we need to install the `smolagents`, gradio and mcp-client libraries, if we haven't already:
+### Connect to an example MCP Server
+
+Let's connect to an example MCP Server that is already running on Hugging Face. We'll use [this one](https://huggingface.co/spaces/abidlabs/mcp-tools2) for this example. It's a Space that contains a collection of MCP tools.
+
+```python
+from smolagents.mcp_client import MCPClient
+
+with MCPClient(
+    {"url": "https://abidlabs-mcp-tools2.hf.space/gradio_api/mcp/sse"}
+) as tools:
+    # Tools from the remote server are available
+    print("\n".join(f"{t.name}: {t.description}" for t in tools))
+```
+
+<details>
+<summary>Output</summary>
+<pre>
+<code>
+prime_factors: Compute the prime factorization of a positive integer.
+generate_cheetah_image: Generate a cheetah image.
+image_orientation: Returns whether image is portrait or landscape.
+sepia: Apply a sepia filter to the input image.
+</code>
+</pre>
+</details>
+
+### Connect to your MCP Server from Gradio
+
+Great, now that you've connected to an example MCP Server, let's connect to your own MCP Server from Gradio.
+
+First, we need to install the `smolagents`, Gradio, and mcp-client libraries, if we haven't already:
 
 ```bash
-pip install "smolagents[mcp]" "gradio[mcp]" mcp
+pip install "smolagents[mcp]" "gradio[mcp]" mcp fastmcp
 ```
 
 Now, we can import the necessary libraries and create a simple Gradio interface that uses the MCP Client to connect to the MCP Server.
@@ -24,16 +55,15 @@ Now, we can import the necessary libraries and create a simple Gradio interface
 ```python
 import gradio as gr
 
-from mcp.client.stdio import StdioServerParameters
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from smolagents.mcp_client import MCPClient
+from mcp import StdioServerParameters
+from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient
 ```
 
 Next, we'll connect to the MCP Server and get the tools that we can use to answer questions.
 
 ```python
 mcp_client = MCPClient(
-    {"url": "http://localhost:7860/gradio_api/mcp/sse"}
+    {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # This is the MCP Server we created in the previous section
 )
 tools = mcp_client.get_tools()
 ```
@@ -77,16 +107,13 @@ Here's the complete example of the MCP Client in Gradio:
 ```python
 import gradio as gr
 
-from mcp.client.stdio import StdioServerParameters
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from smolagents.mcp_client import MCPClient
+from mcp import StdioServerParameters
+from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient
 
 
 try:
     mcp_client = MCPClient(
-        ## Try this working example on the hub:
-        # {"url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"}
-        {"url": "http://localhost:7860/gradio_api/mcp/sse"}
+        {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # This is the MCP Server we created in the previous section
     )
     tools = mcp_client.get_tools()
 
@@ -119,13 +146,21 @@ To deploy your Gradio MCP client to Hugging Face Spaces:
    - Choose "Gradio" as the SDK
    - Name your space (e.g., "mcp-client")
 
-2. Create a `requirements.txt` file:
+2. Update the MCP Server URL in the code:
+
+```python
+mcp_client = MCPClient(
+    {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # This is the MCP Server we created in the previous section
+)
+```
+
+3. Create a `requirements.txt` file:
 ```txt
 gradio[mcp]
 smolagents[mcp]
 ```
 
-3. Push your code to the Space:
+4. Push your code to the Space:
 ```bash
 git init
 git add server.py requirements.txt
