
Commit 45465e1

Merge pull request #74 from huggingface/make-mcp-client-usage-consistent
Make-mcp-client-usage-consistent
2 parents 1eef4d6 + bc9b0e7 commit 45465e1

File tree

3 files changed: +125 −153 lines changed


units/en/unit1/mcp-clients.mdx

Lines changed: 88 additions & 91 deletions
@@ -12,6 +12,14 @@ In this section, you will:
 * Discover how to use Hugging Face's MCP Client implementation
 * See practical examples of MCP Client usage
 
+<Tip>
+
+On this page we're going to show examples of how to set up MCP Clients in a few different ways using the JSON notation. For now, we will use *examples* like `path/to/server.py` to represent the path to the MCP Server. In the next unit, we'll implement this with real MCP Servers.
+
+For now, focus on understanding the MCP Client notation; we'll implement the MCP Servers themselves in the next unit.
+
+</Tip>
+
 ## Understanding MCP Clients
 
 MCP Clients are crucial components that act as the bridge between AI applications (Hosts) and external capabilities provided by MCP Servers. Think of the Host as your main application (like an AI assistant or IDE) and the Client as a specialized module within that Host responsible for handling MCP communications.
@@ -52,6 +60,8 @@ Fortunately, the configuration files are very simple, easy to understand, and co
 
 The standard configuration file for MCP is named `mcp.json`. Here's the basic structure:
 
+This is the basic structure of the `mcp.json` file, which can be passed to applications like Claude Desktop, Cursor, or VS Code.
+
 ```json
 {
   "servers": [
@@ -114,14 +124,6 @@ Environment variables can be passed to server processes using the `env` field. H
 <hfoptions id="env-variables">
 <hfoption id="python">
 
-Let's first install the packages we need to run the examples.
-
-```bash
-pip install "mcp[cli]" "smolagents[mcp]" "fastmcp[cli]"
-# or if you are using uv
-uv add "mcp[cli]" "smolagents[mcp]" "fastmcp[cli]"
-```
-
 In Python, we use the `os` module to access environment variables:
 
 ```python
@@ -170,7 +172,7 @@ The corresponding configuration in `mcp.json` would look like this:
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/github_server.py"],
+        "args": ["/path/to/github_server.py"], // This is an example, we'll use a real server in the next unit
         "env": {
           "GITHUB_TOKEN": "your_github_token"
         }
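
As a quick illustration of the pattern above, here is a minimal sketch of how a server script such as the hypothetical `github_server.py` could read the token that the client injects through the `env` field. The variable handling shown is an assumption for illustration, not part of this commit:

```python
import os

# The MCP Host/Client launches the server process and injects GITHUB_TOKEN
# through the "env" field, so the server reads it like any environment variable.
github_token = os.environ.get("GITHUB_TOKEN")

if github_token is None:
    raise RuntimeError("GITHUB_TOKEN is required but was not provided by the client")

# ...the rest of the (hypothetical) server would use github_token to call the GitHub API.
```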
@@ -196,7 +198,7 @@ In this scenario, we have a local server that is a Python script which could be
       "transport": {
         "type": "stdio",
         "command": "python",
-        "args": ["/path/to/file_explorer_server.py"]
+        "args": ["/path/to/file_explorer_server.py"] // This is an example, we'll use a real server in the next unit
       }
     }
   ]
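
For readers who prefer to see the client side in code rather than JSON, here is a minimal sketch using the official `mcp` Python SDK to launch the same kind of local stdio server and list its tools. The server path is the placeholder used above, and the flow is an illustration rather than part of this commit:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder path from the configuration above; a real server comes in the next unit.
server_params = StdioServerParameters(
    command="python",
    args=["/path/to/file_explorer_server.py"],
)

async def main():
    # Spawn the server as a subprocess and communicate with it over stdio.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())
```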
@@ -214,7 +216,7 @@ In this scenario, we have a remote server that is a weather API.
       "name": "Weather API",
       "transport": {
         "type": "sse",
-        "url": "https://example.com/mcp-server"
+        "url": "https://example.com/mcp-server" // This is an example, we'll use a real server in the next unit
       }
     }
   ]
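
The SSE case looks similar from the client side. The sketch below, again using the `mcp` Python SDK and the placeholder URL from the config above, connects to a remote server and lists its tools; treat it as an illustration rather than part of this commit:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Placeholder URL from the configuration above.
    async with sse_client("https://example.com/mcp-server") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```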
@@ -225,122 +227,117 @@ Proper configuration is essential for successfully deploying MCP integrations. B
 
 In the next section, we'll explore the ecosystem of MCP servers available on Hugging Face Hub and how to publish your own servers there.
 
-## Code Clients
-
-You can also use the MCP Client within code so that the tools are available to the LLM. Let's explore some examples in `smolagents`. To run these examples you will need to add `mcp[cli]`, `smolagents[toolkit]`, and `smolagents[mcp]` to your `uv` virtual environment.
+## Tiny Agents Clients
 
-First, let's explore our weather server from the previous page. In `smolagents`, we can use the `ToolCollection` class to automatically discover and register tools from an MCP server. This is done by passing the `StdioServerParameters` or `SSEServerParameters` to the `ToolCollection.from_mcp` method. We can then print the tools to the console.
+Now, let's explore how to use MCP Clients within code.
 
-```python
-from smolagents import ToolCollection
-from mcp.client.stdio import StdioServerParameters
+You can also use tiny agents as MCP Clients to connect directly to MCP servers from your code. Tiny agents provide a simple way to create AI agents that can use tools from MCP servers.
 
-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
+Tiny Agents can run MCP servers in a command-line environment. To do this, we will need to install `npm` and run the server with `npx`. **We'll need these for both Python and JavaScript.**
 
-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tools:
-    print("\n".join(f"{tool.name}: {tool.description}" for tool in tools.tools))
+Let's install `npx` with `npm`. If you don't have `npm` installed, check out the [npm documentation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
 
+```bash
+# install npx
+npm install -g npx
 ```
 
-<details>
-<summary>
-Output
-</summary>
+<hfoptions id="language">
+<hfoption id="python">
 
-```sh
-get_weather: Get the current weather for a specified location.
+First, install the tiny agents package:
 
+```bash
+pip install "huggingface_hub[mcp]>=0.32.0"
 ```
 
-</details>
+Next, let's log in to the Hugging Face Hub. You will need a [login token](https://huggingface.co/docs/huggingface_hub/v0.32.3/en/quick-start#authentication) to do this.
 
-We can also connect to an MCP server that is hosted on a remote machine. In this case, we need to pass the `SSEServerParameters` to the `MCPClient` class.
-
-```python
-from smolagents.mcp_client import MCPClient
-
-with MCPClient(
-    {"url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"}
-) as tools:
-    # Tools from the remote server are available
-    print("\n".join(f"{t.name}: {t.description}" for t in tools))
+```bash
+huggingface-cli login
 ```
 
-<details>
-<summary>
-Output
-</summary>
+Now, let's create an agent configuration file `agent.json`.
 
-```sh
-prime_factors: Compute the prime factorization of a positive integer.
-generate_cheetah_image: Generate a cheetah image.
-image_orientation: Returns whether image is portrait or landscape.
-sepia: Apply a sepia filter to the input image.
+```json
+{
+    "model": "Qwen/Qwen2.5-72B-Instruct",
+    "provider": "nebius",
+    "servers": [
+        {
+            "type": "stdio",
+            "config": {
+                "command": "npx",
+                "args": ["@playwright/mcp@latest"]
+            }
+        }
+    ]
+}
 ```
 
-</details>
+In this configuration, we are using the `@playwright/mcp` MCP server. This is an MCP server that can control a browser with Playwright.
 
-Now, let's see how we can use the MCP Client in a code agent.
+Now you can run the agent:
 
-```python
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from mcp.client.stdio import StdioServerParameters
+```bash
+tiny-agents run agent.json
+```
 
-model = InferenceClientModel()
+<hfoption id="javascript">
 
-server_parameters = StdioServerParameters(command="uv", args=["run", "server.py"])
+First, install the tiny agents package with [npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
 
-with ToolCollection.from_mcp(
-    server_parameters, trust_remote_code=True
-) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], model=model)
-    agent.run("What's the weather in Tokyo?")
+```bash
+npm install @huggingface/tiny-agents
+```
 
+Make an agent project directory and create an `agent.json` file.
+
+```bash
+mkdir my-agent
+touch my-agent/agent.json
 ```
 
-<details>
-<summary>
-Output
-</summary>
+Create an agent configuration file at `my-agent/agent.json`:
 
-```sh
-The weather in Tokyo is sunny with a temperature of 20 degrees Celsius.
+```json
+{
+    "model": "Qwen/Qwen2.5-72B-Instruct",
+    "provider": "nebius",
+    "servers": [
+        {
+            "type": "stdio",
+            "config": {
+                "command": "npx",
+                "args": ["@playwright/mcp@latest"]
+            }
+        }
+    ]
+}
 ```
 
-</details>
+Now you can run the agent:
 
-We can also connect to an MCP package. Here's an example of connecting to the `pubmedmcp` package.
+```bash
+npx @huggingface/tiny-agents run ./my-agent
+```
 
-```python
-import os
-from smolagents import ToolCollection, CodeAgent, InferenceClientModel
-from mcp.client.stdio import StdioServerParameters
+</hfoption>
+</hfoptions>
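
Since `agent.json` is plain JSON, one quick way to sanity-check a config like the ones above before running it is to load it with the Python standard library. The checks below are only an informal illustration based on the fields used in these examples (`model`, `provider`, `servers`); they are not part of the tiny-agents CLI:

```python
import json
from pathlib import Path

# Use "agent.json" for the Python flow or "my-agent/agent.json" for the JavaScript flow.
config = json.loads(Path("my-agent/agent.json").read_text())

# Informal checks against the fields used in the examples above.
assert "model" in config and "provider" in config, "expected 'model' and 'provider' keys"
for server in config.get("servers", []):
    assert server.get("type") == "stdio", "these examples only use stdio servers"
    launch = server["config"]
    print("server command:", launch["command"], *launch.get("args", []))
```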
 
-model = InferenceClientModel()
+In the video below, we run the agent and ask it to open a new tab in the browser.
 
-server_parameters = StdioServerParameters(
-    command="uvx",
-    args=["--quiet", "pubmedmcp@0.1.3"],
-    env={"UV_PYTHON": "3.12", **os.environ},
-)
+The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider, and it comes equipped with a Playwright MCP server, which lets it use a web browser! The agent config is loaded by specifying [its path in the `tiny-agents/tiny-agents`](https://huggingface.co/datasets/tiny-agents/tiny-agents/tree/main/celinah/web-browser) Hugging Face dataset.
 
-with ToolCollection.from_mcp(server_parameters, trust_remote_code=True) as tool_collection:
-    agent = CodeAgent(tools=[*tool_collection.tools], add_base_tools=True, model=model)
-    agent.run("Please find a remedy for hangover.")
-```
+<video controls autoplay loop>
+<source src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/python-tiny-agents/web_browser_agent.mp4" type="video/mp4">
+</video>
 
-<details>
-<summary>
-Output
-</summary>
+When you run the agent, you'll see it load, listing the tools it has discovered from its connected MCP servers. Then, it's ready for your prompts!
 
-```sh
-The remedy for hangover is to drink water.
-```
+Prompt used in this demo:
 
-</details>
+> do a Web Search for HF inference providers on Brave Search and open the first result and then give me the list of the inference providers supported on Hugging Face
 
 ## Next Steps
 

units/en/unit2/gradio-client.mdx

Lines changed: 17 additions & 13 deletions
@@ -15,7 +15,7 @@ We'll connect to the MCP server we created in the previous section and use it to
 First, we need to install the `smolagents`, gradio and mcp-client libraries, if we haven't already:
 
 ```bash
-pip install "smolagents[mcp]" "gradio[mcp]" mcp
+pip install "smolagents[mcp]" "gradio[mcp]" mcp fastmcp
 ```
 
 Now, we can import the necessary libraries and create a simple Gradio interface that uses the MCP Client to connect to the MCP Server.
@@ -24,16 +24,15 @@ Now, we can import the necessary libraries and create a simple Gradio interface
 ```python
 import gradio as gr
 
-from mcp.client.stdio import StdioServerParameters
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from smolagents.mcp_client import MCPClient
+from mcp import StdioServerParameters
+from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient
 ```
 
 Next, we'll connect to the MCP Server and get the tools that we can use to answer questions.
 
 ```python
 mcp_client = MCPClient(
-    {"url": "http://localhost:7860/gradio_api/mcp/sse"}
+    {"url": "http://localhost:7860/gradio_api/mcp/sse"} # This is the MCP Server we created in the previous section
 )
 tools = mcp_client.get_tools()
 ```
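
Before the complete example in the next hunk, here is a rough sketch of how those tools are typically wired into an agent and a chat UI using the imports above. The exact layout in the course file may differ, and the `disconnect()` cleanup call is an assumption about the smolagents `MCPClient` API rather than something shown in this diff:

```python
import gradio as gr
from smolagents import CodeAgent, InferenceClientModel, MCPClient

mcp_client = MCPClient(
    {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # server from the previous section
)
try:
    tools = mcp_client.get_tools()

    # Hand the MCP tools to a code agent and expose it through a simple chat UI.
    agent = CodeAgent(tools=[*tools], model=InferenceClientModel())

    def respond(message, history):
        return str(agent.run(message))

    gr.ChatInterface(fn=respond, title="MCP Client demo").launch()
finally:
    mcp_client.disconnect()  # assumed cleanup call; see the complete example below
```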
@@ -77,16 +76,13 @@ Here's the complete example of the MCP Client in Gradio:
 ```python
 import gradio as gr
 
-from mcp.client.stdio import StdioServerParameters
-from smolagents import InferenceClientModel, CodeAgent, ToolCollection
-from smolagents.mcp_client import MCPClient
+from mcp import StdioServerParameters
+from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient
 
 
 try:
     mcp_client = MCPClient(
-        ## Try this working example on the hub:
-        # {"url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"}
-        {"url": "http://localhost:7860/gradio_api/mcp/sse"}
+        {"url": "http://localhost:7860/gradio_api/mcp/sse"} # This is the MCP Server we created in the previous section
     )
     tools = mcp_client.get_tools()
 
@@ -119,13 +115,21 @@ To deploy your Gradio MCP client to Hugging Face Spaces:
 - Choose "Gradio" as the SDK
 - Name your space (e.g., "mcp-client")
 
-2. Create a `requirements.txt` file:
+2. Update the MCP Server URL in the code:
+
+```python
+mcp_client = MCPClient(
+    {"url": "http://localhost:7860/gradio_api/mcp/sse"} # This is the MCP Server we created in the previous section
+)
+```
+
+3. Create a `requirements.txt` file:
 ```txt
 gradio[mcp]
 smolagents[mcp]
 ```
 
-3. Push your code to the Space:
+4. Push your code to the Space:
 ```bash
 git init
 git add server.py requirements.txt
