File: `units/en/unit1/mcp-clients.mdx`
In this section, you will:
* Discover how to use Hugging Face's MCP Client implementation
* See practical examples of MCP Client usage
<Tip>
On this page, we'll show examples of how to set up MCP Clients in a few different ways using the JSON notation. For now, we will use *examples* like `path/to/server.py` to represent the path to the MCP Server. In the next unit, we'll implement this with real MCP Servers.
For now, focus on understanding the MCP Client notation. We'll implement the MCP Servers in the next unit.
</Tip>
## Understanding MCP Clients
MCP Clients are crucial components that act as the bridge between AI applications (Hosts) and external capabilities provided by MCP Servers. Think of the Host as your main application (like an AI assistant or IDE) and the Client as a specialized module within that Host responsible for handling MCP communications.
Fortunately, the configuration files are very simple and easy to understand.
The standard configuration file for MCP is named `mcp.json`. Here's the basic structure:
This basic `mcp.json` structure can be passed to applications like Claude Desktop, Cursor, or VS Code.
```json
{
  "servers": [
    {
      "name": "Server Name",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["path/to/server.py"]
      }
    }
  ]
}
```
The corresponding configuration in `mcp.json` would look like this:

```json
{
  "servers": [
    {
      "name": "GitHub API",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["/path/to/github_server.py"], // This is an example, we'll use a real server in the next unit
        "env": {
          "GITHUB_TOKEN": "your_github_token"
        }
      }
    }
  ]
}
```
In this scenario, we have a local server that is a Python script.

```json
{
  "servers": [
    {
      "name": "File Explorer",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["/path/to/file_explorer_server.py"] // This is an example, we'll use a real server in the next unit
      }
    }
  ]
}
```
In this scenario, we have a remote server that is a weather API.

```json
{
  "servers": [
    {
      "name": "Weather API",
      "transport": {
        "type": "sse",
        "url": "https://example.com/mcp-server" // This is an example, we'll use a real server in the next unit
      }
    }
  ]
}
```
Proper configuration is essential for successfully deploying MCP integrations.
In the next section, we'll explore the ecosystem of MCP servers available on Hugging Face Hub and how to publish your own servers there.
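Before moving on, the configuration patterns above can be tied together with a short sketch using only the Python standard library. It parses a config in the `mcp.json` shape shown in this section and prints how a client would reach each server; the server names and paths are the same illustrative placeholders used above, not real servers:

```python
import json

# An mcp.json-style config using the illustrative servers from this section.
# In a real application this would be read from the actual mcp.json file.
raw = """
{
  "servers": [
    {
      "name": "File Explorer",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["/path/to/file_explorer_server.py"]
      }
    },
    {
      "name": "Weather API",
      "transport": {
        "type": "sse",
        "url": "https://example.com/mcp-server"
      }
    }
  ]
}
"""

config = json.loads(raw)

for server in config["servers"]:
    transport = server["transport"]
    if transport["type"] == "stdio":
        # Local server: the client spawns the command as a subprocess
        command_line = " ".join([transport["command"], *transport.get("args", [])])
        print(f"{server['name']}: spawn `{command_line}`")
    elif transport["type"] == "sse":
        # Remote server: the client connects to the URL over HTTP/SSE
        print(f"{server['name']}: connect to {transport['url']}")
```

Running this prints one line per server, making the difference between the two transport types concrete: a `stdio` server is a local process the client launches, while an `sse` server is a URL the client connects to.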
## Tiny Agents Clients
Now, let's explore how to use MCP Clients within code.
You can also use tiny agents as MCP Clients to connect directly to MCP servers from your code. Tiny agents provide a simple way to create AI agents that can use tools from MCP servers.
Tiny Agents can run MCP servers from the command line. To do this, we will need to install `npm` and run the server with `npx`. **We'll need these for both Python and JavaScript.**
Let's install `npx` with `npm`. If you don't have `npm` installed, check out the [npm documentation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
```bash
# install npx
npm install -g npx
```
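If you are not sure whether these tools are already available on your `PATH`, a quick check with the Python standard library (a convenience sketch, not part of the course setup) looks like this:

```python
import shutil

# Report whether node, npm, and npx can be found on PATH
for tool in ("node", "npm", "npx"):
    path = shutil.which(tool)
    print(f"{tool}: {path if path else 'not found'}")
```

`shutil.which` returns the full path of the executable, or `None` if it is not installed, so any line showing `not found` means you still need to install that tool.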
<hfoptions id="language">
<hfoption id="python">
First, install the tiny agents package:
```bash
pip install "huggingface_hub[mcp]>=0.32.0"
```
Next, let's log in to the Hugging Face Hub. You will need a [login token](https://huggingface.co/docs/huggingface_hub/v0.32.3/en/quick-start#authentication) to do this.
The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider, and it comes equipped with a playwright MCP server, which lets it use a web browser! The agent config is loaded by specifying [its path in the `tiny-agents/tiny-agents`](https://huggingface.co/datasets/tiny-agents/tiny-agents/tree/main/celinah/web-browser) Hugging Face dataset.
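As a rough sketch of what such an agent configuration might contain (the exact schema and field names may differ — check the dataset itself for the real file), it ties a model and inference provider to a list of MCP servers, with the playwright server launched via `npx`:

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  ]
}
```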
When you run the agent, you'll see it load, listing the tools it has discovered from its connected MCP servers. Then, it's ready for your prompts!
Prompt used in this demo:
> do a Web Search for HF inference providers on Brave Search and open the first result and then give me the list of the inference providers supported on Hugging Face