179 changes: 88 additions & 91 deletions units/en/unit1/mcp-clients.mdx
In this section, you will:
* Discover how to use Hugging Face's MCP Client implementation
* See practical examples of MCP Client usage

<Tip>

In this page, we'll show examples of how to set up MCP Clients in a few different ways using JSON notation. For now, we'll use *examples* like `path/to/server.py` to represent the path to the MCP Server; in the next unit, we'll implement this with real MCP Servers. Focus on understanding the MCP Client notation for now.

</Tip>

## Understanding MCP Clients

MCP Clients are crucial components that act as the bridge between AI applications (Hosts) and external capabilities provided by MCP Servers. Think of the Host as your main application (like an AI assistant or IDE) and the Client as a specialized module within that Host responsible for handling MCP communications.
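To make the Host/Client split concrete, here is a minimal, hypothetical sketch of a host that maintains one client per server connection. The class names are invented for illustration and are not part of any MCP SDK:

```python
from dataclasses import dataclass, field

@dataclass
class MCPClientStub:
    """One client per server connection, as the protocol prescribes."""
    server_name: str
    connected: bool = False

    def connect(self) -> None:
        # A real client would perform the MCP initialize handshake here.
        self.connected = True

@dataclass
class HostStub:
    """The host application owns one dedicated client per configured server."""
    clients: dict = field(default_factory=dict)

    def add_server(self, name: str) -> None:
        client = MCPClientStub(server_name=name)
        client.connect()
        self.clients[name] = client

host = HostStub()
host.add_server("file-explorer")
host.add_server("weather-api")
print(sorted(host.clients))  # → ['file-explorer', 'weather-api']
```

The point of the sketch is the 1:1 relationship: the host never talks to a server directly; it always goes through a client dedicated to that connection.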
Fortunately, the configuration files are very simple, easy to understand, and consistent across major MCP hosts.

The standard configuration file for MCP is named `mcp.json`. Here's the basic structure, which can be passed to applications like Claude Desktop, Cursor, or VS Code:

```json
{
"servers": [
Expand Down Expand Up @@ -114,14 +124,6 @@ Environment variables can be passed to server processes using the `env` field. H
<hfoptions id="env-variables">
<hfoption id="python">


In Python, we use the `os` module to access environment variables:

```python
import os

# Access the token passed in via the `env` field of the client configuration
github_token = os.environ.get("GITHUB_TOKEN")
if not github_token:
    raise ValueError("GITHUB_TOKEN environment variable is required")
```

</hfoption>
</hfoptions>

The corresponding configuration in `mcp.json` would look like this:

```json
{
  "servers": [
    {
      "name": "GitHub API",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["/path/to/github_server.py"], // This is an example, we'll use a real server in the next unit
        "env": {
          "GITHUB_TOKEN": "your_github_token"
        }
      }
    }
  ]
}
```

In this scenario, we have a local server that is a Python script.

```json
{
  "servers": [
    {
      "name": "File Explorer",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["/path/to/file_explorer_server.py"] // This is an example, we'll use a real server in the next unit
      }
    }
  ]
}
```

In this scenario, we have a remote server that is a weather API.

```json
{
  "servers": [
    {
      "name": "Weather API",
      "transport": {
        "type": "sse",
        "url": "https://example.com/mcp-server" // This is an example, we'll use a real server in the next unit
      }
    }
  ]
}
```

Proper configuration is essential for successfully deploying MCP integrations.

In the next section, we'll explore the ecosystem of MCP servers available on Hugging Face Hub and how to publish your own servers there.
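To tie the notation together, here is a small, hypothetical sketch (not part of any MCP SDK) that parses an `mcp.json` document and dispatches on the transport type, the way a client must before launching or contacting a server:

```python
import json

# A hypothetical mcp.json, following the structure shown above.
CONFIG = """
{
  "servers": [
    {
      "name": "File Explorer",
      "transport": {
        "type": "stdio",
        "command": "python",
        "args": ["/path/to/file_explorer_server.py"]
      }
    },
    {
      "name": "Weather API",
      "transport": {"type": "sse", "url": "https://example.com/mcp-server"}
    }
  ]
}
"""

def describe(server: dict) -> str:
    """Summarize how a client would launch or reach this server."""
    transport = server["transport"]
    if transport["type"] == "stdio":
        # stdio servers are spawned as local subprocesses.
        return f"{server['name']}: run {transport['command']} {' '.join(transport['args'])}"
    if transport["type"] == "sse":
        # sse servers are reached over HTTP.
        return f"{server['name']}: connect to {transport['url']}"
    raise ValueError(f"unknown transport: {transport['type']}")

for server in json.loads(CONFIG)["servers"]:
    print(describe(server))
```

Running this prints one line per server: the stdio entry resolves to a command to spawn, while the sse entry resolves to a URL to connect to.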

## Tiny Agents Clients

Now, let's explore how to use MCP Clients within code.

You can also use tiny agents as MCP Clients to connect directly to MCP servers from your code. Tiny agents provide a simple way to create AI agents that can use tools from MCP servers.

Tiny agents can run MCP servers from a command-line environment. To do this, we will need to install `npm` and run the server with `npx`. **We'll need these for both Python and JavaScript.**

Let's install `npx` with `npm`. If you don't have `npm` installed, check out the [npm documentation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).

```bash
# install npx
npm install -g npx
```

<hfoptions id="language">
<hfoption id="python">

First, install the tiny agents package:

```bash
pip install "huggingface_hub[mcp]>=0.32.0"
```

Next, log in to the Hugging Face Hub. You will need a [login token](https://huggingface.co/docs/huggingface_hub/v0.32.3/en/quick-start#authentication) to do this.

```bash
huggingface-cli login
```

Now, let's create an agent configuration file `agent.json`.

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp@latest"]
      }
    }
  ]
}
```

In this configuration, we are using the `@playwright/mcp` MCP server, an MCP server that can control a browser with Playwright.

Now you can run the agent:

```bash
tiny-agents run agent.json
```

</hfoption>
<hfoption id="javascript">

First, install the tiny agents package with [npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).

```bash
npm install @huggingface/tiny-agents
```

Make an agent project directory and create an `agent.json` file.

```bash
mkdir my-agent
touch my-agent/agent.json
```

Create the agent configuration file at `my-agent/agent.json`:

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp@latest"]
      }
    }
  ]
}
```

Now you can run the agent:

```bash
npx @huggingface/tiny-agents run ./my-agent
```

</hfoption>
</hfoptions>

The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider, and it comes equipped with a Playwright MCP server, which lets it use a web browser. The agent config can also be loaded by specifying [its path in the `tiny-agents/tiny-agents`](https://huggingface.co/datasets/tiny-agents/tiny-agents/tree/main/celinah/web-browser) Hugging Face dataset.

In the video below, we run the agent and ask it to open a new tab in the browser.

<video controls autoplay loop>
<source src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/python-tiny-agents/web_browser_agent.mp4" type="video/mp4">
</video>

When you run the agent, you'll see it load, listing the tools it has discovered from its connected MCP servers. Then, it's ready for your prompts!

Prompt used in this demo:

> do a Web Search for HF inference providers on Brave Search and open the first result and then give me the list of the inference providers supported on Hugging Face
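Before launching, it can help to sanity-check the structure of an `agent.json` file. The helper below is an illustrative, hypothetical sketch (tiny-agents performs its own validation; `validate_agent_config` is not a real API):

```python
import json

# The same agent.json used above, inlined as a string for the example.
AGENT_JSON = """
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "config": {"command": "npx", "args": ["@playwright/mcp@latest"]}
    }
  ]
}
"""

def validate_agent_config(config: dict) -> list[str]:
    """Return a list of human-readable problems; an empty list means OK."""
    problems = []
    if "model" not in config:
        problems.append("missing 'model'")
    for i, server in enumerate(config.get("servers", [])):
        # stdio servers must say which command to spawn.
        if server.get("type") == "stdio" and "command" not in server.get("config", {}):
            problems.append(f"server {i}: stdio transport needs a 'command'")
    return problems

config = json.loads(AGENT_JSON)
print(validate_agent_config(config))  # an empty list means the config looks OK
```

A check like this catches the most common mistake — a stdio server entry with no `command` — before the agent tries to spawn it.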

## Next Steps

30 changes: 17 additions & 13 deletions units/en/unit2/gradio-client.mdx
We'll connect to the MCP server we created in the previous section and use it to answer questions.
First, we need to install the `smolagents`, `gradio`, and `mcp` client libraries, if we haven't already:

```bash
pip install "smolagents[mcp]" "gradio[mcp]" mcp fastmcp
```

Now, we can import the necessary libraries and create a simple Gradio interface that uses the MCP Client to connect to the MCP Server.
```python
import gradio as gr

from mcp import StdioServerParameters
from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient
```

Next, we'll connect to the MCP Server and get the tools that we can use to answer questions.

```python
mcp_client = MCPClient(
{"url": "http://localhost:7860/gradio_api/mcp/sse"} # This is the MCP Server we created in the previous section
)
tools = mcp_client.get_tools()
```
Here's the complete example of the MCP Client in Gradio:
```python
import gradio as gr

from mcp import StdioServerParameters
from smolagents import InferenceClientModel, CodeAgent, ToolCollection, MCPClient


try:
mcp_client = MCPClient(
{"url": "http://localhost:7860/gradio_api/mcp/sse"} # This is the MCP Server we created in the previous section
)
tools = mcp_client.get_tools()

To deploy your Gradio MCP client to Hugging Face Spaces:
- Choose "Gradio" as the SDK
- Name your space (e.g., "mcp-client")

2. Update MCP Server URL in the code:

```python
mcp_client = MCPClient(
    {"url": "http://localhost:7860/gradio_api/mcp/sse"}  # Replace with your deployed MCP Server URL
)
```

3. Create a `requirements.txt` file:
```txt
gradio[mcp]
smolagents[mcp]
```

4. Push your code to the Space:
```bash
git init
git add server.py requirements.txt