# Agents on the Hub

This page compiles all the libraries and tools Hugging Face offers for agentic workflows:
- `HF MCP Server`: Connect your MCP-compatible AI assistant directly to the Hugging Face Hub.
- `tiny-agents`: A lightweight toolkit for MCP-powered agents, available in both JS (`@huggingface/tiny-agents`) and Python (`huggingface_hub`).
- `Gradio MCP Server`: Easily create MCP servers from Gradio apps and Spaces.
- `smolagents`: A Python library that enables you to run powerful agents in a few lines of code.

## HF MCP Server

The official **Hugging Face MCP (Model Context Protocol) Server** enables seamless integration between the Hugging Face Hub and any MCP-compatible AI assistant, including VSCode, Cursor, and Claude Desktop.

With the HF MCP Server, you can enhance your AI assistant's capabilities by connecting directly to the Hub's ecosystem. It comes with:
- a curated set of **built-in tools** like Spaces and Papers Semantic Search, Model and Dataset exploration, etc.
- **MCP-compatible Gradio apps**: Connect to any [MCP-compatible Gradio app](https://huggingface.co/spaces?filter=mcp-server) built by the Hugging Face community

### Getting Started

Visit [huggingface.co/settings/mcp](https://huggingface.co/settings/mcp) to configure your MCP client and get started.
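
In most MCP clients this amounts to adding a small JSON entry pointing at the Hub's MCP endpoint. The settings page generates the exact snippet for you; the server name and token placeholder below are illustrative:

```json
{
  "mcpServers": {
    "hf-mcp-server": {
      "url": "https://huggingface.co/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_HF_TOKEN>"
      }
    }
  }
}
```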

<Tip warning={true}>

This feature is experimental ⚗️ and will continue to evolve.

</Tip>

## smolagents

[smolagents](https://github.com/huggingface/smolagents) is a Python library that enables you to run powerful agents in a few lines of code. Its `MCPClient` lets an agent use tools from any MCP server.
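
A minimal sketch of that pattern, assuming smolagents is installed with MCP support; the MCP server, model class, and prompt below are illustrative choices, not defaults:

```python
import os

from mcp import StdioServerParameters
from smolagents import CodeAgent, InferenceClientModel, MCPClient

# Any stdio MCP server works here; this PubMed server is just an example.
server_parameters = StdioServerParameters(
    command="uvx",
    args=["--quiet", "pubmedmcp@0.1.3"],
    env={"UV_PYTHON": "3.12", **os.environ},
)

# MCPClient exposes the server's tools to the agent for the duration of the block.
with MCPClient(server_parameters) as tools:
    agent = CodeAgent(tools=tools, model=InferenceClientModel())
    agent.run("Find studies on the effects of sleep on memory.")
```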

Learn more [in the documentation](https://huggingface.co/docs/smolagents/tutorials/tools#use-mcp-tools-with-mcpclient-directly).

## tiny-agents (JS and Python)

`tiny-agents` is a lightweight toolkit for running and building MCP-powered agents on top of the Hugging Face Inference Client and the Model Context Protocol (MCP). It is available as the JS package `@huggingface/tiny-agents` and in the `huggingface_hub` Python package.

### @huggingface/tiny-agents (JS)

The `@huggingface/tiny-agents` package offers a straightforward CLI and a simple programmatic API for running and building MCP-powered agents in JS.

**Getting Started**

First, install the package:

```bash
npm install @huggingface/tiny-agents
# or
pnpm add @huggingface/tiny-agents
```

Then, you can run your agent:
```bash
npx @huggingface/tiny-agents [command] "agent/id"

Usage:
  tiny-agents [flags]
  tiny-agents run   "agent/id"
  tiny-agents serve "agent/id"

Available Commands:
  run     Run the Agent in command-line
  serve   Run the Agent as an OpenAI-compatible HTTP server
```

You can load agents directly from the [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) Dataset, or specify a path to your own local agent configuration.
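
For example, assuming a local agent folder laid out as in the "Custom Agents" section below (the `./my-agent` path is illustrative):

```bash
# Run an agent defined in ./my-agent/agent.json
npx @huggingface/tiny-agents run ./my-agent

# Or serve it as an OpenAI-compatible HTTP endpoint
npx @huggingface/tiny-agents serve ./my-agent
```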

**Advanced Usage**

In addition to the CLI, you can use the `Agent` class for more fine-grained control. For lower-level interactions, use the `MCPClient` from the `@huggingface/mcp-client` package to connect directly to MCP servers and manage tool calls.
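
For illustration, a minimal programmatic sketch; the model, provider, and Playwright server are example choices, and the authoritative API lives in the huggingface.js docs linked below:

```typescript
import { Agent } from "@huggingface/tiny-agents";

const agent = new Agent({
  provider: "nebius", // example: any supported Inference Provider
  model: "Qwen/Qwen2.5-72B-Instruct",
  apiKey: process.env.HF_TOKEN,
  servers: [
    {
      type: "stdio",
      config: {
        command: "npx",
        args: ["@playwright/mcp@latest"],
      },
    },
  ],
});

await agent.loadTools();
// run() streams chunks (text deltas and tool-call events) as the agent works
for await (const chunk of agent.run("Which tools do you have access to?")) {
  console.log(chunk);
}
```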

Learn more about tiny-agents in the [huggingface.js documentation](https://huggingface.co/docs/huggingface.js/en/tiny-agents/README).

### huggingface_hub (Python)

The `huggingface_hub` library is the easiest way to run MCP-powered agents in Python. It includes a high-level `tiny-agents` CLI as well as programmatic access via the `Agent` and `MCPClient` classes, all built to work with [Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index), local LLMs, or any inference endpoint compatible with OpenAI's API specs.

**Getting Started**

Install the latest version with MCP support:
```bash
pip install "huggingface_hub[mcp]>=0.32.2"
```

Then, you can run your agent:
```bash
> tiny-agents run --help

 Usage: tiny-agents run [OPTIONS] [PATH] COMMAND [ARGS]...

 Run the Agent in the CLI

╭─ Arguments ──────────────────────────────────────────────────────────────────────────────────────╮
│  path      [PATH]  Path to a local folder containing an agent.json file or a built-in agent      │
│                    stored in the 'tiny-agents/tiny-agents' Hugging Face dataset                  │
│                    (https://huggingface.co/datasets/tiny-agents/tiny-agents)                     │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ────────────────────────────────────────────────────────────────────────────────────────╮
│  --help        Show this message and exit.                                                       │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
```

The CLI pulls the config, connects to its MCP servers, prints the available tools, and waits for your prompt.
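
For example, to run an agent from a local folder (the `./my-agent` path is illustrative; see "Custom Agents" below):

```bash
tiny-agents run ./my-agent
```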

**Advanced Usage**

For more fine-grained control, use the `MCPClient` directly. This low-level interface extends `AsyncInferenceClient` and allows LLMs to call tools via the Model Context Protocol (MCP). It supports both local (`stdio`) and remote (`http`/`sse`) MCP servers, handles tool registration and execution, and streams results back to the model in real time.
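
As a rough sketch of that flow (the provider, model, and Playwright server below are illustrative choices; the exact, current API is documented at the link below):

```python
import asyncio
import os

from huggingface_hub import MCPClient


async def main():
    # MCPClient extends AsyncInferenceClient: same auth, plus MCP tool handling
    async with MCPClient(
        model="Qwen/Qwen2.5-72B-Instruct",
        provider="nebius",
        api_key=os.environ["HF_TOKEN"],
    ) as client:
        # Register a local stdio MCP server (Playwright here, as an example)
        await client.add_mcp_server(
            type="stdio",
            command="npx",
            args=["@playwright/mcp@latest"],
        )
        messages = [{"role": "user", "content": "Open hf.co and list trending models"}]
        # Stream the model's answer, including any tool calls it makes
        async for chunk in client.process_single_turn_with_tools(messages):
            print(chunk)


asyncio.run(main())
```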

Learn more in the [`huggingface_hub` MCP documentation](https://huggingface.co/docs/huggingface_hub/main/en/package_reference/mcp).

### Custom Agents

To create your own agent, simply create a folder (e.g., `my-agent/`) and define your agent's configuration in an `agent.json` file.
The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider, equipped with a Playwright MCP server that lets it use a web browser.

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp@latest"]
      }
    }
  ]
}
```

To use a local LLM (such as [llama.cpp](https://github.com/ggerganov/llama.cpp) or [LM Studio](https://lmstudio.ai/)), just provide an `endpointUrl`:

```json
{
  "model": "Qwen/Qwen3-32B",
  "endpointUrl": "http://localhost:1234/v1",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp@latest"]
      }
    }
  ]
}
```

Optionally, add a `PROMPT.md` to customize the system prompt.

<Tip>

Don't hesitate to contribute your agent to the community by opening a Pull Request in the [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) Hugging Face dataset.

</Tip>

## Gradio MCP Server / Tools