|
# Agents on the Hub

This page compiles all the libraries and tools Hugging Face offers for agentic workflows:

- `tiny-agents`: a lightweight toolkit for MCP-powered agents, available in both JS (`@huggingface/tiny-agents`) and Python (`huggingface_hub`).
- `Gradio MCP Server`: easily create MCP servers from Gradio apps and Spaces.
- `smolagents`: a Python library that enables you to run powerful agents in a few lines of code.
|
## smolagents

[smolagents](https://github.com/huggingface/smolagents) is a Python library that enables you to run powerful agents in a few lines of code. It also lets an agent use tools exposed by MCP servers through its `MCPClient`, via the `with MCPClient(server_parameters) as tools:` pattern.
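
A minimal sketch of that pattern, assuming the reference `mcp-server-fetch` server (launched via `uvx`) and an illustrative prompt; adapt the server parameters and model to your own setup:

```python
import os

from mcp import StdioServerParameters
from smolagents import CodeAgent, InferenceClientModel, MCPClient

# Illustrative stdio MCP server: the reference "fetch" server, launched via uvx.
server_parameters = StdioServerParameters(
    command="uvx",
    args=["mcp-server-fetch"],
    env={**os.environ},
)

# MCPClient connects to the server and exposes its tools; the context manager
# closes the connection when the block exits.
with MCPClient(server_parameters) as tools:
    agent = CodeAgent(tools=tools, model=InferenceClientModel())
    agent.run("Fetch https://huggingface.co/blog and summarize the top post.")
```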
|
Learn more [in the documentation](https://huggingface.co/docs/smolagents/tutorials/tools#use-mcp-tools-with-mcpclient-directly).
|
## tiny-agents (JS and Python)
|
`tiny-agents` is a lightweight toolkit for running and building MCP-powered agents on top of the Hugging Face Inference Client and the Model Context Protocol (MCP). It is available as the JS package `@huggingface/tiny-agents` and in the `huggingface_hub` Python package.

### @huggingface/tiny-agents (JS)

The `@huggingface/tiny-agents` package offers a straightforward CLI and a programmatic API for running and building MCP-powered agents in JS.

**Getting Started**

First, install the package:
|
```bash
npm install @huggingface/tiny-agents
# or
pnpm add @huggingface/tiny-agents
```

Then, you can run your agent:
```bash
npx @huggingface/tiny-agents [command] "agent/id"

Usage:
  tiny-agents [flags]
  tiny-agents run "agent/id"
  tiny-agents serve "agent/id"

Available Commands:
  run      Run the Agent in command-line
  serve    Run the Agent as an OpenAI-compatible HTTP server
```

You can load agents directly from the [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) dataset, or specify a path to your own local agent configuration.
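
For example, to run an agent defined in a local folder (a hypothetical `./my-agent/` directory containing an `agent.json`, as described in the Custom Agents section below):

```bash
npx @huggingface/tiny-agents run ./my-agent
```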

**Advanced Usage**

In addition to the CLI, you can use the `Agent` class for more fine-grained control. For lower-level interactions, use the `MCPClient` from the `@huggingface/mcp-client` package to connect directly to MCP servers and manage tool calls.
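
As a rough sketch of the programmatic route, assuming the `Agent` constructor mirrors the `agent.json` fields shown in the Custom Agents section below (check the linked docs for the exact option types):

```ts
import { Agent } from "@huggingface/tiny-agents";

const agent = new Agent({
  provider: "nebius", // assumption: provider/model/servers mirror agent.json
  model: "Qwen/Qwen2.5-72B-Instruct",
  apiKey: process.env.HF_TOKEN,
  servers: [
    {
      type: "stdio",
      config: {
        command: "npx",
        args: ["@playwright/mcp@latest"],
      },
    },
  ],
});

await agent.loadTools();
// Responses stream back as chat-completion chunks.
for await (const chunk of agent.run("Open hf.co and list the trending models")) {
  if ("choices" in chunk) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}
```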

Learn more about tiny-agents in the [huggingface.js documentation](https://huggingface.co/docs/huggingface.js/en/tiny-agents/README).

### huggingface_hub (Python)

The `huggingface_hub` library is the easiest way to run MCP-powered agents in Python. It includes a high-level `tiny-agents` CLI as well as programmatic access via the `Agent` and `MCPClient` classes, all built to work with [Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index), local LLMs, or any inference endpoint compatible with OpenAI's API specs.

**Getting Started**

Install the latest version with MCP support:
```bash
pip install "huggingface_hub[mcp]>=0.32.2"
```

Then, you can run your agent:
```bash
> tiny-agents run --help

 Usage: tiny-agents run [OPTIONS] [PATH] COMMAND [ARGS]...

 Run the Agent in the CLI

╭─ Arguments ──────────────────────────────────────────────────────────────────────────╮
│   path      [PATH]  Path to a local folder containing an agent.json file or a        │
│                     built-in agent stored in the 'tiny-agents/tiny-agents' Hugging   │
│                     Face dataset                                                     │
│                     (https://huggingface.co/datasets/tiny-agents/tiny-agents)        │
╰──────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ────────────────────────────────────────────────────────────────────────────╮
│ --help          Show this message and exit.                                          │
╰──────────────────────────────────────────────────────────────────────────────────────╯
```

The CLI pulls the config, connects to its MCP servers, prints the available tools, and waits for your prompt.
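
For instance, reusing the hypothetical `./my-agent/` folder from the Custom Agents section:

```bash
tiny-agents run ./my-agent
```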

**Advanced Usage**

For more fine-grained control, use the `MCPClient` directly. This low-level interface extends `AsyncInferenceClient` and allows LLMs to call tools via MCP. It supports both local (`stdio`) and remote (`http`/`sse`) MCP servers, handles tool registration and execution, and streams results back to the model in real time.
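
A rough sketch of that flow, assuming the documented `add_mcp_server`, `process_single_turn_with_tools`, and `cleanup` helpers (see the linked reference below for exact signatures), with the reference `mcp-server-fetch` server as a stand-in:

```python
import asyncio

from huggingface_hub import MCPClient


async def main() -> None:
    # MCPClient extends AsyncInferenceClient; HF_TOKEN is read from the environment.
    client = MCPClient(model="Qwen/Qwen2.5-72B-Instruct", provider="nebius")
    try:
        # Register a local stdio MCP server (the reference "fetch" server here).
        await client.add_mcp_server(type="stdio", command="uvx", args=["mcp-server-fetch"])

        messages = [{"role": "user", "content": "Fetch https://huggingface.co and summarize it."}]
        # One turn: the model may call tools, and results stream back as chunks.
        async for chunk in client.process_single_turn_with_tools(messages):
            if hasattr(chunk, "choices") and chunk.choices:
                print(chunk.choices[0].delta.content or "", end="")
    finally:
        await client.cleanup()


asyncio.run(main())
```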

Learn more in the [`huggingface_hub` MCP documentation](https://huggingface.co/docs/huggingface_hub/main/en/package_reference/mcp).

### Custom Agents

To create your own agent, simply create a folder (e.g., `my-agent/`) and define your agent's configuration in an `agent.json` file.
The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider. It comes equipped with a Playwright MCP server, which lets it use a web browser.

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp@latest"]
      }
    }
  ]
}
```

To use a local LLM (such as [llama.cpp](https://github.com/ggerganov/llama.cpp) or [LM Studio](https://lmstudio.ai/)), just provide an `endpointUrl`:

```json
{
  "model": "Qwen/Qwen3-32B",
  "endpointUrl": "http://localhost:1234/v1",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp@latest"]
      }
    }
  ]
}
```

Optionally, add a `PROMPT.md` to customize the system prompt.
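
A typical layout for such an agent folder:

```
my-agent/
├── agent.json
└── PROMPT.md   # optional system prompt
```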

<Tip>

Don't hesitate to contribute your agent to the community by opening a Pull Request in the [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) Hugging Face dataset.

</Tip>
|
## Gradio MCP Server / Tools
|
|