Learn more [in the documentation](https://huggingface.co/docs/smolagents/tutorials/tools#use-mcp-tools-with-mcpclient-directly).
## tiny-agents (JS and Python)
`tiny-agents` is a lightweight toolkit for running and building MCP-powered agents on top of the Hugging Face Inference Client + Model Context Protocol (MCP). It is available as a JS package `@huggingface/tiny-agents` and in the `huggingface_hub` Python package.
### @huggingface/tiny-agents (JS)
The `@huggingface/tiny-agents` package offers a straightforward CLI and a simple programmatic API for running and building MCP-powered agents in JS.

**Getting Started**

First, you need to install the package:
```bash
npm install @huggingface/tiny-agents
# or
pnpm add @huggingface/tiny-agents
```

Then, you can run your agent:
```bash
npx @huggingface/tiny-agents [command] "agent/id"

Usage:
  tiny-agents [flags]
  tiny-agents run "agent/id"
  ...

Available Commands:
  ...
```
You can load agents directly from the Hugging Face Hub [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) Dataset, or specify a path to your own local agent configuration.
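For example (`username/agent-id` below is a placeholder for any entry in that dataset):

```bash
# Run an agent published in the tiny-agents dataset on the Hub
npx @huggingface/tiny-agents run "username/agent-id"

# Run an agent from a local folder containing an agent.json
npx @huggingface/tiny-agents run ./my-agent
```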
**Advanced Usage**
In addition to the CLI, you can use the `Agent` class for more fine-grained control. For lower-level interactions, use the `MCPClient` from the `@huggingface/mcp-client` package to connect directly to MCP servers and manage tool calls.
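As a minimal, illustrative sketch of the programmatic route — assuming the `Agent` constructor accepts an `apiKey` plus the same `provider`/`model`/`servers` fields as the `agent.json` format shown below, and that `loadTools()`/`run()` behave as described in the docs linked below:

```ts
import { Agent } from "@huggingface/tiny-agents";

// Assumed to mirror the agent.json schema (provider/model/servers);
// check the huggingface.js docs linked below for the exact signature.
const agent = new Agent({
  provider: "nebius",
  model: "Qwen/Qwen2.5-72B-Instruct",
  apiKey: process.env.HF_TOKEN,
  servers: [
    {
      type: "stdio",
      config: {
        command: "npx",
        args: ["@playwright/mcp@latest"],
      },
    },
  ],
});

await agent.loadTools();
for await (const chunk of agent.run("Open huggingface.co and list the trending models")) {
  // Stream assistant text deltas; tool-call chunks are skipped in this sketch
  if ("choices" in chunk) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}
```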
Learn more about tiny-agents in the [huggingface.js documentation](https://huggingface.co/docs/huggingface.js/en/tiny-agents/README).
### huggingface_hub (Python)
The `huggingface_hub` library is the easiest way to run MCP-powered agents in Python. It includes a high-level `tiny-agents` CLI as well as programmatic access via the `Agent` and `MCPClient` classes — all built to work with [Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index), local LLMs, or any inference endpoint compatible with OpenAI's API specs.
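As a minimal sketch of the programmatic route — assuming `Agent` is importable from `huggingface_hub` and mirrors the `agent.json` fields shown below; see the MCP documentation linked below for the exact API:

```python
import asyncio

from huggingface_hub import Agent

# Assumed to mirror the agent.json schema (model/provider/servers)
agent = Agent(
    model="Qwen/Qwen2.5-72B-Instruct",
    provider="nebius",
    servers=[
        {
            "type": "stdio",
            "config": {
                "command": "npx",
                "args": ["@playwright/mcp@latest"],
            },
        }
    ],
)

async def main() -> None:
    await agent.load_tools()
    # Stream the agent's answer; tool-call events are ignored in this sketch
    async for chunk in agent.run("Open huggingface.co and list the trending models"):
        if getattr(chunk, "choices", None):
            print(chunk.choices[0].delta.content or "", end="")

asyncio.run(main())
```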
For more fine-grained control, use the `MCPClient` directly.

Learn more in the [`huggingface_hub` MCP documentation](https://huggingface.co/docs/huggingface_hub/main/en/package_reference/mcp).
### Custom Agents
To create your own agent, simply create a folder (e.g., `my-agent/`) and define your agent’s configuration in an `agent.json` file.
The following example shows a web-browsing agent that uses the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider, equipped with a Playwright MCP server that lets it drive a web browser.
```json
{
    "model": "Qwen/Qwen2.5-72B-Instruct",
    "provider": "nebius",
    "servers": [
        {
            "type": "stdio",
            "config": {
                "command": "npx",
                "args": ["@playwright/mcp@latest"]
            }
        }
    ]
}
```
To use a local LLM (such as [llama.cpp](https://github.com/ggerganov/llama.cpp) or [LM Studio](https://lmstudio.ai/)), just provide an `endpointUrl`:
```json
{
    "model": "Qwen/Qwen3-32B",
    "endpointUrl": "http://localhost:1234/v1",
    "servers": [
        {
            "type": "stdio",
            "config": {
                "command": "npx",
                "args": ["@playwright/mcp@latest"]
            }
        }
    ]
}
```
Optionally, add a `PROMPT.md` to customize the system prompt.
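Putting it together, a minimal agent folder might look like this (an illustrative layout, not a required convention):

```
my-agent/
├── agent.json   # model, provider or endpointUrl, and MCP servers
└── PROMPT.md    # optional custom system prompt
```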
166
+
154
167
<Tip>

Don't hesitate to contribute your agent to the community by opening a Pull Request in the [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) Hugging Face dataset.