Commit f9dcd3d (parent: 6f12306)

review suggestions

1 file changed: docs/hub/agents.md (+59 lines, -46 lines)
@@ -43,19 +43,30 @@ with MCPClient(server_parameters) as tools:

Learn more [in the documentation](https://huggingface.co/docs/smolagents/tutorials/tools#use-mcp-tools-with-mcpclient-directly).

## tiny-agents (JS and Python)
`tiny-agents` is a lightweight toolkit for running and building MCP-powered agents on top of the Hugging Face Inference Client + Model Context Protocol (MCP). It is available as a JS package `@huggingface/tiny-agents` and in the `huggingface_hub` Python package.

### @huggingface/tiny-agents (JS)

The `@huggingface/tiny-agents` package offers a straightforward CLI and a simple programmatic API for running and building MCP-powered agents in JS.
**Getting Started**

First, install the package:

```bash
npm install @huggingface/tiny-agents
# or
pnpm add @huggingface/tiny-agents
```

Then, you can run your agent:

```bash
npx @huggingface/tiny-agents [command] "agent/id"

Usage:
  tiny-agents [flags]
  tiny-agents run "agent/id"
```
@@ -68,53 +79,12 @@ Available Commands:
You can load agents directly from the Hugging Face Hub [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) Dataset, or specify a path to your own local agent configuration.
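
As a rough illustration of that lookup order, here is a minimal Python sketch; the `resolve_agent_source` helper and the fall-back-to-Hub rule are assumptions for illustration, not the CLI's actual implementation:

```python
from pathlib import Path

# Hub dataset that hosts shared agent configurations (from the docs above).
HUB_DATASET = "tiny-agents/tiny-agents"


def resolve_agent_source(spec: str) -> tuple[str, str]:
    """Classify an agent spec as a local folder or a Hub dataset entry.

    Returns ("local", path) when the spec points to an existing directory
    containing an agent.json, otherwise ("hub", "<dataset>/<spec>").
    """
    path = Path(spec)
    if path.is_dir() and (path / "agent.json").is_file():
        return ("local", str(path))
    # Anything else is treated as an entry in the shared Hub dataset.
    return ("hub", f"{HUB_DATASET}/{spec}")
```

For example, `resolve_agent_source("./my-agent")` picks the local folder when it contains an `agent.json`, while a bare id is looked up in the dataset.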
**Advanced Usage**

In addition to the CLI, you can use the `Agent` class for more fine-grained control. For lower-level interactions, use the `MCPClient` from the `@huggingface/mcp-client` package to connect directly to MCP servers and manage tool calls.

Learn more about tiny-agents in the [huggingface.js documentation](https://huggingface.co/docs/huggingface.js/en/tiny-agents/README).

### huggingface_hub (Python)

The `huggingface_hub` library is the easiest way to run MCP-powered agents in Python. It includes a high-level `tiny-agents` CLI as well as programmatic access via the `Agent` and `MCPClient` classes — all built to work with [Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index), local LLMs, or any inference endpoint compatible with OpenAI's API specs.

@@ -151,6 +121,49 @@ For more fine-grained control, use the `MCPClient` directly. This low-level inte
Learn more in the [`huggingface_hub` MCP documentation](https://huggingface.co/docs/huggingface_hub/main/en/package_reference/mcp).

### Custom Agents

To create your own agent, create a folder (e.g., `my-agent/`) and define its configuration in an `agent.json` file. The following example shows a web-browsing agent configured to use the [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) model via the Nebius inference provider; it comes equipped with a Playwright MCP server, which lets it use a web browser.

```json
{
    "model": "Qwen/Qwen2.5-72B-Instruct",
    "provider": "nebius",
    "servers": [
        {
            "type": "stdio",
            "config": {
                "command": "npx",
                "args": ["@playwright/mcp@latest"]
            }
        }
    ]
}
```
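The shape of this file can be checked programmatically. The following is a minimal Python sketch of a validator for the fields used above (`model`, `provider` or `endpointUrl`, and the `servers` list); the helper name and the set of accepted server types are illustrative assumptions, not part of tiny-agents:

```python
def validate_agent_config(config: dict) -> list[str]:
    """Return a list of problems found in an agent.json-style dict (empty if OK).

    NOTE: illustrative sketch only; the accepted server types below are an
    assumption, not the package's authoritative schema.
    """
    problems = []
    if not config.get("model"):
        problems.append("'model' is required")
    # The model must be reachable either via an inference provider
    # or via a local/remote OpenAI-compatible endpoint.
    if not config.get("provider") and not config.get("endpointUrl"):
        problems.append("either 'provider' or 'endpointUrl' must be set")
    for i, server in enumerate(config.get("servers", [])):
        if server.get("type") != "stdio":  # only the type shown in this doc
            problems.append(f"servers[{i}]: unrecognized type {server.get('type')!r}")
        elif "command" not in server.get("config", {}):
            problems.append(f"servers[{i}]: stdio servers need a 'command'")
    return problems
```

A config like the JSON above passes with no problems; an empty dict reports the missing `model` and missing `provider`/`endpointUrl`.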

To use a local LLM (such as [llama.cpp](https://github.com/ggerganov/llama.cpp) or [LM Studio](https://lmstudio.ai/)), just provide an `endpointUrl`:

```json
{
    "model": "Qwen/Qwen3-32B",
    "endpointUrl": "http://localhost:1234/v1",
    "servers": [
        {
            "type": "stdio",
            "config": {
                "command": "npx",
                "args": ["@playwright/mcp@latest"]
            }
        }
    ]
}
```

Optionally, add a `PROMPT.md` to customize the system prompt.

<Tip>

Don't hesitate to contribute your agent to the community by opening a Pull Request in the [tiny-agents](https://huggingface.co/datasets/tiny-agents/tiny-agents) Hugging Face dataset.
