units/en/unit2/tiny-agents.mdx
20 additions & 49 deletions
@@ -17,52 +17,45 @@ Some MCP Clients, notably Claude Desktop, do not yet support SSE-based MCP Serve
 
 </Tip>
 
-<hfoptions id="tiny-agents">
-<hfoption id="typescript">
+Tiny Agents can run MCP servers with a command line environment. To do this, we will need to install `npm` and run the server with `npx`. **We'll need these for both Python and JavaScript.**
 
-First, we need to install the `tiny-agents` package.
+Let's install `npx` with `npm`. If you don't have `npm` installed, check out the [npm documentation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
 
 ```bash
-npm install @huggingface/tiny-agents
-# or
-pnpm add @huggingface/tiny-agents
+# install npx
+npm install -g npx
 ```
 
 Then, we need to install the `mcp-remote` package.
 
 ```bash
 npm i mcp-remote
-# or
-pnpm add mcp-remote
 ```
 
-</hfoption>
-<hfoption id="python">
+<hfoptions id="tiny-agents">
+<hfoption id="typescript">
 
-First, you need to install the latest version of `huggingface_hub` with the `mcp` extra to get all the necessary components.
+For JavaScript, we need to install the `tiny-agents` package.
 
 ```bash
-pip install "huggingface_hub[mcp]>=0.32.0"
+npm install @huggingface/tiny-agents
 ```
 
-Then, we need to install the `mcp-remote` package.
-
-```bash
-npm install mcp-remote
-```
+</hfoption>
+<hfoption id="python">
 
-And we'll need to install `npx` to run the `mcp-remote` command.
+For Python, you need to install the latest version of `huggingface_hub` with the `mcp` extra to get all the necessary components.
 
 ```bash
-npm install -g npx
+pip install "huggingface_hub[mcp]>=0.32.0"
 ```
 
 </hfoption>
 </hfoptions>
 
 ## Tiny Agents MCP Client in the Command Line
 
-Tiny Agents can create MCP clients from the command line based on JSON configuration files.
+Let's repeat the example from [Unit 1](../unit1/mcp-clients.mdx) to create a basic Tiny Agent. Tiny Agents can create MCP clients from the command line based on JSON configuration files.
 
 <hfoptions id="tiny-agents">
 <hfoption id="typescript">
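The installation steps on the new side of this hunk assume `npm` and `npx` are available on your PATH before the `mcp-remote` and `tiny-agents` installs. A minimal sanity-check sketch (the helper name `missing_tools` is ours, not part of any package):

```python
import shutil


def missing_tools(tools):
    """Return the subset of command-line tools that are not found on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]


# The instructions in this section rely on these commands being available.
required = ["npm", "npx"]
missing = missing_tools(required)
if missing:
    print(f"Please install: {', '.join(missing)}")
```

Running this before following the install commands surfaces a missing Node.js toolchain early instead of failing mid-tutorial.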
@@ -87,7 +80,7 @@ The JSON file will look like this:
         "command": "npx",
         "args": [
           "mcp-remote",
-          "http://localhost:7860/gradio_api/mcp/sse"
+          "http://localhost:7860/gradio_api/mcp/sse" // This is the MCP Server we created in the previous section
         ]
       }
     }
@@ -109,6 +102,7 @@ Let's setup a project with a basic Tiny Agent.
 ```bash
 mkdir my-agent
 touch my-agent/agent.json
+cd my-agent
 ```
 
 The JSON file will look like this:
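The project-setup commands in this hunk can also be scripted. A sketch in Python, assuming the same `my-agent` directory and `agent.json` file names used above (the placeholder config fields mirror the JSON shown in this section; `init_agent_project` is our own helper):

```python
import json
from pathlib import Path


def init_agent_project(root: str) -> Path:
    """Create the agent directory with a placeholder agent.json, like
    `mkdir my-agent && touch my-agent/agent.json` in the tutorial."""
    project = Path(root)
    project.mkdir(parents=True, exist_ok=True)
    config_path = project / "agent.json"
    # Placeholder values; fill these in as shown in the JSON example.
    placeholder = {"model": "", "provider": "", "servers": []}
    config_path.write_text(json.dumps(placeholder, indent=2))
    return config_path


config_path = init_agent_project("my-agent")
print(f"Wrote {config_path}")
```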
@@ -135,7 +129,7 @@ The JSON file will look like this:
 We can then run the agent with the following command:
 
 ```bash
-tiny-agents run ./my-agent
+tiny-agents run agent.json
 ```
 
 </hfoption>
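The hunk above changes the argument from a directory to the config file itself. If you launch the agent from a script rather than a shell, a small wrapper keeps that invocation in one place. A sketch, assuming the `tiny-agents` CLI installed earlier is on PATH (`build_run_command` and `run_agent` are our own helpers):

```python
import subprocess


def build_run_command(config: str = "agent.json") -> list[str]:
    """Build the CLI invocation used in this section: tiny-agents run <config>."""
    return ["tiny-agents", "run", config]


def run_agent(config: str = "agent.json") -> None:
    """Launch the agent; requires the tiny-agents CLI to be installed."""
    subprocess.run(build_run_command(config), check=True)
```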
@@ -149,10 +143,9 @@ Here we have a basic Tiny Agent that can connect to our Gradio MCP server. It in
 | `provider` | The inference provider to use for the agent |
 | `servers` | The servers to use for the agent. We'll use the `mcp-remote` server for our Gradio MCP server. |
 
-We could also use an open source model running locally with Tiny Agents.
+<Tip>
 
-<hfoptions id="tiny-agents">
-<hfoption id="typescript">
+We could also use an open source model running locally with Tiny Agents. If we start a local inference server, we can point the agent at it with a configuration like this:
 
 ```json
 {
@@ -173,33 +166,11 @@ We could also use an open source model running locally with Tiny Agents.
 }
 ```
 
-</hfoption>
-<hfoption id="python">
-
-```json
-{
-  "model": "Qwen/Qwen3-32B",
-  "endpoint_url": "http://localhost:1234/v1",
-  "servers": [
-    {
-      "type": "stdio",
-      "config": {
-        "command": "npx",
-        "args": [
-          "mcp-remote",
-          "http://localhost:1234/v1/mcp/sse"
-        ]
-      }
-    }
-  ]
-}
-```
-
-</hfoption>
-</hfoptions>
 
 Here we have a Tiny Agent that can connect to a local model. It includes a model, endpoint URL (`http://localhost:1234/v1`), and a server configuration. The endpoint should be an OpenAI-compatible endpoint.
 
+</Tip>
+
 ## Custom Tiny Agents MCP Client
 
 Now that we understand both Tiny Agents and Gradio MCP servers, let's see how they work together! The beauty of MCP is that it provides a standardized way for agents to interact with any MCP-compatible server, including our Gradio-based sentiment analysis server from earlier sections.
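The local-model configuration from the hunk above can also be assembled programmatically, which makes it easy to swap the model or endpoint. A sketch that builds the same fields (`model`, `endpoint_url`, `servers`) shown in this section; the model name and URLs are the example values from the diff, not requirements, and `local_agent_config` is our own helper:

```python
import json


def local_agent_config(model: str, endpoint_url: str, mcp_sse_url: str) -> dict:
    """Build a Tiny Agents config pointing at a local OpenAI-compatible endpoint,
    with an mcp-remote stdio server bridging to an SSE MCP server."""
    return {
        "model": model,
        "endpoint_url": endpoint_url,
        "servers": [
            {
                "type": "stdio",
                "config": {
                    "command": "npx",
                    "args": ["mcp-remote", mcp_sse_url],
                },
            }
        ],
    }


config = local_agent_config(
    "Qwen/Qwen3-32B",
    "http://localhost:1234/v1",
    "http://localhost:7860/gradio_api/mcp/sse",
)
print(json.dumps(config, indent=2))
```

Writing the resulting dict to `agent.json` with `json.dump` produces a file equivalent to the hand-written example.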