
Commit f60a851

Deprecate old agents in favor of mcp-client (huggingface#1447)

Authored by julien-c and Wauplin.

I've also run this:

```bash
npm deprecate @huggingface/agents@"*" "This package is no longer maintained. Please use @huggingface/mcp-client instead."
```

leading to this: <img width="1801" alt="image" src="https://github.com/user-attachments/assets/c8b11a40-d508-45d2-9c44-b261faa05795" /> (https://www.npmjs.com/package/@huggingface/agents)

---------

Co-authored-by: Lucain <[email protected]>

1 parent fd649bd · commit f60a851
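The `npm deprecate` command above attaches the message to the registry metadata of every published version; anyone can read it back afterwards with `npm view`. A read-only sketch of that check (assumes network access to the npm registry; the fallback `echo` is just a guard for offline environments):

```shell
# Read back the deprecation notice that `npm deprecate` attached to the
# package's registry metadata (read-only; needs network access).
npm view @huggingface/agents deprecated || echo "npm or registry unavailable"
```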

File tree

2 files changed: +30 −21 lines changed


README.md

Lines changed: 26 additions & 21 deletions
````diff
@@ -57,7 +57,7 @@ This is a collection of JS libraries to interact with the Hugging Face API, with
 - [@huggingface/inference](packages/inference/README.md): Use all supported (serverless) Inference Providers or switch to Inference Endpoints (dedicated) to make calls to 100,000+ Machine Learning models
 - [@huggingface/hub](packages/hub/README.md): Interact with huggingface.co to create or delete repos and commit / download files
-- [@huggingface/agents](packages/agents/README.md): Interact with HF models through a natural language interface
+- [@huggingface/mcp-client](packages/mcp-client/README.md): A Model Context Protocol (MCP) client, and a tiny Agent library, built on top of InferenceClient.
 - [@huggingface/gguf](packages/gguf/README.md): A GGUF parser that works on remotely hosted files.
 - [@huggingface/dduf](packages/dduf/README.md): Similar package for DDUF (DDUF Diffusers Unified Format)
 - [@huggingface/tasks](packages/tasks/README.md): The definition files and source-of-truth for the Hub's main primitives like pipeline tasks, model libraries, etc.
@@ -79,15 +79,15 @@ To install via NPM, you can download the libraries as needed:
 ```bash
 npm install @huggingface/inference
 npm install @huggingface/hub
-npm install @huggingface/agents
+npm install @huggingface/mcp-client
 ```

 Then import the libraries in your code:

 ```ts
 import { InferenceClient } from "@huggingface/inference";
-import { HfAgent } from "@huggingface/agents";
 import { createRepo, commit, deleteRepo, listFiles } from "@huggingface/hub";
+import { McpClient } from "@huggingface/mcp-client";
 import type { RepoId } from "@huggingface/hub";
 ```

@@ -107,12 +107,10 @@ You can run our packages with vanilla JS, without any bundler, by using a CDN or
 ```ts
 // esm.sh
 import { InferenceClient } from "https://esm.sh/@huggingface/inference"
-import { HfAgent } from "https://esm.sh/@huggingface/agents";

 import { createRepo, commit, deleteRepo, listFiles } from "https://esm.sh/@huggingface/hub"
 // or npm:
 import { InferenceClient } from "npm:@huggingface/inference"
-import { HfAgent } from "npm:@huggingface/agents";

 import { createRepo, commit, deleteRepo, listFiles } from "npm:@huggingface/hub"
 ```
@@ -223,29 +221,36 @@ await deleteFiles({
 });
 ```

-### @huggingface/agents example
+### @huggingface/mcp-client example

 ```ts
-import { HfAgent, LLMFromHub, defaultTools } from '@huggingface/agents';
+import { Agent } from '@huggingface/mcp-client';

 const HF_TOKEN = "hf_...";

-const agent = new HfAgent(
-  HF_TOKEN,
-  LLMFromHub(HF_TOKEN),
-  [...defaultTools]
-);
+const agent = new Agent({
+  provider: "auto",
+  model: "Qwen/Qwen2.5-72B-Instruct",
+  apiKey: HF_TOKEN,
+  servers: [
+    {
+      // Playwright MCP
+      command: "npx",
+      args: ["@playwright/mcp@latest"],
+    },
+  ],
+});

-// you can generate the code, inspect it and then run it
-const code = await agent.generateCode("Draw a picture of a cat wearing a top hat. Then caption the picture and read it out loud.");
-console.log(code);
-const messages = await agent.evaluateCode(code)
-console.log(messages); // contains the data

-// or you can run the code directly, however you can't check that the code is safe to execute this way, use at your own risk.
-const messages = await agent.run("Draw a picture of a cat wearing a top hat. Then caption the picture and read it out loud.")
-console.log(messages);
+await agent.loadTools();
+for await (const chunk of agent.run("What are the top 5 trending models on Hugging Face?")) {
+  if ("choices" in chunk) {
+    const delta = chunk.choices[0]?.delta;
+    if (delta.content) {
+      console.log(delta.content);
+    }
+  }
+}
 ```

 There are more features of course, check each library's README!
````
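In the new README example, `agent.run()` yields a stream in which chat-completion chunks are told apart from tool events by the `"choices" in chunk` guard. A standalone sketch of that filtering logic over mocked chunks (the `CompletionChunk`/`ToolEvent` shapes here are simplified assumptions for illustration, not the library's actual type definitions):

```typescript
// Simplified stand-ins for the streamed chunk shapes (an assumption for this
// sketch, not the library's actual types): completion chunks carry `choices`,
// while tool-related events do not.
type CompletionChunk = { choices: { delta: { content?: string } }[] };
type ToolEvent = { type: string; name: string };
type Chunk = CompletionChunk | ToolEvent;

// Mirror the README's `"choices" in chunk` guard: keep only the text deltas.
function collectText(chunks: Chunk[]): string {
  let out = "";
  for (const chunk of chunks) {
    if ("choices" in chunk) {
      const delta = chunk.choices[0]?.delta;
      if (delta?.content) {
        out += delta.content;
      }
    }
  }
  return out;
}

const mockStream: Chunk[] = [
  { type: "tool_call", name: "model_search" }, // skipped by the guard
  { choices: [{ delta: { content: "Top trending " } }] },
  { choices: [{ delta: { content: "models: ..." } }] },
];

console.log(collectText(mockStream)); // "Top trending models: ..."
```

The `in` operator narrows the union type, so inside the guard the chunk is safely treated as a completion chunk.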

packages/agents/README.md

Lines changed: 4 additions & 0 deletions
````diff
@@ -2,6 +2,10 @@

 A way to call Hugging Face models and Inference Endpoints from natural language, using an LLM.

+> [!WARNING]
+> `@huggingface/agents` is now deprecated, and a modern version, built on top of MCP, is [Tiny Agents](https://github.com/huggingface/huggingface.js/tree/main/packages/mcp-client).
+> Go checkout the `Tiny Agents` introduction blog [here](https://huggingface.co/blog/tiny-agents).
+
 ## Install

 ```console
````
