Merged
42 changes: 21 additions & 21 deletions README.md
@@ -57,7 +57,7 @@ This is a collection of JS libraries to interact with the Hugging Face API, with

- [@huggingface/inference](packages/inference/README.md): Use all supported (serverless) Inference Providers or switch to Inference Endpoints (dedicated) to make calls to 100,000+ Machine Learning models
- [@huggingface/hub](packages/hub/README.md): Interact with huggingface.co to create or delete repos and commit / download files
-- [@huggingface/agents](packages/agents/README.md): Interact with HF models through a natural language interface
+- [@huggingface/mcp-client](packages/mcp-client/README.md): A Model Context Protocol (MCP) client, and a tiny Agent library, built on top of InferenceClient.
- [@huggingface/gguf](packages/gguf/README.md): A GGUF parser that works on remotely hosted files.
- [@huggingface/dduf](packages/dduf/README.md): Similar package for DDUF (DDUF Diffusers Unified Format)
- [@huggingface/tasks](packages/tasks/README.md): The definition files and source-of-truth for the Hub's main primitives like pipeline tasks, model libraries, etc.
@@ -79,15 +79,15 @@ To install via NPM, you can download the libraries as needed:
```bash
npm install @huggingface/inference
npm install @huggingface/hub
-npm install @huggingface/agents
+npm install @huggingface/mcp-client
```

Then import the libraries in your code:

```ts
import { InferenceClient } from "@huggingface/inference";
-import { HfAgent } from "@huggingface/agents";
import { createRepo, commit, deleteRepo, listFiles } from "@huggingface/hub";
+import { McpClient } from "@huggingface/mcp-client";
import type { RepoId } from "@huggingface/hub";
```
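The hub imports above include the type-only `import type { RepoId }`. For illustration only, here is a minimal sketch of the repo-identifier shape these helpers are assumed to work with — the local `RepoId` type and the `formatRepoUrl` helper are simplified stand-ins written for this example, not the package's actual definitions:

```typescript
// Simplified stand-in for the repo-identifier shape assumed by the hub
// helpers; the real RepoId type is exported by @huggingface/hub.
type RepoType = "model" | "dataset" | "space";

interface RepoId {
  name: string; // e.g. "my-user/my-model"
  type: RepoType;
}

// Hypothetical helper: models live at the Hub root, while datasets and
// spaces are namespaced under a path prefix.
function formatRepoUrl(repo: RepoId): string {
  const prefix = repo.type === "model" ? "" : `${repo.type}s/`;
  return `https://huggingface.co/${prefix}${repo.name}`;
}

console.log(formatRepoUrl({ name: "my-user/my-model", type: "model" }));
// → https://huggingface.co/my-user/my-model
console.log(formatRepoUrl({ name: "my-user/my-data", type: "dataset" }));
// → https://huggingface.co/datasets/my-user/my-data
```

Type-only imports like `RepoId` are erased at compile time, so they add nothing to the runtime bundle.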

@@ -107,12 +107,10 @@ You can run our packages with vanilla JS, without any bundler, by using a CDN or
```ts
// esm.sh
import { InferenceClient } from "https://esm.sh/@huggingface/inference"
-import { HfAgent } from "https://esm.sh/@huggingface/agents";

import { createRepo, commit, deleteRepo, listFiles } from "https://esm.sh/@huggingface/hub"
// or npm:
import { InferenceClient } from "npm:@huggingface/inference"
-import { HfAgent } from "npm:@huggingface/agents";

import { createRepo, commit, deleteRepo, listFiles } from "npm:@huggingface/hub"
```
@@ -223,29 +221,31 @@ await deleteFiles({
});
```

-### @huggingface/agents example
+### @huggingface/mcp-client example

```ts
-import { HfAgent, LLMFromHub, defaultTools } from '@huggingface/agents';
+import { Agent } from '@huggingface/mcp-client';

const HF_TOKEN = "hf_...";

-const agent = new HfAgent(
-  HF_TOKEN,
-  LLMFromHub(HF_TOKEN),
-  [...defaultTools]
-);

+const agent = new Agent({
+  provider: "auto",
+  model: "Qwen/Qwen2.5-72B-Instruct",
+  apiKey: HF_TOKEN,
+  servers: [
+    {
+      // Playwright MCP
+      command: "npx",
+      args: ["@playwright/mcp@latest"],
+    },
+  ],
+});

-// you can generate the code, inspect it and then run it
-const code = await agent.generateCode("Draw a picture of a cat wearing a top hat. Then caption the picture and read it out loud.");
-console.log(code);
-const messages = await agent.evaluateCode(code);
-console.log(messages); // contains the data
-
-// or you can run the code directly, however you can't check that the code is safe to execute this way, use at your own risk.
-const messages = await agent.run("Draw a picture of a cat wearing a top hat. Then caption the picture and read it out loud.");
-console.log(messages);
+await agent.loadTools();
+for await (const chunk of agent.run("Draw a picture of a cat wearing a top hat. Then caption the picture and read it out loud.")) {
+  console.log(chunk);
+}
```
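The new example above consumes `agent.run(...)` as a stream of chunks with `for await`. That streaming-consumer pattern can be sketched in isolation with a mock async generator — `mockRun` below is a hypothetical stand-in for the real agent and does no network or model calls:

```typescript
// Minimal sketch of consuming a streaming async iterable, the same shape
// as iterating Agent.run() with for await. mockRun is a hypothetical
// stand-in that yields text chunks the way a streaming agent loop might.
async function* mockRun(): AsyncGenerator<string> {
  for (const piece of ["Drawing", " the", " cat..."]) {
    yield piece;
  }
}

async function main(): Promise<string> {
  let output = "";
  for await (const chunk of mockRun()) {
    output += chunk; // accumulate streamed chunks as they arrive
  }
  return output;
}

main().then((out) => console.log(out)); // prints "Drawing the cat..."
```

Because `run` yields chunks incrementally, a caller can render partial output (e.g. to a terminal or UI) long before the agent finishes its full tool-calling loop.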

There are more features of course, check each library's README!
4 changes: 4 additions & 0 deletions packages/agents/README.md
@@ -2,6 +2,10 @@

A way to call Hugging Face models and Inference Endpoints from natural language, using an LLM.

+> [!WARNING]
+> `@huggingface/agents` is now deprecated; its modern replacement, built on top of MCP, is [Tiny Agents](https://github.com/huggingface/huggingface.js/tree/main/packages/mcp-client).
+> Check out the `Tiny Agents` introduction blog [here](https://huggingface.co/blog/tiny-agents).

## Install

```console