1 change: 1 addition & 0 deletions docs/hub/_redirects.yml
@@ -22,3 +22,4 @@
datasets-viewer: data-studio
xet: xet/index
storage-backends: xet/index
git-xet: xet/using-xet-storage#git-xet
hf-mcp-server: agents/mcp
**Suggested change** (Member):

- hf-mcp-server: agents/mcp
+ hf-mcp-server: agents-mcp

17 changes: 15 additions & 2 deletions docs/hub/_toctree.yml
@@ -399,6 +399,21 @@
- local: spaces-get-user-plan
title: Get User Plan and Status

- local: agents
title: Agents
isExpanded: true
sections:
- local: agents-overview
title: Agents on the Hub
- local: agents-mcp
title: Hugging Face MCP Server
- local: agents-skills
title: Hugging Face Agent Skills
- local: agents-cli
title: Agents and the `hf` CLI
- local: agents-sdk
title: Building agents with the SDK

- local: jobs
title: Jobs
isExpanded: true
@@ -481,8 +496,6 @@
title: "JFrog"
- local: agents
title: Agents on Hub
- local: hf-mcp-server
title: Hugging Face MCP Server
- local: moderation
title: Moderation
- local: paper-pages
91 changes: 91 additions & 0 deletions docs/hub/agents-cli.md
@@ -0,0 +1,91 @@
# Hugging Face CLI for Agents
**Suggested change** (Member):

- # Hugging Face CLI for Agents
+ # Agents and the `hf` CLI

If we go with what's currently in the toc


Coding agents like Claude Code, OpenAI Codex, or Open Code are excellent at using the `hf` command-line interface to interact with the Hub: they can search for models, datasets, Spaces, and papers, download models, upload files, manage repositories, and run compute jobs.
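
As an illustration, these are the kinds of commands an agent runs for those tasks. A minimal sketch, assuming the `hf` CLI is installed and authenticated; the repository IDs and file names below are placeholders:

```bash
# Download a model snapshot from the Hub
hf download meta-llama/Llama-3.2-1B

# Download a dataset
hf download stanfordnlp/imdb --repo-type dataset

# Upload a local file to one of your repositories
hf upload my-username/my-model ./results.csv results.csv

# Run a small compute job on Hugging Face infrastructure
hf jobs run python:3.12 python -c "print('hello from a job')"
```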

> [!TIP]
> This is a quick guide on agents that use the CLI. For more detailed information, see the [CLI Reference itself](https://huggingface.co/docs/huggingface_hub/guides/cli).

## Skills

Hugging Face Skills are available for the CLI to help your agents interact with the Hub. Skills give agents relevant instructions for how to use the CLI. See the [Skills Guide](./agents-skills) for available skills and usage. That said, most agents can get by using the CLI directly without Skills; in the worst case, they will fall back on documentation and trial and error to get commands right. Skills make your agents more efficient and productive.

## Installation

Make sure the `hf` CLI is installed on your system.

### Standalone Installer (Recommended)

<hfoptions id="cli-install">

<hfoption id="macOS / Linux">

```bash
curl -LsSf https://hf.co/cli/install.sh | bash
```

</hfoption>

<hfoption id="Windows">

```powershell
powershell -ExecutionPolicy ByPass -c "irm https://hf.co/cli/install.ps1 | iex"
```

</hfoption>

</hfoptions>

### Alternative Methods

```bash
# Using pip
pip install -U huggingface_hub

# Using Homebrew (macOS)
brew install huggingface-cli

# Using uvx (no install needed)
uvx hf --help
```

### Verify Installation

```bash
hf --help
```

## Hugging Face Skills for the CLI

Hugging Face Skills are available for the CLI to help you interact with the Hub. Skills give agents relevant instructions for how to use the CLI. See the [Skills Guide](./skills) for available skills and usage.
**Suggested change** (Member):

- Hugging Face Skills are available for the CLI to help you interact with the Hub. Skills give agents relevant instructions for how to use the CLI. See the [Skills Guide](./skills) for available skills and usage.
+ Hugging Face Skills are available for the CLI to help you interact with the Hub. Skills give agents relevant instructions for how to use the CLI. See the [Skills Guide](./agents-skills) for available skills and usage.


```bash
# start claude
claude

# install the skills marketplace plugin
/plugin marketplace add huggingface/skills

# install the hugging face cli skill
/plugin install hugging-face-cli@huggingface/skills
```

With Skills installed, your agent can use the CLI to interact with the Hub.

For example, you could use Claude Code to search for datasets:

```
"What datasets are available for sentiment analysis?"
```

Or, you could use OpenAI Codex to create pull requests:

```
"Open a PR with evaluation results from the results.csv file to my/my-model repo on the Hub."
```
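
Under the hood, the agent turns a prompt like that into CLI calls. A hedged sketch of what it might run for the pull-request example above (the repository ID and file name come from the prompt; the exact flags an agent chooses can vary):

```bash
# Upload results.csv to the repo and open a pull request rather than committing to main
hf upload my/my-model results.csv results.csv --create-pr
```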

## Resources

- [CLI Reference](https://huggingface.co/docs/huggingface_hub/guides/cli) - Complete command documentation
- [Token Settings](https://huggingface.co/settings/tokens) - Manage your tokens
- [Jobs Documentation](https://huggingface.co/docs/huggingface_hub/guides/cli#hf-jobs) - Compute jobs guide

56 changes: 56 additions & 0 deletions docs/hub/agents-mcp.md
@@ -0,0 +1,56 @@
# Hugging Face MCP Server

The Hugging Face MCP (Model Context Protocol) Server connects your MCP‑compatible AI assistant (for example Codex, Cursor, VS Code extensions, Zed, ChatGPT or Claude Desktop) directly to the Hugging Face Hub. Once connected, your assistant can search and explore Hub resources and use community tools, all from within your editor, chat or CLI.

## What you can do

- Search and explore Hub resources: models, datasets, Spaces, and papers.
- Run community tools via MCP‑compatible Gradio apps hosted on [Spaces](https://hf.co/spaces).
- Bring results back into your assistant with metadata, links, and context.

## Get started

1. Open your [MCP settings](https://huggingface.co/settings/mcp) while logged in.

2. Pick your client: select your MCP‑compatible client (for example Cursor, VS Code, Zed, Claude Desktop). The page shows client‑specific instructions and a ready‑to‑copy configuration snippet.

3. Paste and restart: copy the snippet into your client’s MCP configuration, save, and restart/reload the client. You should see “Hugging Face” (or similar) listed as a connected MCP server in your client.

> [!TIP]
> The settings page generates the exact configuration your client expects. Use it rather than writing config by hand.

![MCP Settings Example](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hf-mcp-settings.png)
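
Some clients can also register the server from a terminal instead of a configuration file. For example, with Claude Code:

```bash
# Add the Hugging Face MCP Server to Claude Code over HTTP
claude mcp add hf-mcp-server -t http "https://huggingface.co/mcp?login"
```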

## Using the server

After connecting, ask your assistant to use the Hugging Face tools. Example prompts:

- “Search Hugging Face models for Qwen 3 Quantizations.”
- “Find a Space that can transcribe audio files.”
- “Show datasets about weather time‑series.”
- “Create a 1024 x 1024 image of a cat in Ghibli style.”

Your assistant will call MCP tools exposed by the Hugging Face MCP Server (including Spaces you have selected, as shown in the next section) and return results (titles, owners, downloads, links, and so on). You can then open the resource on the Hub or continue iterating in the same chat.

![HF MCP with Spaces in VS Code](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hf-mcp-vscode.png)

**Contributor:** I'd add docs about experimental options too, and when they're useful, wdyt?

**Collaborator (Author):** Sorry. I don't follow which experimental options?

## Add community tools (Spaces)

You can extend your setup with MCP‑compatible Gradio Spaces built by the community:

- Explore Spaces with MCP support [here](https://huggingface.co/spaces?filter=mcp-server).
- Add the relevant space in your MCP settings on Hugging Face [here](https://huggingface.co/settings/mcp).

Gradio MCP apps expose their functions as tools (with arguments and descriptions) so your assistant can call them directly. Restart or refresh your client so it picks up any new tools you add.

![Adding MCP-compatible Spaces in your MCP settings](https://cdn-uploads.huggingface.co/production/uploads/5f17f0a0925b9863e28ad517/ex9KRpvamn84ZaOlSp_Bj.png)

Check out our dedicated guide for Spaces as MCP servers [here](https://huggingface.co/docs/hub/spaces-mcp-servers#add-an-existing-space-to-your-mcp-tools).
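
If you would rather connect a single Gradio Space directly to your MCP client instead of routing it through your Hugging Face MCP settings, Gradio apps launched with MCP support expose an endpoint under the Space URL. A hedged sketch, assuming a hypothetical Space `my-user/my-audio-tool` and Claude Code as the client:

```bash
# Register one MCP-enabled Gradio Space directly with Claude Code.
# Gradio serves its MCP endpoint at /gradio_api/mcp/sse when launched with mcp_server=True.
claude mcp add my-audio-tool -t sse "https://my-user-my-audio-tool.hf.space/gradio_api/mcp/sse"
```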

## Learn more

- Settings and client setup: https://huggingface.co/settings/mcp
- Changelog announcement: https://huggingface.co/changelog/hf-mcp-server
- Hugging Face MCP Server: https://huggingface.co/mcp
- Build your own MCP Server with Gradio Spaces: https://www.gradio.app/guides/building-mcp-server-with-gradio

184 changes: 184 additions & 0 deletions docs/hub/agents-overview.md
@@ -0,0 +1,184 @@
# Agents on the Hub

Hugging Face provides tools and protocols that connect AI agents directly to the Hub. Whether you're chatting with Claude, building with Codex, or developing custom agents, you can access models, datasets, Spaces, and community tools. This page covers connecting your [chat agents](#chat-with-hugging-face) and [coding agents](#coding-agents) to the Hub.

> [!TIP]
> To build with agents on the Hub, check out the pages on [MCP Server](./agents-mcp), [Skills](./agents-skills), [CLI](./agents-cli), and [SDK](./agents-sdk).

## Chat with Hugging Face

Connect your AI assistant directly to the Hugging Face Hub using the Model Context Protocol (MCP). Once connected, you can search models, explore datasets, generate images, and use community tools—all from within your chat interface.

### Supported Assistants

The HF MCP Server works with any MCP-compatible client:
- **ChatGPT** (via plugins)
- **Claude Desktop**
- **Custom MCP clients**

### Setup

#### 1. Open MCP Settings

![MCP Settings Example](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/agents-docs/mcp-settings.png)

Visit [huggingface.co/settings/mcp](https://huggingface.co/settings/mcp) while logged in.

#### 2. Select Your Client

Choose your MCP-compatible client from the list. The page shows client-specific instructions and a ready-to-copy configuration snippet.

#### 3. Configure and Restart

Copy the configuration snippet into your client's MCP settings, save, and restart your client.

> [!TIP]
> The settings page generates the exact configuration your client expects. Use it rather than writing config by hand.

### What You Can Do

Once connected, ask your assistant to use Hugging Face tools among the ones you selected in your configuration:

| Task | Example Prompt |
| ---- | -------------- |
| Search models | "Find Qwen 3 quantizations on Hugging Face" |
| Explore datasets | "Show datasets about weather time-series" |
| Find Spaces | "Find a Space that can transcribe audio files" |
| Generate images | "Create a 1024x1024 image of a cat in Ghibli style" |
| Search papers | "Find recent papers on vision-language models" |

Your assistant calls MCP tools exposed by the Hugging Face server and returns results with metadata, links, and context.

### Add Community Tools

Extend your setup with MCP-compatible Gradio Spaces:

1. Browse [Spaces with MCP support](https://huggingface.co/spaces?filter=mcp-server)
2. Add them in your [MCP settings](https://huggingface.co/settings/mcp)
3. Restart your client to pick up new tools

Gradio MCP apps expose their functions as tools with arguments and descriptions, so your assistant can call them directly.

### Learn More

- [MCP Server Guide](./agents-mcp) - Detailed setup and configuration
- [HF MCP Settings](https://huggingface.co/settings/mcp) - Configure your client
- [MCP-compatible Spaces](https://huggingface.co/spaces?filter=mcp-server) - Community tools

## Coding Agents

Integrate Hugging Face into your coding workflow with the MCP Server and Skills. Access models, datasets, and ML tools directly from your IDE or coding agent. For example, these coding agents (and more) work with the MCP Server and/or Skills:

| Coding Agent | Integration Method |
| ------------ | ------------------ |
| [Claude Code](https://code.claude.com/docs) | MCP Server + Skills |
| [OpenAI Codex](https://openai.com/codex/) | MCP Server + Skills |
| [Open Code](https://opencode.ai/) | MCP Server + Skills |
| [Cursor](https://www.cursor.com/) | MCP Server |
| [VS Code](https://code.visualstudio.com/) | MCP Server |
| [Gemini CLI](https://geminicli.com/) | MCP Server |
| [Zed](https://zed.dev/) | MCP Server |

### Quick Setup

#### MCP Server

The MCP Server gives your coding agent access to Hub search, Spaces, and community tools.

**Cursor / VS Code / Zed:**

1. Visit [huggingface.co/settings/mcp](https://huggingface.co/settings/mcp)
2. Select your IDE from the list
3. Copy the configuration snippet
4. Add it to your IDE's MCP settings
5. Restart the IDE

**Claude Code:**

```bash
claude mcp add hf-mcp-server -t http "https://huggingface.co/mcp?login"
```

#### Skills

Skills provide task-specific guidance for AI/ML workflows. They work alongside MCP or standalone.

```bash
# start claude
claude

# install the skills marketplace plugin
/plugin marketplace add huggingface/skills
```

Then install a specific Skill, for example the Hugging Face CLI skill:
```bash
/plugin install hugging-face-cli@huggingface/skills
```

See the [Skills Guide](./agents-skills) for available skills and usage.

### What You Can Do

Once configured, your coding agent can:

| Capability | Example |
| ---------- | ------- |
| Search the Hub | "Find a code generation model under 7B parameters" |
| Generate images | "Create a diagram of a transformer architecture" |
| Explore datasets | "What datasets are available for sentiment analysis?" |
| Run Spaces | "Use the Whisper Space to transcribe this audio file" |
| Get documentation | "How do I fine-tune a model with transformers?" |

### Environment Configuration

#### Authentication

Set your Hugging Face token as an environment variable:

```bash
export HF_TOKEN="hf_..."
```

Or authenticate via the [CLI](./agents-cli):

```bash
hf auth login
```
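
To confirm the agent's environment is authenticated, you can check which account the current token belongs to (assuming a recent `hf` CLI):

```bash
# Prints the logged-in account, or an error if no valid token is found
hf auth whoami
```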

#### Adding Community Tools

Extend your setup with MCP-compatible Gradio Spaces:

1. Browse [Spaces with MCP support](https://huggingface.co/spaces?filter=mcp-server)
2. Add them in your [MCP settings](https://huggingface.co/settings/mcp)
3. Restart your IDE

### Example Workflow

```text
You: Find a text classification model that works well on short texts

Agent: [Searches Hugging Face Hub]
Found several options:
- distilbert-base-uncased-finetuned-sst-2-english (sentiment)
- facebook/bart-large-mnli (zero-shot)
...

You: Show me how to use the first one

Agent: [Fetches documentation]
Here's how to use it with transformers:

from transformers import pipeline
classifier = pipeline("sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english")
result = classifier("I love this product!")
```

## Next Steps

- [MCP Server](./agents-mcp) - Connect any MCP-compatible AI assistant to the Hub
- [Skills](./agents-skills) - Pre-built capabilities for coding agents
- [CLI](./agents-cli) - Command-line interface for Hub operations
- [SDK](./agents-sdk) - Python and JavaScript libraries for building agents