# Agents on the Hub

This page compiles all the libraries and tools Hugging Face offers for agentic workflows:
- `HF MCP Server`: Connect your MCP-compatible AI assistant directly to the Hugging Face Hub.
- `tiny-agents`: A lightweight toolkit for MCP-powered agents, available in both JS (`@huggingface/tiny-agents`) and Python (`huggingface_hub`).
- `Gradio MCP Server`: Easily create MCP servers from Gradio apps and Spaces.
- `smolagents`: a Python library that enables you to run powerful agents in a few lines of code.

## HF MCP Server

The official **Hugging Face MCP (Model Context Protocol) Server** enables seamless integration between the Hugging Face Hub and any MCP-compatible AI assistant, including VS Code, Cursor, and Claude Desktop.

With the HF MCP Server, you can enhance your AI assistant's capabilities by connecting directly to the Hub's ecosystem. It comes with:
- a curated set of **built-in tools**, such as Spaces and Papers semantic search and Model and Dataset exploration
- **MCP-compatible Gradio apps**: Connect to any [MCP-compatible Gradio app](https://huggingface.co/spaces?filter=mcp-server) built by the Hugging Face community

### Getting Started

Visit [huggingface.co/settings/mcp](https://huggingface.co/settings/mcp) to configure your MCP client and get started.
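Most MCP clients read the connection details from a JSON settings file. The exact snippet and supported fields vary by client, and the settings page above generates the correct one for you; as an illustrative sketch only (the server name, the `headers` field, and the token placeholder are assumptions), a remote-server entry typically looks like:

```json
{
  "mcpServers": {
    "hf-mcp-server": {
      "url": "https://huggingface.co/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_HF_TOKEN>"
      }
    }
  }
}
```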

<Tip warning={true}>

This feature is experimental ⚗️ and will continue to evolve.

</Tip>

## smolagents

[smolagents](https://github.com/huggingface/smolagents) is a lightweight library that covers all agentic use cases, from code-writing agents to computer use, in a few lines of code. It is model-agnostic, supporting local models served with Hugging Face Transformers, models offered through [Inference Providers](../inference-providers/index.md), and proprietary model providers.
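As a minimal sketch of those "few lines of code" (assuming `smolagents` is installed and a Hugging Face token is available in your environment; the model id and the question are illustrative):

```python
from smolagents import CodeAgent, DuckDuckGoSearchTool, InferenceClientModel

# A model served through Inference Providers (model id is illustrative)
model = InferenceClientModel(model_id="Qwen/Qwen2.5-Coder-32B-Instruct")

# A code-writing agent equipped with a single web-search tool
agent = CodeAgent(tools=[DuckDuckGoSearchTool()], model=model)

# The agent writes and executes Python code to answer the question
agent.run("How many seconds would it take a leopard at full speed to run through Pont des Arts?")
```

Swapping `InferenceClientModel` for a local Transformers-backed model is how the same agent runs fully on your own hardware.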