
Commit c631e6c

Agents landing page
1 parent 64ba356 commit c631e6c

File tree

2 files changed: +66 -0 lines changed


docs/hub/_toctree.yml

Lines changed: 2 additions & 0 deletions
@@ -63,6 +63,8 @@
   sections:
   - local: adapters
     title: Adapters
+  - local: agents
+    title: Agents on Hub
   - local: allennlp
     title: AllenNLP
   - local: bertopic

docs/hub/agents.md

Lines changed: 64 additions & 0 deletions
@@ -0,0 +1,64 @@
# Agents on Hub

This page compiles all the libraries and tools Hugging Face offers for agentic workflows.
## Smolagents

[Smolagents](https://github.com/huggingface/smolagents) is a lightweight library that covers agentic use cases, from code-writing agents to computer use, in a few lines of code. It is model-agnostic, supporting local models served with Hugging Face Transformers, models offered through [Inference Providers](../inference-providers/index.md), and proprietary model providers.
It offers three agent classes based on the ReAct framework: `CodeAgent` for agents that write their own code, `ToolCallingAgent` for agents that call tools, and `MultiStepAgent`, the base class the former two build on for multi-step ReAct workflows.
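Defining an agent in Python takes only a few lines. Below is a minimal sketch, assuming a recent smolagents release (older releases expose the web search tool as `DuckDuckGoSearchTool` rather than `WebSearchTool`); the model id mirrors the CLI example that follows.

```python
# Minimal sketch: a code-writing agent with a web search tool.
from smolagents import CodeAgent, InferenceClientModel, WebSearchTool

# Serverless inference through Inference Providers; any provider-hosted model id works.
model = InferenceClientModel(model_id="Qwen/Qwen2.5-Coder-32B-Instruct")
agent = CodeAgent(tools=[WebSearchTool()], model=model)

agent.run("Plan a trip to Tokyo, Kyoto and Osaka between Mar 28 and Apr 7.")
```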
If you want to avoid defining agents yourself, the easiest way to start an agent is through the CLI, using the `smolagent` command.
```bash
smolagent "Plan a trip to Tokyo, Kyoto and Osaka between Mar 28 and Apr 7." --model-type "InferenceClientModel" --model-id "Qwen/Qwen2.5-Coder-32B-Instruct" --imports "pandas numpy" --tools "web_search"
```
Agents can be pushed to the Hugging Face Hub as Spaces. Check out all the cool agents people have built [here](https://huggingface.co/spaces?filter=smolagents&sort=likes).
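As a sketch, sharing an agent you defined in Python is a single call, assuming the `push_to_hub` method available in recent smolagents releases; the repo id below is a placeholder.

```python
# Sketch: share the agent defined above ("username/my-agent" is a placeholder repo id).
agent.push_to_hub("username/my-agent")
```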
## huggingface.js mcp-client

Huggingface.js offers an MCP client served with Inference Providers. Getting started with it is as simple as running `pnpm agent`. You can plug and play different models and providers by setting the `PROVIDER` and `MODEL_ID` environment variables.
```bash
export HF_TOKEN="hf_..."
export MODEL_ID="Qwen/Qwen2.5-72B-Instruct"
export PROVIDER="nebius"
pnpm agent
```
You can get more information about the mcp-client [here](https://huggingface.co/docs/huggingface.js/en/mcp-client/README).
## Gradio MCP Server

Gradio applications can be exposed as MCP servers, making them available for LLMs to use as tools.

To turn a Gradio application into an MCP server, simply pass `mcp_server=True` when launching your demo, as follows.
```python
import gradio as gr

def generate_images(prompt):
    # Placeholder: replace with your own image-generation logic
    ...

demo = gr.Interface(
    fn=generate_images,
    inputs="text",
    outputs="image",
    title="Image Generator"
)

demo.launch(mcp_server=True)
```
The MCP server will be available at `http://your-server:port/gradio_api/mcp/sse`, where your application is served. Lastly, add this endpoint to the settings of the MCP client of your choice.
```json
{
  "mcpServers": {
    "gradio": {
      "url": "http://your-server:port/gradio_api/mcp/sse"
    }
  }
}
```
This is very powerful because it lets the LLM use any Gradio application as a tool. You can find a variety of Gradio applications on [Spaces](https://huggingface.co/spaces) and serve them as tools.
