
This page compiles all the libraries and tools Hugging Face offers for agentic workflows: smolagents, the huggingface.js mcp-client, and the Gradio MCP Server.

## smolagents

[smolagents](https://github.com/huggingface/smolagents) is a lightweight library that covers all agentic use cases, from code-writing agents to computer use, in a few lines of code. It is model-agnostic, supporting local models served with Hugging Face Transformers as well as models offered through [Inference Providers](../inference-providers/index.md) and proprietary model providers.

It offers a unique kind of agent: `CodeAgent`, an agent that writes its actions in Python code.
It also supports the standard agent that writes actions in JSON blobs, as most other agentic frameworks do, called `ToolCallingAgent`.
To learn more about writing actions in code vs. JSON, check out our [short course on DeepLearning.AI](https://www.deeplearning.ai/short-courses/building-code-agents-with-hugging-face-smolagents/).
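
To make the distinction concrete, here is a minimal plain-Python sketch of the same tool call expressed once as a JSON blob and once as a code action. This is not the smolagents API: the tool registry and dispatch logic below are illustrative only.

```python
import json

def web_search(query: str) -> str:
    # Stand-in tool; a real agent would call an actual search backend.
    return f"results for {query!r}"

TOOLS = {"web_search": web_search}

# JSON-blob action (ToolCallingAgent style): the model emits a structured
# tool call that the framework parses and dispatches.
json_action = '{"tool": "web_search", "arguments": {"query": "Kyoto temples"}}'
call = json.loads(json_action)
json_result = TOOLS[call["tool"]](**call["arguments"])

# Code action (CodeAgent style): the model emits Python that is executed
# with the tools in scope, so calls can be composed and results reused.
code_action = 'result = web_search("Kyoto temples")'
namespace = dict(TOOLS)
exec(code_action, namespace)
code_result = namespace["result"]
```

Both routes reach the same tool; the code form shines when an action needs loops, branching, or intermediate variables that a single JSON blob cannot express.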

If you want to avoid defining agents yourself, the easiest way to start an agent is through the CLI, using the `smolagent` command.

```bash
smolagent "Plan a trip to Tokyo, Kyoto and Osaka between Mar 28 and Apr 7." \
--model-type "InferenceClientModel" \
--model-id "Qwen/Qwen2.5-Coder-32B-Instruct" \
--imports "pandas numpy" \
--tools "web_search"
```

Agents can be pushed to the Hugging Face Hub as Spaces. Check out all the cool agents people have built [here](https://huggingface.co/spaces?filter=smolagents&sort=likes).

smolagents also supports MCP servers as tools, as follows:

```python
# pip install --upgrade smolagents mcp
import os

from mcp import StdioServerParameters
from smolagents import CodeAgent, InferenceClientModel, MCPClient

server_parameters = StdioServerParameters(
    command="uvx",  # using uvx ensures the server's dependencies are available
    args=["--quiet", "[email protected]"],
    env={"UV_PYTHON": "3.12", **os.environ},
)

model = InferenceClientModel()

with MCPClient(server_parameters) as tools:
    agent = CodeAgent(tools=tools, model=model, add_base_tools=True)
    agent.run("Please find the latest research on COVID-19 treatment.")
```

Learn more [in the documentation](https://huggingface.co/docs/smolagents/tutorials/tools#use-mcp-tools-with-mcpclient-directly).
## huggingface.js mcp-client

Huggingface.js offers an MCP client that can be backed by [Inference Providers](https://huggingface.co/docs/inference-providers/en/index) or local LLMs. Getting started is as simple as running `pnpm agent`. You can plug and play different models and providers by setting the `PROVIDER` and `MODEL_ID` environment variables.
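
For instance, with a hypothetical provider/model pairing (substitute any combination supported by Inference Providers):

```shell
# Choose a provider/model pairing (the values here are illustrative).
export PROVIDER="nebius"
export MODEL_ID="Qwen/Qwen2.5-72B-Instruct"

# Then start the agent:
# pnpm agent
```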
Lastly, add this to the settings of the MCP Client of your choice (e.g. Cursor).

This is very powerful because it lets the LLM use any Gradio application as a tool. You can find thousands of them on [Spaces](https://huggingface.co/spaces). Learn more [here](https://www.gradio.app/guides/building-mcp-server-with-gradio).