docs/hub/agents.md (5 additions & 2 deletions)
@@ -12,7 +12,7 @@ To learn more about write actions in code vs JSON, check out our [new short cour
If you want to avoid defining agents yourself, the easiest way to start an agent is through the CLI, using the `smolagent` command.

```bash
smolagent "Plan a trip to Tokyo, Kyoto and Osaka between Mar 28 and Apr 7." --model-type "InferenceClientModel" --model-id "Qwen/Qwen2.5-Coder-32B-Instruct" --imports "pandas numpy" --tools "web_search"
```

Or, you can use any local LLM (for example via LM Studio):

```bash
ENDPOINT_URL=http://localhost:1234/v1 \
MODEL_ID=lmstudio-community/Qwen3-14B-GGUF \
npx @huggingface/mcp-client
```

You can get more information about mcp-client [here](https://huggingface.co/docs/huggingface.js/en/mcp-client/README).
## Gradio MCP Server / Tools
@@ -81,6 +83,7 @@ demo = gr.Interface(
)

demo.launch(mcp_server=True)
```

The MCP server will be available at `http://your-server:port/gradio_api/mcp/sse` where your application is served. It will have a tool corresponding to each function in your Gradio app, with the tool description automatically generated from the docstrings of your functions.
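For illustration, here is a minimal, self-contained sketch of such an app; the `word_count` function, its docstring, and the interface title are hypothetical examples, not taken from the diff above:

```python
# Hypothetical sketch: a small Gradio app exposed as an MCP server.
# Assumes Gradio with MCP support (e.g. `pip install "gradio[mcp]"`).
import gradio as gr

def word_count(text: str) -> int:
    """Count the number of whitespace-separated words in a piece of text."""
    return len(text.split())

demo = gr.Interface(
    fn=word_count,
    inputs="text",
    outputs="number",
    title="Word Counter",
)

# Launching with mcp_server=True serves the MCP endpoint at
# /gradio_api/mcp/sse alongside the regular Gradio UI; the docstring of
# word_count is used as the tool description.
demo.launch(mcp_server=True)
```

An MCP client pointed at the app (by default `http://localhost:7860/gradio_api/mcp/sse`) would then see a `word_count` tool whose description comes from that docstring.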