units/en/unit2/gradio-client.mdx
tools = mcp_client.get_tools()
```
Now that we have the tools, we can create a simple agent that uses them to answer questions. For now, we'll use a simple `InferenceClientModel` with the default model from `smolagents`.

It is important to pass your `api_key` to the `InferenceClientModel`. You can generate a token from your Hugging Face account; see the [security tokens guide](https://huggingface.co/docs/hub/en/security-tokens).
```python
model = InferenceClientModel(api_key=YOUR_API_TOKEN)

agent = CodeAgent(tools=[*tools], model=model)
```
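Hardcoding the token in the snippet above is fine for a quick test, but a safer pattern is to read it from an environment variable so it never lands in source control. A minimal sketch, assuming the token is stored in a variable named `HF_TOKEN` (the name is an assumption; any name works):

```python
import os

def get_hf_token() -> str:
    """Read the Hugging Face token from the environment.

    HF_TOKEN is an assumed variable name; export it before launching
    the app, e.g. `export HF_TOKEN=hf_...`.
    """
    token = os.environ.get("HF_TOKEN")
    if not token:
        raise RuntimeError(
            "HF_TOKEN is not set; create a token in your Hugging Face account settings."
        )
    return token

# The model would then be constructed as:
# model = InferenceClientModel(api_key=get_hf_token())
```

This keeps the token out of the code itself and lets each deployment supply its own credentials.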
```python
demo = gr.ChatInterface(
    # ...
    type="messages",
    examples=["Prime factorization of 68"],
    title="Agent with MCP Tools",
    description="This is a simple agent that uses MCP tools to answer questions."
)
```