
Commit f6aa9ab

Palashiolnhsingh and Lauren Hirata Singh authored
docs: add mcp (#151)
Co-authored-by: Lauren Hirata Singh <[email protected]>
1 parent 46b69ae commit f6aa9ab

File tree

2 files changed: +63 -1 lines changed


src/labs/deep-agents/built-in-components.mdx

Lines changed: 32 additions & 0 deletions
@@ -58,3 +58,35 @@ Built-in support for calling specialized sub-agents with the following:
- **Tool access control**: Subagents can have different tool sets

A general-purpose sub-agent with the same instructions and tools as the main agent is always available.

## MCP

The `deepagents` library can be run with MCP tools. This can be achieved by using the [LangChain MCP Adapters library](https://github.com/langchain-ai/langchain-mcp-adapters).

For example:

```python
# pip install langchain-mcp-adapters

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from deepagents import create_deep_agent


async def main():
    # Collect MCP tools
    mcp_client = MultiServerMCPClient(...)
    mcp_tools = await mcp_client.get_tools()

    # Create agent
    agent = create_deep_agent(tools=mcp_tools, ...)

    # Stream the agent
    async for chunk in agent.astream(
        {"messages": [{"role": "user", "content": "what is langgraph?"}]},
        stream_mode="values"
    ):
        if "messages" in chunk:
            chunk["messages"][-1].pretty_print()


asyncio.run(main())
```
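The `...` passed to `MultiServerMCPClient` above stands in for your MCP server configuration. As a rough sketch only (the server names, the script path, and the URL below are placeholders, and the exact config shape can vary across `langchain-mcp-adapters` versions), the client might be configured like this:

```python
# Hypothetical configuration for the `...` above. The server names, script
# path, and URL are placeholders; adjust them to point at your own MCP servers.
mcp_client = MultiServerMCPClient(
    {
        "math": {
            "command": "python",
            "args": ["/path/to/math_server.py"],  # placeholder path
            "transport": "stdio",
        },
        "weather": {
            "url": "http://localhost:8000/mcp/",  # placeholder URL
            "transport": "streamable_http",
        },
    }
)
```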

src/labs/deep-agents/configuration-options.mdx

Lines changed: 31 additions & 1 deletion
@@ -89,4 +89,34 @@ const agent = createDeepAgent({
});
```

</CodeGroup>

### Custom models

<Note>
Custom models are only supported in the Python implementation.
</Note>

By default, `deepagents` uses `"claude-sonnet-4-20250514"`. You can customize this by passing any [LangChain model object](https://python.langchain.com/docs/integrations/chat/).

For example, use OpenAI's `gpt-oss` model via Ollama:

```python
# pip install langchain langchain-ollama

from langchain.chat_models import init_chat_model
from deepagents import create_deep_agent

# ... existing agent definitions ...

model = init_chat_model(
    model="ollama:gpt-oss:20b",
)
agent = create_deep_agent(
    tools=tools,
    instructions=instructions,
    model=model,
    ...
)
```
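Because `model` accepts any LangChain chat model object, the instance can also be constructed directly instead of going through `init_chat_model`. A minimal sketch, assuming `langchain-ollama` is installed and the same `tools` and `instructions` as in the example above:

```python
# pip install langchain-ollama
from langchain_ollama import ChatOllama

from deepagents import create_deep_agent

# Build the chat model object directly; the model name is a placeholder.
model = ChatOllama(model="gpt-oss:20b")

agent = create_deep_agent(
    tools=tools,                # assumed to be defined elsewhere
    instructions=instructions,  # assumed to be defined elsewhere
    model=model,
)
```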
