diff --git a/src/labs/deep-agents/built-in-components.mdx b/src/labs/deep-agents/built-in-components.mdx
index 1408dbb4..0b01ca78 100644
--- a/src/labs/deep-agents/built-in-components.mdx
+++ b/src/labs/deep-agents/built-in-components.mdx
@@ -58,3 +58,35 @@ Built-in support for calling specialized sub-agents with the following:
 - **Tool access control**: Subagents can have different tool sets
 
 A general-purpose sub-agent with the same instructions and tools as the main agent is always available.
+
+## MCP
+
+The `deepagents` library can be run with MCP tools by using the [LangChain MCP Adapters library](https://github.com/langchain-ai/langchain-mcp-adapters).
+
+For example:
+
+```python
+# pip install langchain-mcp-adapters
+
+import asyncio
+from langchain_mcp_adapters.client import MultiServerMCPClient
+from deepagents import create_deep_agent
+
+async def main():
+    # Collect the tools exposed by your MCP servers
+    mcp_client = MultiServerMCPClient(...)
+    mcp_tools = await mcp_client.get_tools()
+
+    # Create the agent with the MCP tools
+    agent = create_deep_agent(tools=mcp_tools, ...)
+
+    # Stream the agent
+    async for chunk in agent.astream(
+        {"messages": [{"role": "user", "content": "what is langgraph?"}]},
+        stream_mode="values"
+    ):
+        if "messages" in chunk:
+            chunk["messages"][-1].pretty_print()
+
+asyncio.run(main())
+```
diff --git a/src/labs/deep-agents/configuration-options.mdx b/src/labs/deep-agents/configuration-options.mdx
index 27a20d91..de705a97 100644
--- a/src/labs/deep-agents/configuration-options.mdx
+++ b/src/labs/deep-agents/configuration-options.mdx
@@ -89,4 +89,31 @@ const agent = createDeepAgent({
 });
 ```
 
-
\ No newline at end of file
+
+
+### Custom models
+
+Custom models are only supported in the Python implementation.
+
+By default, `deepagents` uses `"claude-sonnet-4-20250514"`. You can customize this by passing any [LangChain model object](https://python.langchain.com/docs/integrations/chat/).
+
+For example, to use OpenAI's `gpt-oss` model via Ollama:
+
+```python
+# pip install langchain langchain-ollama
+
+from deepagents import create_deep_agent
+from langchain.chat_models import init_chat_model
+
+# ... existing agent definitions ...
+
+model = init_chat_model(
+    model="ollama:gpt-oss:20b",
+)
+agent = create_deep_agent(
+    tools=tools,
+    instructions=instructions,
+    model=model,
+    ...
+)
+```
\ No newline at end of file
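
The `MultiServerMCPClient(...)` constructor in the MCP example above is deliberately left elided. A minimal sketch of what that configuration can look like, based on the `langchain-mcp-adapters` README, is given below; the `math` and `weather` server names, the script path, and the URL are placeholder assumptions, not part of the change above:

```python
# Hypothetical MCP server configuration for MultiServerMCPClient.
# The server names, script path, and URL are placeholders.
from langchain_mcp_adapters.client import MultiServerMCPClient

mcp_client = MultiServerMCPClient(
    {
        # Local MCP server launched as a subprocess and spoken to over stdio
        "math": {
            "command": "python",
            "args": ["./math_server.py"],
            "transport": "stdio",
        },
        # Remote MCP server reached over streamable HTTP
        "weather": {
            "url": "http://localhost:8000/mcp/",
            "transport": "streamable_http",
        },
    }
)

# The tools are then collected asynchronously, as in the example above:
#     mcp_tools = await mcp_client.get_tools()
```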