📝 docs(readme): update documentation for LangGraph v0.6.6 integration and model configuration
- Clarify seamless integration with LangGraph Studio and update README badges
- Add LangGraph v0.6.6 features highlighting context-driven configuration API changes
- Provide detailed model configuration methods: runtime context, env vars, Studio assistant
- Expand API key setup instructions for OpenAI, Anthropic, Qwen, and OpenAI-compatible providers
- Introduce sample LangSmith execution traces with DeepWiki and web search examples
- Update MCP server examples to include Context7 documentation tools with usage tips
- Improve Chinese readme with corresponding updates and new configuration guidance
- Refine project descriptions and badge links for both English and Chinese docs
This template showcases a [ReAct agent](https://arxiv.org/abs/2210.03629) implemented using [LangGraph](https://github.com/langchain-ai/langgraph) and works seamlessly with [LangGraph Studio](https://docs.langchain.com/langgraph-platform/quick-start-studio#use-langgraph-studio). ReAct agents are uncomplicated, prototypical agents that can be flexibly extended to many tools.
### Multi-Provider Model Support

- **Qwen Models**: Complete Qwen series support via the `langchain-qwq` package, including Qwen-Plus, Qwen-Turbo, QwQ-32B, and QvQ-72B
- **OpenAI**: GPT-4o, GPT-4o-mini, etc.
- **OpenAI-Compatible**: Any provider supporting the OpenAI API format via a custom API key and base URL
- **Anthropic**: Claude 4 Sonnet, Claude 3.5 Haiku, etc.
### Agent Tool Integration Ecosystem

- **Model Context Protocol (MCP)**: Dynamic external tool loading at runtime
- **DeepWiki MCP Server**: Optional MCP tools for GitHub repository documentation access and Q&A capabilities
- **Web Search**: Built-in traditional LangChain tools (Tavily) for internet information retrieval
### LangGraph v0.6 Features

> [!NOTE]
> **New in LangGraph v0.6**: [LangGraph Context](https://docs.langchain.com/oss/python/context#context-overview) replaces the traditional `config['configurable']` pattern. Runtime context is now passed to the `context` argument of `invoke/stream`, providing a cleaner and more intuitive way to configure your agents.
The agent comes with web search capabilities and optional DeepWiki MCP documentation tools, but can be easily extended with custom tools to suit various use cases.
### Sample Execution Traces

See these LangSmith traces to understand how the agent works in practice:

- **[DeepWiki Documentation Query](https://smith.langchain.com/public/d0594549-7363-46a7-b1a2-d85b55aaa2bd/r)** - Shows the agent using DeepWiki MCP tools to query GitHub repository documentation
- **[Web Search Query](https://smith.langchain.com/public/6ce92fd2-c0e4-409b-9ce2-02499ae16800/r)** - Demonstrates the Tavily web search integration and reasoning loop
## Getting Started
### Setup with uv (Recommended)
Create your environment file and add your API keys:

```bash
cp .env.example .env
```

```bash
# Required: Web search functionality
TAVILY_API_KEY=your-tavily-api-key

# Model providers (choose at least one)
# Required: If using Qwen models (default)
DASHSCOPE_API_KEY=your-dashscope-api-key

# Optional: OpenAI model service platform keys
OPENAI_API_KEY=your-openai-api-key
# Optional: If using OpenAI-compatible service platforms
OPENAI_API_BASE=your-openai-base-url

# Optional: If using Anthropic models
ANTHROPIC_API_KEY=your-anthropic-api-key

# Optional: Regional API support for Qwen models
REGION=international # or 'prc' for China mainland (default)
```
The primary [search tool](./src/common/tools.py) uses [Tavily](https://tavily.com/). Create an API key [here](https://app.tavily.com/sign-in).
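The keys are only read at runtime, so a quick stdlib-only sanity check (a hypothetical helper, not part of the template) can report which providers your environment actually enables:

```python
import os

# Map each provider to the env var the setup section requires.
# (Hypothetical helper; variable names match the .env example.)
PROVIDER_KEYS = {
    "tavily": "TAVILY_API_KEY",
    "qwen": "DASHSCOPE_API_KEY",
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
}

def configured_providers(env=None):
    """Return the providers whose API key is present and non-empty."""
    env = os.environ if env is None else env
    return [name for name, var in PROVIDER_KEYS.items() if env.get(var)]

print(configured_providers({"TAVILY_API_KEY": "tvly-xxx", "DASHSCOPE_API_KEY": "sk-xxx"}))
# → ['tavily', 'qwen']
```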
## Model Configuration

The template uses `qwen:qwen-turbo` as the default model, defined in [`src/common/context.py`](./src/common/context.py). You can configure different models in three ways:

1. **Runtime Context** (recommended for programmatic usage)
2. **Environment Variables**
3. **LangGraph Studio Assistant Configuration**

### API Key Setup by Provider

#### OpenAI

```bash
OPENAI_API_KEY=your-openai-api-key
```
Get your API key: [OpenAI Platform](https://platform.openai.com/api-keys)

#### Anthropic

```bash
ANTHROPIC_API_KEY=your-anthropic-api-key
```
Get your API key: [Anthropic Console](https://console.anthropic.com/)

#### Qwen Models (Default)

```bash
DASHSCOPE_API_KEY=your-dashscope-api-key
REGION=international # or 'prc' for China mainland
```
Get your API key: [DashScope Console](https://dashscope.console.aliyun.com/)
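All of the identifiers above follow the `provider:model` naming convention used by LangChain's `init_chat_model`. A minimal sketch of how such a string splits (illustrative only; the default provider fallback here is an assumption, not the template's actual parsing):

```python
def split_model_id(model_id: str, default_provider: str = "qwen"):
    """Split a 'provider:model' identifier into its two parts.

    Falls back to a default provider when no colon is present
    (assumed behavior, for illustration).
    """
    provider, sep, name = model_id.partition(":")
    if not sep:
        return default_provider, model_id
    return provider, name

print(split_model_id("qwen:qwen-turbo"))  # → ('qwen', 'qwen-turbo')
print(split_model_id("anthropic:claude-3.5-haiku"))
```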
3. **Enable in Context** - Add the context flag and load the tools in the `get_tools()` function:

   ```python
   # In src/common/tools.py
   if context.enable_context7:
       tools.extend(await get_context7_tools())
   ```

The key extension point, `load_chat_model` in [`src/common/utils.py`](./src/common/utils.py), uses LangChain's [`init_chat_model`](https://python.langchain.com/api_reference/langchain/chat_models/langchain.chat_models.base.init_chat_model.html#init-chat-model) as the underlying utility.

> [!TIP]
> **Context7 Example**: The MCP configuration already includes a commented Context7 server setup. Context7 provides up-to-date library documentation and examples - simply uncomment the configuration and add the context flag to enable it.

#### 1. Runtime Context

Use the new LangGraph v0.6 context parameter to configure models at runtime:

```python
211
+
from common.context import Context
212
+
from react_agent import graph
213
+
214
+
# Configure model via context
215
+
result =await graph.ainvoke(
216
+
{"messages": [("user", "Your question here")]},
217
+
context=Context(model="openai:gpt-4o-mini")
218
+
)
220
219
```
#### 2. Environment Variables

Set the `MODEL` environment variable in your `.env` file:

```bash
MODEL=anthropic:claude-3.5-haiku
```
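How the env var interacts with the context default can be sketched with a plain dataclass (the field name and the env-overrides-default precedence are assumptions mirroring `src/common/context.py`, not its actual code):

```python
import os
from dataclasses import dataclass

@dataclass
class Context:
    # Hypothetical mirror of the template's default model.
    model: str = "qwen:qwen-turbo"

    def __post_init__(self):
        # Assumed precedence: an explicit MODEL env var overrides the default.
        self.model = os.environ.get("MODEL", self.model)

os.environ["MODEL"] = "anthropic:claude-3.5-haiku"
print(Context().model)  # → anthropic:claude-3.5-haiku
```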
#### 3. LangGraph Studio Assistant Configuration

In LangGraph Studio, configure models through [Assistant management](https://docs.langchain.com/langgraph-platform/configuration-cloud#manage-assistants). Create or update assistants with different model configurations for easy switching between setups.