📝 docs(readme): update documentation for LangGraph v0.6.6 integration and model configuration
- Clarify seamless integration with LangGraph Studio and update README badges
- Add LangGraph v0.6.6 features highlighting context-driven configuration API changes
- Provide detailed model configuration methods: runtime context, env vars, Studio assistant
- Expand API key setup instructions for OpenAI, Anthropic, Qwen, and OpenAI-compatible providers
- Introduce sample LangSmith execution traces with DeepWiki and web search examples
- Update MCP server examples to include Context7 documentation tools with usage tips
- Improve Chinese readme with corresponding updates and new configuration guidance
- Refine project descriptions and badge links for both English and Chinese docs
This template showcases a [ReAct agent](https://arxiv.org/abs/2210.03629) implemented using [LangGraph](https://github.com/langchain-ai/langgraph) that works seamlessly with [LangGraph Studio](https://docs.langchain.com/langgraph-platform/quick-start-studio#use-langgraph-studio). ReAct agents are uncomplicated, prototypical agents that can be flexibly extended to many tools.
- **Caching**: Optimized performance with client and tool caching
- **Configurable**: Enable via environment variables or context parameters

### LangGraph v0.6.6 Features

> [!NOTE]
> **New in LangGraph v0.6**: [LangGraph Context](https://docs.langchain.com/oss/python/context#context-overview) replaces the traditional `config['configurable']` pattern. Runtime context is now passed to the `context` argument of `invoke`/`stream`, providing a cleaner and more intuitive way to configure your agents.

- **Context-Driven Configuration**: Runtime context passed via the `context` parameter instead of `config['configurable']`
- **Simplified API**: Cleaner interface for passing runtime configuration to your agents
- **Backward Compatibility**: Gradual migration path from the old configuration pattern

### Comprehensive Testing

- **70+ Test Cases**: Unit, integration, and end-to-end testing
- **MCP Integration Coverage**: Full testing of DeepWiki tool loading and execution
The agent comes with web search capabilities and optional DeepWiki MCP documentation tools, but can be easily extended with custom tools to suit various use cases.
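Custom tools in this template style are typically plain (async) Python functions whose name and docstring the model sees when deciding what to call. A hypothetical example of a tool you could add alongside the search tool (the function name and its placement in `src/common/tools.py` are illustrative, not part of the template):

```python
from datetime import datetime, timezone


async def current_time() -> str:
    """Return the current UTC time in ISO-8601 format.

    The docstring is what the model reads when deciding whether to call the tool.
    """
    return datetime.now(timezone.utc).isoformat()
```

Register it wherever the template collects its tool list (e.g. the `get_tools()` result) and the agent can start calling it.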
### Sample Execution Traces
See these LangSmith traces to understand how the agent works in practice:
- **[DeepWiki Documentation Query](https://smith.langchain.com/public/d0594549-7363-46a7-b1a2-d85b55aaa2bd/r)** - Shows the agent using DeepWiki MCP tools to query GitHub repository documentation
- **[Web Search Query](https://smith.langchain.com/public/6ce92fd2-c0e4-409b-9ce2-02499ae16800/r)** - Demonstrates Tavily web search integration and the reasoning loop

## Getting Started
### Setup with uv (Recommended)
The primary [search tool](./src/common/tools.py) uses [Tavily](https://tavily.com/). Create an API key [here](https://app.tavily.com/sign-in).

## Model Configuration

The template uses `qwen:qwen-turbo` as the default model, defined in [`src/common/context.py`](./src/common/context.py). You can configure different models in three ways:

1. **Runtime Context** (recommended for programmatic usage)
2. **Environment Variables**
3. **LangGraph Studio Assistant Configuration**

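These methods can layer naturally. Below is a minimal, hypothetical sketch of how such a fallback chain might work (explicit argument, then the `MODEL` environment variable, then the built-in default); the template's real `Context` lives in `src/common/context.py` and may differ:

```python
import os
from dataclasses import dataclass, field


@dataclass
class Context:
    # Resolution order: explicit argument > MODEL env var > built-in default
    model: str = field(
        default_factory=lambda: os.environ.get("MODEL", "qwen:qwen-turbo")
    )


os.environ.pop("MODEL", None)
assert Context().model == "qwen:qwen-turbo"              # built-in default
os.environ["MODEL"] = "anthropic:claude-3.5-haiku"
assert Context().model == "anthropic:claude-3.5-haiku"   # env var overrides default
assert Context(model="openai:gpt-4o-mini").model == "openai:gpt-4o-mini"  # explicit wins
```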
### API Key Setup by Provider
#### OpenAI

```bash
OPENAI_API_KEY=your-openai-api-key
```

Get your API key: [OpenAI Platform](https://platform.openai.com/signup)

#### Anthropic

```bash
ANTHROPIC_API_KEY=your-anthropic-api-key
```

Get your API key: [Anthropic Console](https://console.anthropic.com/)

#### Qwen Models (Default)

```bash
DASHSCOPE_API_KEY=your-dashscope-api-key
REGION=international # or 'prc' for China mainland
```
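The `REGION` flag exists because DashScope serves international and China-mainland accounts from different endpoints. A hypothetical sketch of that mapping (endpoint URLs follow DashScope's OpenAI-compatible-mode documentation; the template's actual handling may differ):

```python
import os

# DashScope exposes separate endpoints for international and China-mainland
# accounts; the REGION environment variable selects between them
DASHSCOPE_BASE_URLS = {
    "international": "https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
    "prc": "https://dashscope.aliyuncs.com/compatible-mode/v1",
}

region = os.environ.get("REGION", "international")
base_url = DASHSCOPE_BASE_URLS.get(region, DASHSCOPE_BASE_URLS["international"])
print(base_url)
```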

Get your API key: [DashScope Console](https://dashscope.console.aliyun.com/)

#### OpenAI-Compatible Providers

For any OpenAI-compatible API (SiliconFlow, Together AI, Groq, etc.):

3. **Enable in Context** - Add context flag and load tools in the `get_tools()` function:

```python
# In src/common/tools.py
if context.enable_context7:
    tools.extend(await get_context7_tools())
```

> [!TIP]
> **Context7 Example**: The MCP configuration already includes a commented Context7 server setup. Context7 provides up-to-date library documentation and examples - simply uncomment the configuration and add the context flag to enable it.
### Model Configuration Methods
#### 1. Runtime Context (Recommended)
Use the new LangGraph v0.6.6 context parameter to configure models at runtime:
```python
from common.context import Context
from react_agent import graph

# Configure model via context
result = await graph.ainvoke(
    {"messages": [("user", "Your question here")]},
    context=Context(model="openai:gpt-4o-mini")
)
```
#### 2. Environment Variables
Set the `MODEL` environment variable in your `.env` file:
```bash
MODEL=anthropic:claude-3.5-haiku
```
#### 3. LangGraph Studio Assistant Configuration
In LangGraph Studio, configure models through [Assistant management](https://docs.langchain.com/langgraph-platform/configuration-cloud#manage-assistants). Create or update assistants with different model configurations for easy switching between setups.