Does langgraph support streaming of reasoning content? #3547
-
Hi, I am currently developing a chat bot using langgraph, and I am studying models with reasoning support, like DeepSeek R1, OpenAI o1, etc. I know the langchain package seems to support message chunking for DeepSeek R1's reasoning by detecting the `reasoning_content` field, and I can see that langgraph does serve the `messages` stream mode. I wonder if langgraph can stream the reasoning content just like the DeepSeek chat app does.
Replies: 3 comments 2 replies
-
I found it. If langgraph is set to the `messages` stream mode, the stream does indeed contain the reasoning content. Here is an example message object:

```json
{
  "event": "messages",
  "data": [
    {
      "content": "",
      "additional_kwargs": {
        "reasoning_content": "Alright"
      },
      "response_metadata": {},
      "type": "AIMessageChunk",
      "name": null,
      "id": "run-b8e386dc-4639-4875-a57e-2fa1e9c0acc7",
      "example": false,
      "tool_calls": [],
      "invalid_tool_calls": [],
      "usage_metadata": null,
      "tool_call_chunks": []
    },
    {
      "created_by": "system",
      "thread_name": "Explain rocket science",
      "graph_id": "graph",
      "assistant_id": "b54c0545-9d79-5688-bc81-269c46247a52",
      "run_attempt": 1,
      "langgraph_version": "0.2.76",
      "langgraph_plan": "developer",
      "langgraph_host": "self-hosted",
      "llm_provider": "deepseek_r1",
      "system_message": "You are a helpful assistant.",
      "langgraph_auth_user_id": "",
      "checkpoint_id": "1eff4c0a-fb34-690a-8003-1c5850baddfc",
      "run_id": "1eff4c21-f9fa-69d1-ac7b-8eba62654de2",
      "thread_id": "5754f72e-29c8-439d-9487-ed04f11b2edb",
      "user_id": "",
      "langgraph_step": 4,
      "langgraph_node": "supervisor",
      "langgraph_triggers": [
        "start:supervisor"
      ],
      "langgraph_path": [
        "__pregel_pull",
        "supervisor"
      ],
      "langgraph_checkpoint_ns": "supervisor:bedbf9f3-4549-53b7-994b-ba9bc38f1ca2",
      "checkpoint_ns": "supervisor:bedbf9f3-4549-53b7-994b-ba9bc38f1ca2",
      "ls_provider": "openai",
      "ls_model_name": "deepseek-reasoner",
      "ls_model_type": "chat",
      "ls_temperature": null
    }
  ]
}
```
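For anyone consuming this in Python, here is a minimal sketch of pulling the reasoning text out of each streamed chunk. The `extract_reasoning` helper and the simulated `chunks` list are hypothetical, shaped like the payload above; they are not langgraph API:

```python
def extract_reasoning(chunk: dict):
    """Return reasoning text from an AIMessageChunk-style payload, or None."""
    return chunk.get("additional_kwargs", {}).get("reasoning_content")


# Simulated chunks, shaped like the "messages" stream event shown above.
chunks = [
    {"content": "", "additional_kwargs": {"reasoning_content": "Alright"}},
    {"content": "", "additional_kwargs": {"reasoning_content": ", rockets"}},
    {"content": "Rockets work by ...", "additional_kwargs": {}},
]

# Concatenate only the reasoning pieces, skipping ordinary content chunks.
reasoning = "".join(extract_reasoning(c) or "" for c in chunks)
print(reasoning)  # -> Alright, rockets
```

The same `.get(...)` pattern avoids a KeyError on chunks that carry normal content instead of reasoning.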
-
Hi,
Looks like you are using the Python SDK; I am only using the JS SDK for my
project. But I guess the reasoning_content is located under
`chunk_msg.additional_kwargs.reasoning_content`.
Let me know if you need any more info.
Best Regards,
Danny
Jinseok Hong ***@***.***> wrote on Tue, Mar 11, 2025 at 1:43 PM:
… Hi, did you resolve it?
I cannot get a streaming result for the reasoning content with the code below:
for chunk_msg, metadata in graph.stream({"question": query}, stream_mode="messages"):
    print(chunk_msg.reasoning_content)
The graph contains an OpenAI-format R1 response.
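Putting the reply's point into code: the loop in the quoted question fails because it reads a top-level `reasoning_content` attribute. A runnable sketch of the corrected loop, where `FakeChunk` and `fake_stream` are stand-ins for `AIMessageChunk` and `graph.stream(..., stream_mode="messages")` so the example is self-contained (they are not langgraph API):

```python
from dataclasses import dataclass, field


@dataclass
class FakeChunk:
    """Stand-in for AIMessageChunk, with only the fields used here."""
    content: str = ""
    additional_kwargs: dict = field(default_factory=dict)


def fake_stream():
    """Stand-in for graph.stream({...}, stream_mode="messages")."""
    yield FakeChunk(additional_kwargs={"reasoning_content": "Alright, "}), {}
    yield FakeChunk(additional_kwargs={"reasoning_content": "thinking..."}), {}
    yield FakeChunk(content="Final answer."), {}


pieces = []
for chunk_msg, metadata in fake_stream():
    # Read from additional_kwargs, not chunk_msg.reasoning_content.
    reasoning = chunk_msg.additional_kwargs.get("reasoning_content")
    if reasoning:
        pieces.append(reasoning)
print("".join(pieces))  # -> Alright, thinking...
```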
-
Well, it's strange. When using the Doubao 1.6 thinking model, the reasoning_content can be obtained normally from the OpenAI SDK's response, but it cannot be found in langgraph, and it is not in additional_kwargs.
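When a provider nests its reasoning text somewhere other than `additional_kwargs`, one way to locate it is to search the whole chunk payload for the key. This recursive helper is a hypothetical debugging aid (not part of langgraph), and the nested `response_metadata` path in the example is made up for illustration:

```python
def find_key(obj, key):
    """Depth-first search of nested dicts/lists; return first value for `key`."""
    if isinstance(obj, dict):
        if key in obj:
            return obj[key]
        for v in obj.values():
            found = find_key(v, key)
            if found is not None:
                return found
    elif isinstance(obj, list):
        for v in obj:
            found = find_key(v, key)
            if found is not None:
                return found
    return None


# Hypothetical chunk where the provider buried the reasoning in metadata.
chunk = {"content": "", "response_metadata": {"extra": {"reasoning_content": "hmm"}}}
print(find_key(chunk, "reasoning_content"))  # -> hmm
```

Running this against a real chunk (e.g. `chunk_msg.dict()` if the message object supports it) shows where, if anywhere, the provider's reasoning ended up.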