Problem with graph.astream() with stream_mode="messages"
#4263
moonseyes
started this conversation in Discussions
Replies: 2 comments 4 replies
-
Which Python version are you using? This is likely due to using Python < 3.11.
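For anyone stuck on an older interpreter, a commonly cited workaround is to have each async node accept the RunnableConfig and pass it on to the LLM call, since Python < 3.11 does not propagate it (and the callbacks that drive token streaming) through async code automatically. A minimal sketch, with an illustrative model and node name:

```python
from langchain_core.runnables import RunnableConfig
from langchain_openai import ChatOpenAI  # illustrative model choice

llm = ChatOpenAI(model="gpt-4o-mini")

# On Python < 3.11 the config is not carried into awaited calls
# automatically, so the node has to forward it to the LLM explicitly
# for token streaming to work.
async def call_model(state: dict, config: RunnableConfig) -> dict:
    response = await llm.ainvoke(state["messages"], config=config)
    return {"messages": [response]}
```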
4 replies
-
It's not just the Python version but also the langgraph SDK library versions. I was using langgraph 0.2.x with Python 3.11.9 and trying to stream output tokens; the tokens just did not come out and the system sat idle. I wiped every langgraph/langchain/langsmith library from the project, reinstalled the latest versions, refactored some code to make it work, and now streaming works perfectly with the graph.stream method using stream_mode="messages". Hope this info helps :)
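For reference, a minimal sketch of that working synchronous pattern, assuming a current langgraph release and an illustrative ChatOpenAI model (with stream_mode="messages", each item yielded is a (message_chunk, metadata) tuple):

```python
from typing import Annotated
from typing_extensions import TypedDict

from langchain_openai import ChatOpenAI  # illustrative model choice
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model="gpt-4o-mini")


def chatbot(state: State) -> dict:
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)
graph = builder.compile()

# stream_mode="messages" surfaces LLM tokens produced inside any node.
for message_chunk, metadata in graph.stream(
    {"messages": [("user", "Tell me a short joke")]},
    stream_mode="messages",
):
    if message_chunk.content:
        print(message_chunk.content, end="", flush=True)
```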
0 replies
-
I am having trouble using .astream() with stream_mode="messages". I want to be able to stream each token directly from the LLM any time it is invoked in any node, using .astream(). However, with the code above, the graph still runs with no error, but no chunk is printed. If I change astream to stream and remove async, the chunks are printed as expected for messages mode. If I keep astream and change stream_mode to values or updates, the chunks are printed as expected for those modes. However, I want to use astream and messages mode together. Has anyone run into this problem? Is there a fix for it?
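A rough sketch of the async setup being described, assuming a single-node StateGraph with an illustrative ChatOpenAI model; on Python 3.11+ with a current langgraph release this should print tokens as they arrive:

```python
import asyncio
from typing import Annotated
from typing_extensions import TypedDict

from langchain_openai import ChatOpenAI  # illustrative model choice
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model="gpt-4o-mini")


async def chatbot(state: State) -> dict:
    # On Python < 3.11, also accept and forward a RunnableConfig here
    # (see the first reply above); on 3.11+ it propagates automatically.
    return {"messages": [await llm.ainvoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)
graph = builder.compile()


async def main() -> None:
    # With stream_mode="messages", astream yields (message_chunk, metadata)
    # tuples for every token produced by the LLM inside any node.
    async for message_chunk, metadata in graph.astream(
        {"messages": [("user", "Tell me a short joke")]},
        stream_mode="messages",
    ):
        if message_chunk.content:
            print(message_chunk.content, end="", flush=True)


asyncio.run(main())
```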