Replies: 3 comments
-
Hey, is there any update on this? I'm stuck on this as well. I'm not seeing predictable behavior when LangGraph is used with stream_mode=custom. I expected the event to be streamed immediately, but that's not what's happening.
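For reference, a minimal sketch of the pattern being discussed, assuming a recent langgraph version where a node can receive the writer through a `StreamWriter`-annotated parameter (node, state, and field names here are illustrative):

```python
from typing import TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.types import StreamWriter


class State(TypedDict):
    topic: str


def worker(state: State, writer: StreamWriter):
    # Anything passed to the writer is expected to show up on the "custom" stream.
    writer({"status": f"started work on {state['topic']}"})
    return {"topic": state["topic"]}


builder = StateGraph(State)
builder.add_node("worker", worker)
builder.add_edge(START, "worker")
builder.add_edge("worker", END)
graph = builder.compile()

# The expectation is that each writer() call is yielded here as soon as it happens.
for chunk in graph.stream({"topic": "streaming"}, stream_mode="custom"):
    print(chunk)
```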
-
Hey @hinthornw - The question is actually about streaming in custom mode. Message streaming works fine, but invoking the StreamWriter with custom data doesn't result in that data being streamed immediately. I looked at the internals and it seems like some sort of queue is being used, and the data is only streamed at some later point.

For context: the setup is a multi-agent one, and the user needs to be kept constantly informed about what is happening. Since the custom data isn't streamed immediately, there are delays in the UX. The graph is being invoked like this in our code:
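A minimal sketch of that kind of invocation (not the original snippet), assuming an async consumer that combines the `messages` and `custom` stream modes; graph, input, and field names are illustrative:

```python
import asyncio


async def run(graph, inputs):
    # Combining modes makes the stream yield (mode, chunk) tuples.
    async for mode, chunk in graph.astream(inputs, stream_mode=["messages", "custom"]):
        if mode == "custom":
            # Payloads emitted by nodes via the StreamWriter, e.g. progress updates
            # for the user; the complaint above is that these arrive late.
            print("progress:", chunk)
        elif mode == "messages":
            # LLM token stream: (message chunk, metadata) pairs.
            message_chunk, metadata = chunk
            print("token:", message_chunk.content)


# Example usage with a compiled graph:
# asyncio.run(run(graph, {"messages": [("user", "hello")]}))
```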
-
Hi all,
I'm building a langgraph application. It has two nodes (simplifying), A and B. A starts the processing and immediately emits (streams out) a greeting message to the user. It then retrieves some additional information and passes it to node B.
Node B involves an LLM call and streams its output as the LLM produces it.
My challenge is how to stream out the initial message from A as quickly as possible, before node B kicks in. I am able to stream out a custom event using `dispatchCustomEvent`, but what I need to stream out is an `AIMessage`.
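One possible approach, sketched here with the Python LangGraph API under the assumption that the custom stream channel accepts arbitrary objects: node A pushes the greeting as an `AIMessage` through the stream writer before doing its slower retrieval work, and the consumer handles both the `custom` and `messages` channels. Node names are illustrative, and the LLM call in node B is stubbed out so the sketch runs without a model:

```python
from langchain_core.messages import AIMessage
from langgraph.graph import StateGraph, MessagesState, START, END
from langgraph.types import StreamWriter


def node_a(state: MessagesState, writer: StreamWriter):
    # Push the greeting onto the custom stream right away, before the slow part.
    writer(AIMessage(content="Hi! Let me gather some information for you..."))
    # ... slower retrieval work would happen here ...
    return {"messages": [AIMessage(content="(context retrieved)")]}


def node_b(state: MessagesState):
    # In the real graph this node calls the LLM and its tokens would arrive on
    # the "messages" stream; stubbed here.
    return {"messages": [AIMessage(content="final answer")]}


builder = StateGraph(MessagesState)
builder.add_node("a", node_a)
builder.add_node("b", node_b)
builder.add_edge(START, "a")
builder.add_edge("a", "b")
builder.add_edge("b", END)
graph = builder.compile()

for mode, chunk in graph.stream(
    {"messages": [("user", "hello")]}, stream_mode=["custom", "messages"]
):
    if mode == "custom" and isinstance(chunk, AIMessage):
        # The early greeting from node A.
        print("greeting:", chunk.content)
    elif mode == "messages":
        message_chunk, _metadata = chunk
        print("token:", message_chunk.content)
```

The greeting is delivered on the custom channel here; the consumer is responsible for treating it as an assistant message in the UI.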