Strange buffering on client when action returns a stream #10473
Replies: 1 comment
Closing, it's related to |
Hi all, I want to share an interesting issue I'm having.
At a high level, I have an action that returns a stream. The stream is originally created by openai-node, but I'm not sure that's relevant. The client side makes the request with `fetch` and parses the streamed response. Speaking of code, the action code is this:
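(The original snippet isn't reproduced here, so this is only a minimal sketch of what such an action could look like, assuming openai-node v4, a hypothetical `api.chat` route, and a JSON body with a `prompt` field.)

```ts
// app/routes/api.chat.ts — hypothetical route; a sketch, not the original code
import type { ActionFunctionArgs } from "@remix-run/node";
import OpenAI from "openai";

const openai = new OpenAI();

export async function action({ request }: ActionFunctionArgs) {
  const { prompt } = await request.json();

  // Ask openai-node for a streamed chat completion (an async iterable of chunks)
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  const encoder = new TextEncoder();

  // Re-expose the OpenAI chunks as a web ReadableStream of text deltas
  const stream = new ReadableStream({
    async start(controller) {
      for await (const chunk of completion) {
        const messageDelta = chunk.choices[0]?.delta?.content ?? "";
        console.log(messageDelta); // the server-side log discussed below
        controller.enqueue(encoder.encode(messageDelta));
      }
      controller.close();
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```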
The client part is this:
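(Again a sketch rather than the original code, assuming the response body is read with a `ReadableStream` reader and decoded chunk by chunk; the function name and route are made up.)

```ts
// Client-side sketch: read and parse the streamed action response with fetch
export async function streamAnswer(
  prompt: string,
  onDelta: (text: string) => void,
) {
  const response = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();

  // Read chunks as they arrive and hand each decoded delta to the caller
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const messageDelta = decoder.decode(value, { stream: true });
    console.log(messageDelta); // the browser-side messages discussed below
    onDelta(messageDelta);
  }
}
```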
The strange thing is that the messages appear to be different between the server and the client.
The server-side output of that `console.log` on `messageDelta` is:
While the network inspector on the browser side logs these messages:
The issue is that this inconsistency garbles the first sentence. The example is in Italian, sorry, but the beginning of that sentence should be “Eddy Merkx , noto come”. Instead, the client prints “Eddy Merckx, notoddy Merckx, noto come”.
I noticed that the first message on the client side is longer than the first message on the server side, so I suspect some buffering is happening somewhere.
Have you already encountered this issue? Do you have any feedback on how to handle this and avoid the garbled output?
The Remix version here is 2.5.1. I had a look at the issues and PRs to see whether this was fixed in a later version. If it was, do you have suggestions on how I can work around it without upgrading?
Thank you