How to disable stream of answer and show at once? (faster full response) #2306
Replies: 3 comments
-
I believe this is a behavior of OpenAI, not LibreChat: they return results in a streaming manner, so just changing options on your own server will not change anything.
-
No. It would take about the same time (if non-stream is faster at all, it is only by milliseconds saved on stream processing), and you would experience a delay before the full generation is completed. In almost all cases, streaming is preferred because of the improved experience (getting results in real time vs. waiting like it's 1999).
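To illustrate the timing point above, here is a minimal simulation. This is not the OpenAI API: the token list and per-token delay are made-up stand-ins for a model that produces output incrementally. It sketches why streaming and non-streaming take roughly the same total time, while streaming delivers the first token much sooner.

```python
import time

TOKENS = ["The", " answer", " arrives", " one", " token", " at", " a", " time", "."]
PER_TOKEN_DELAY = 0.02  # hypothetical generation cost per token (seconds)

def generate_tokens():
    """Stand-in for the model: emits one token every PER_TOKEN_DELAY seconds."""
    for tok in TOKENS:
        time.sleep(PER_TOKEN_DELAY)
        yield tok

def non_streaming():
    """stream: false — block until every token exists, then return the whole answer."""
    start = time.perf_counter()
    answer = "".join(generate_tokens())
    total = time.perf_counter() - start
    return answer, total

def streaming():
    """stream: true — consume tokens as they arrive; record when the first one lands."""
    start = time.perf_counter()
    parts, first_token_at = [], None
    for tok in generate_tokens():
        if first_token_at is None:
            first_token_at = time.perf_counter() - start
        parts.append(tok)
    total = time.perf_counter() - start
    return "".join(parts), first_token_at, total

answer_a, total_a = non_streaming()
answer_b, first_b, total_b = streaming()
print(f"non-streaming: full answer after {total_a:.3f}s")
print(f"streaming:     first token after {first_b:.3f}s, full answer after {total_b:.3f}s")
```

Both paths spend the same time generating tokens; the only difference is whether the caller sees anything before the last token is produced.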
-
I see. Thanks for answering!
-
I have been using LibreChat with an OpenAI API key for GPT-4 instead of the ChatGPT website. If I am not confusing things, I think that through the API I can receive the answer all at once instead of waiting while it is streamed. Is that true? Or, even if I disable the `stream: true` option on the API, would the full answer still take the "usual" time to complete, i.e. the same time as with `stream: true` enabled?