Model response disappears as it's being written and is replaced by an error message. On refresh, the whole message can be found. #4346
-
Sometimes I encounter this issue too! While chatting, suddenly only the last message remains on the page. Refreshing the page restores the conversation.
-
Same here. I've checked the code but haven't found a clue as to why.
-
The issue stems from the client <> server connection. If you are hosting the instance behind a reverse proxy like NGINX, it has a default connection timeout of 1 minute. That is likely what's happening here, especially since the full message shows up after a refresh: the server finished generating, but the client's connection was dropped mid-stream.
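For anyone hitting this behind NGINX, here is a minimal sketch of the relevant proxy settings. The directives are standard NGINX ones, but the server name, upstream address, and the 300s value are placeholders to adapt to your deployment; the key points are raising `proxy_read_timeout` above its 60-second default and disabling buffering so streamed tokens are delivered as they arrive.

```nginx
server {
    listen 80;
    server_name chat.example.com;          # placeholder

    location / {
        proxy_pass http://127.0.0.1:8080;  # placeholder upstream

        # Needed for WebSocket/streaming upgrades.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        # Both default to 60s; a long generation that runs past that
        # gets its connection closed mid-stream by the proxy.
        proxy_read_timeout 300s;
        proxy_send_timeout 300s;

        # Deliver streamed tokens immediately instead of buffering them.
        proxy_buffering off;
    }
}
```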
-
What happened?
Sometimes, when the model is still writing a response, it disappears completely and is replaced by an error message.
Refreshing the page actually works: afterwards I can see the whole message. In the logs, I can see entries from around the same time.
Considering that I saw it happen with very long messages, it may be that the endpoint couldn't finish the message within the given maximum number of tokens and something weird happened at that point.
After that, it is still possible to continue the conversation, but each new message produced by the model erases all the previous ones from the chat. Again, refreshing the page fixes it and shows them.
This happened to me using a LiteLLM endpoint.
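One way to test the max-token hypothesis above is to stream the same prompt directly against the endpoint and inspect the `finish_reason` of the final chunk. This is a hedged sketch using the OpenAI-compatible API that LiteLLM exposes; the `base_url`, `api_key`, and model name are placeholders for your setup.

```python
from openai import OpenAI

# Placeholders: point these at your LiteLLM proxy and model.
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-placeholder")

stream = client.chat.completions.create(
    model="my-model",
    messages=[{"role": "user", "content": "Write a very long story."}],
    stream=True,
)

finish_reason = None
for chunk in stream:
    # Some providers send a trailing chunk with no choices, hence the guard.
    if chunk.choices:
        print(chunk.choices[0].delta.content or "", end="", flush=True)
        if chunk.choices[0].finish_reason is not None:
            finish_reason = chunk.choices[0].finish_reason

# "length" means the response was cut off by the max-token limit, which
# would support the hypothesis; "stop" means it ended normally.
print(f"\nfinish_reason: {finish_reason}")
```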
Steps to Reproduce
What browsers are you seeing the problem on?
Firefox, Chrome
Relevant log output
Screenshots
No response
Code of Conduct
- [x] I agree to follow this project's Code of Conduct