First message to custom endpoint does not retrieve final chunk #4846
Replies: 4 comments 12 replies
-
I noticed that when the first response ends, there is an error in the LibreChat log file:
Subsequent responses in the same conversation show no such error. This is the Python code that streams the response in chunks: `async def _resp_async_generator(text_resp: str):`
-
Try testing it with the
-
Did you guys solve it?
-
What backend/middleware are you using? I cannot replicate this on Llamacpp, exllamav2, or Aphrodite.
-
What happened?
I use a custom API integrated into LibreChat, for which I have configured a custom endpoint. When I start a conversation with my API, the first assistant response hangs in the UI and I have to click the "stop" button to continue. All subsequent responses are fine; only the first one hangs.
In the Network tab I can see that the first response never receives the final chunk.
I debugged my API and I am sure that it returns the last chunk (`data: [DONE]`).
Steps to Reproduce
What browsers are you seeing the problem on?
No response
Relevant log output
No response
Screenshots
After I click the stop button next to the prompt field, the conversation continues normally for all other messages:
Screenshots of the Network tab: the first message never receives the last part, the chunk marked as final:


After the first message, all others continue normally and receive their `final: true` message.
Code of Conduct