Received empty response from chat model call. #8521
Rajmathew821 asked this question in Troubleshooting (unanswered)
Replies: 1 comment
-
If your API does not support streaming, you need to add this to your custom endpoint config:

addParams:
  disableStreaming: true
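As a minimal sketch of where that option goes (the endpoint name below is a placeholder, not a value from this thread), it sits inside your existing entry under endpoints.custom in librechat.yaml:

```yaml
endpoints:
  custom:
    - name: "my-fastapi"        # placeholder: the name of your existing custom endpoint
      # ... keep your existing fields (apiKey, baseURL, models, ...) unchanged
      addParams:
        disableStreaming: true  # send non-streaming requests to APIs without streaming support
```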
-
What happened?
I installed the LibreChat UI via a Docker container, and I can connect my custom FastAPI endpoint to LibreChat through the librechat.yaml file. The issue: I can select my model on the LibreChat UI page and send a query to my endpoint, and the request does reach my API, but I only see the LLM response in my CLI. In the LibreChat UI the chat shows an error instead of the proper LLM response.
My endpoint is OpenAI-compatible.
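For reference, a librechat.yaml registration for an OpenAI-compatible endpoint roughly like the one described here might look as follows; this is a sketch, the name, baseURL, and model id are placeholders, and host.docker.internal assumes the API runs on the Docker host rather than inside the Compose network:

```yaml
version: 1.2.1                   # config schema version; match your LibreChat release
endpoints:
  custom:
    - name: "My FastAPI"         # placeholder display name shown in the UI
      apiKey: "not-needed"       # placeholder; use a real key if your API checks one
      baseURL: "http://host.docker.internal:8000/v1"  # placeholder URL of the OpenAI-compatible API
      models:
        default: ["my-model"]    # placeholder model id your API serves
        fetch: false             # don't try to list models from the API
```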
Version Information
Steps to Reproduce
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
No response