GLM-4.5 is not supported. Unable to generate a response after thinking. #8764
Replies: 4 comments 3 replies
-
This sounds like one of those cases where the API call goes through, but LibreChat doesn't know how to handle the response format, especially if the model family (like GLM) isn't using standard OpenAI-compatible returns or needs a custom prompt template. A couple of things are worth checking.
I've seen a few cases like this, where the model is actually responding but the frontend just gets stuck on "thinking…" because the parser fails silently. Let me know if you're able to confirm the raw response from the GLM API. That could help isolate whether it's a frontend template mismatch or an actual backend failure.
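If the raw response does turn out to carry the text in a reasoning-style field rather than the standard `content` field, a parser that only reads `content` would return nothing and look "stuck on thinking". A minimal fallback sketch (the `reasoning_content` field name is an assumption based on GLM-style reasoning APIs, not confirmed LibreChat internals):

```python
# Hypothetical sketch: pull the visible answer out of an OpenAI-style
# chat completion, tolerating a GLM-style reasoning payload.
# "reasoning_content" is an assumed field name, not LibreChat's actual code.

def extract_answer(completion: dict) -> str:
    msg = completion["choices"][0]["message"]
    content = msg.get("content")
    if content:  # standard OpenAI-compatible response
        return content
    # Reasoning models may leave "content" empty/null and put text
    # elsewhere; falling back here avoids a silent empty response.
    return msg.get("reasoning_content") or ""

# Two shapes the raw API response might take:
standard = {"choices": [{"message": {"content": "Hello"}}]}
glm_like = {"choices": [{"message": {"content": None,
                                     "reasoning_content": "Thinking..."}}]}
```

Dumping the raw JSON from the endpoint and checking which of these shapes it matches would tell you whether the fix belongs in the frontend parser or in the backend/template config.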
-
Looks like some models work better than others, no issues with
-
Closed by #8769
-
What happened?
GLM-4.5 is not supported. Unable to generate a response after thinking.
Version Information (`docker images` output: repository, tag, image ID, created, size)
ghcr.io/danny-avila/librechat-dev latest b2acb22fc0ee 10 hours ago 1.16GB
ghcr.io/danny-avila/librechat-rag-api-dev-lite latest 833a22bcf14c 2 weeks ago 1.45GB
Steps to Reproduce
What browsers are you seeing the problem on?
Chrome
Relevant log output
Screenshots
No response
Code of Conduct