Issue with Showing Thoughts [Thinking] for OpenAI and Gemini Requests via LiteLLM Custom Endpoints in LibreChat #8535
Unanswered · kishore-rajendran-zoomrx asked this question in Troubleshooting · Replies: 0 comments
Description:
I am using LibreChat v0.7.9-rc1 (commit f4d97e1) with LiteLLM v1.74.* and custom endpoints to send OpenAI and Gemini requests. For models that support reasoning, I expect the thoughts (thinking tokens) to be displayed. Web search via the custom endpoints works fine, but the thoughts never appear for these models.
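For context, here is a minimal sketch of the setup being described, assuming a typical LiteLLM-proxy-plus-custom-endpoint arrangement. The model names, service URL, and key variable are placeholders I've chosen for illustration, not values from the original post:

```yaml
# librechat.yaml — hypothetical custom endpoint pointing at a LiteLLM proxy
endpoints:
  custom:
    - name: "LiteLLM"
      # placeholder env var; use whatever key your proxy expects
      apiKey: "${LITELLM_API_KEY}"
      # placeholder host/port for the LiteLLM proxy
      baseURL: "http://litellm:4000/v1"
      models:
        # placeholder model names registered in the LiteLLM config
        default: ["o3-mini", "gemini-2.5-pro"]
        fetch: true
      titleConvo: true
```

With this kind of setup, responses pass through LiteLLM's OpenAI-compatible layer, so whether reasoning content reaches LibreChat depends on how the proxy forwards the provider's thinking fields.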
Steps to Reproduce:
Environment:
Additional Information:
Request for Help:
If anyone has encountered this issue or has insights on how to make the model’s thinking visible through LiteLLM in LibreChat, I would greatly appreciate any guidance or suggestions.