[Enhancement]: Allow user to turn off Thinking parameter for selected Google Gemini models #8411
Replies: 6 comments 5 replies
-
Doesn't this PR solve your enhancement?
-
I'm confused as to how to get thinking on my Google models. For some reason they have no option for thinking by default, and the same goes for OpenAI models. For OpenAI models I know I somehow have to use the Responses API, but I haven't found out how to actually do this, given the lack of documentation. Is there some setting in librechat.yaml that will let me turn this on? Currently the Google models are enabled through the .env file, so I'm not sure if that's the problem?
-
@danny-avila @schnaker85 any comments on my question/request?
-
Well, I now understand what you are trying to achieve. Just to sum up your request/question better: you configured your Google endpoints via env variables,
and you do not use any modelSpec, so you cannot set any presets for the specified models. I think in that case this is not yet supported, only via model_specs and presets, not globally. Am I right, @danny-avila?
-
Did you figure out any workaround? I'm having a hard time figuring out how to use these models with LibreChat, since I can't ship it with a broken default configuration.
-
OK, I managed to solve it for my use case like this:
It hides the model select so users can only choose presets, and the Gemini 2.0 Flash preset has thinking disabled.
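The actual config snippet wasn't preserved in this thread, but a setup along these lines should match what's described. This is only a sketch: it assumes LibreChat's `librechat.yaml` schema with `interface.modelSelect`, `modelSpecs`, and a `thinking` preset parameter for the Google endpoint; check the current config docs before relying on these exact keys.

```yaml
# Sketch only: key names assume LibreChat's librechat.yaml schema.
interface:
  modelSelect: false   # hide the model selector so users can only pick presets

modelSpecs:
  enforce: true        # restrict users to the specs listed below
  prioritize: true
  list:
    - name: "gemini-2-flash"
      label: "Gemini 2.0 Flash"
      preset:
        endpoint: "google"
        model: "gemini-2.0-flash-001"
        thinking: false   # disable the Thinking parameter for this preset
```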
-
What features would you like to see added?
By default, all Google Gemini models have Thinking turned on in the Parameters panel. However, not all Gemini models support Thinking (for example, gemini-2.0-flash-001). Can we do something similar to allow dropping the Thinking parameter globally for certain models? For example, Azure OpenAI has a dropParams parameter I've used, but the Google endpoint doesn't support it.
More details
Allow configuring a dropParams setting globally for the Google endpoint in librechat.yaml or via an env variable.
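For reference, this is roughly what the requested configuration could look like, modeled on how `dropParams` already works for endpoints that support it in `librechat.yaml`. The values and the `google` block are illustrative only; the Google endpoint does not currently accept this key, which is exactly what this request is about.

```yaml
# Existing style of dropParams usage (illustrative endpoint and values):
endpoints:
  custom:
    - name: "my-azure-like-endpoint"
      apiKey: "${MY_API_KEY}"
      baseURL: "https://example.openai.azure.com/v1"
      models:
        default: ["gpt-4o"]
      dropParams: ["frequency_penalty"]  # omit these params from requests

  # Requested (not currently supported): the same option for Google
  # google:
  #   dropParams: ["thinking"]
```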
Which components are impacted by your request?
No response
Pictures
No response
Code of Conduct