Prompt parameters added by defaultParamsEndpoint: 'anthropic' do not work #8933
Replies: 4 comments 7 replies
-
Thanks for providing a config to test with. I think I see the issue here and will resolve it soon!
-
@jaceli-stripe just to confirm: when you set these parameters, do you expect the exact payload Anthropic documents for thinking/thinking budget? i.e. https://docs.anthropic.com/en/api/messages#body-thinking

```json
"thinking": {
  "type": "enabled",
  "budget_tokens": 10000
}
```
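For reference, a minimal sketch of the request body that toggle would need to produce, assuming the documented shape above. The model id, `max_tokens` value, and helper name are illustrative placeholders, not LibreChat's actual implementation:

```python
# Sketch: build an Anthropic Messages API request body with the
# "thinking" block attached when the toggle is on. Field names follow
# the Anthropic docs linked above; everything else is a placeholder.
import json


def build_messages_body(prompt: str, thinking_enabled: bool,
                        budget_tokens: int = 10000) -> dict:
    body = {
        "model": "claude-sonnet-4-20250514",  # placeholder model id
        "max_tokens": 16000,                  # placeholder value
        "messages": [{"role": "user", "content": prompt}],
    }
    if thinking_enabled:
        # Documented shape: {"type": "enabled", "budget_tokens": N}
        body["thinking"] = {"type": "enabled", "budget_tokens": budget_tokens}
    return body


print(json.dumps(build_messages_body("hello", thinking_enabled=True), indent=2))
```

When the toggle is off, the sketch simply omits the `thinking` key, which is the behavioral difference the report says is missing from the outgoing request.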
-
Just wanted to give you a heads up that we've started on this here and are looking to implement this weekend:
-
Closed by #9415
-
What happened?
Hi, we are using LiteLLM as a provider for LibreChat. Currently, we have anthropic/claude models on their own custom endpoint:
Due to defaultParamsEndpoint: 'anthropic', we are able to see prompt parameters such as Thinking and Thinking Budget when using these models. However, they do not actually modify the request that is sent, and therefore do not work. In particular, toggling Thinking does not meaningfully change the outgoing request and does not show the model's reasoning/chain of thought as expected.
Version Information
v0.7.9
Steps to Reproduce
Toggle the Thinking and Thinking Budget parameters
What browsers are you seeing the problem on?
Chrome
Relevant log output
Screenshots
No response
Code of Conduct