fix(langchain): use override in model fallbacks #33716
base: master
Conversation
Reset model_settings when switching to fallback model to prevent provider-specific parameters from causing errors. Fixes langchain-ai#33709
```python
# Try fallback models
for fallback_model in self.models:
    request.model = fallback_model
    request.model_settings = {}
```
Could you do this using `override`? `request = request.override(model=fallback_model)`
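The suggestion above favors returning a new request over mutating the existing one. A minimal standalone sketch of that pattern (the `ModelRequest` class here is a hypothetical stand-in, not the actual LangChain class):

```python
from dataclasses import dataclass, field, replace
from typing import Any


@dataclass(frozen=True)
class ModelRequest:
    """Hypothetical stand-in for a middleware model request."""

    model: str
    model_settings: dict[str, Any] = field(default_factory=dict)

    def override(self, **changes: Any) -> "ModelRequest":
        # Return a copy with the given fields replaced,
        # leaving the original request untouched.
        return replace(self, **changes)


original = ModelRequest(
    model="claude-sonnet",
    model_settings={"cache_control": {"type": "ephemeral"}},
)
fallback = original.override(model="gpt-4o")

# The original is not mutated in place:
print(original.model)  # claude-sonnet
print(fallback.model)  # gpt-4o
print(fallback.model_settings)  # settings carry over unless also overridden
```

Note that `override(model=...)` alone only swaps the model; the settings question discussed below still has to be handled separately.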
Taking over the implementation

Thank you so much for the helpful review and for taking over the implementation!

@bart0401 I pushed this to avoid the mutation in place on the `model` parameter, not the `model_settings`. We'd need a different solution for model settings, since we can't assume it's safe to clear them.
But what if your fallback model is also an Anthropic model? You shouldn't be removing the `cache_control` settings from it.
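One way to address that concern is to drop provider-specific settings only when the fallback model belongs to a different provider. This is purely a sketch of the idea, not the PR's code; the provider-inference heuristic and the key mapping below are both hypothetical:

```python
from typing import Any

# Hypothetical mapping of setting keys that only one provider accepts.
PROVIDER_SPECIFIC_KEYS: dict[str, set[str]] = {
    "anthropic": {"cache_control"},
}


def infer_provider(model: str) -> str:
    # Hypothetical heuristic: infer the provider from the model name.
    return "anthropic" if model.startswith("claude") else "openai"


def settings_for_fallback(
    settings: dict[str, Any], current_model: str, fallback_model: str
) -> dict[str, Any]:
    """Keep provider-specific settings only if the fallback model
    uses the same provider as the current model."""
    current = infer_provider(current_model)
    fallback = infer_provider(fallback_model)
    if current == fallback:
        return dict(settings)
    drop = PROVIDER_SPECIFIC_KEYS.get(current, set())
    return {k: v for k, v in settings.items() if k not in drop}


settings = {"cache_control": {"type": "ephemeral"}, "temperature": 0.2}
# Same provider: cache_control is kept.
print(settings_for_fallback(settings, "claude-sonnet", "claude-haiku"))
# Different provider: cache_control is dropped, shared keys survive.
print(settings_for_fallback(settings, "claude-sonnet", "gpt-4o"))
```

This keeps shared settings like `temperature` intact in both cases, which blanket clearing would not.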
Description:
Clears `model_settings` when `ModelFallbackMiddleware` switches to a fallback model. Previously, provider-specific settings like Anthropic's `cache_control` would persist and cause errors when passed to other providers (e.g., OpenAI throwing `TypeError: unexpected keyword argument 'cache_control'`).

Issue: Fixes #33709
Dependencies: None
Testing:
Verifies that `model_settings` are cleared on fallback
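A test of the behavior described above might look roughly like this. All names here are hypothetical stand-ins; the real middleware API differs:

```python
from typing import Any


def apply_fallback(request: dict[str, Any], fallback_model: str) -> dict[str, Any]:
    # Hypothetical stand-in for the middleware's fallback step:
    # swap the model and clear provider-specific settings.
    return {**request, "model": fallback_model, "model_settings": {}}


request = {
    "model": "claude-sonnet",
    "model_settings": {"cache_control": {"type": "ephemeral"}},
}
result = apply_fallback(request, "gpt-4o")

assert result["model"] == "gpt-4o"
assert result["model_settings"] == {}
print("model_settings cleared on fallback")
```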