fix: watsonx and litellm parameter filtering (#187)
* fix: watsonx param filter
* fix: litellm model options filtering and tests
* fix: change conftest to skip instead of fail qual tests on github
* fix: remove comment
* fix: test defaults
* test: fixes to litellm test
* test: fix test defaults
Diff excerpt (LiteLLM backend parameter filtering):

```diff
+# Since LiteLLM has many different providers, we add some additional parameter logging here.
+# There are two sets of parameters we have to look at:
+# - unsupported_openai_params: standard OpenAI parameters that LiteLLM will automatically drop for us when `drop_params=True` if the provider doesn't support them.
+# - unknown_keys: parameters that LiteLLM doesn't know about, aren't standard OpenAI parameters, and might be used by the provider. We don't drop these.
 unknown_keys = []  # Keys that are unknown to litellm.
 unsupported_openai_params = []  # OpenAI params that are known to litellm but not supported for this model/provider.
 for key in backend_specific.keys():
     if key not in supported_params:
-        unsupported_openai_params.append(key)
-
-# if len(unknown_keys) > 0:
-#     FancyLogger.get_logger().warning(
-#         f"litellm allows for unknown / non-openai input params; mellea won't validate the following params that may cause issues: {', '.join(unknown_keys)}"
-#     )
+        if key in standard_openai_subset:
+            # LiteLLM is pretty confident that this standard OpenAI parameter won't work.
+            unsupported_openai_params.append(key)
+        else:
+            # LiteLLM doesn't make any claims about this parameter; we won't drop it but we will keep track of it.
+            unknown_keys.append(key)
+
+if len(unknown_keys) > 0:
+    FancyLogger.get_logger().warning(
+        f"litellm allows for unknown / non-openai input params; mellea won't validate the following params that may cause issues: {', '.join(unknown_keys)}"
+    )

 if len(unsupported_openai_params) > 0:
     FancyLogger.get_logger().warning(
-        f"litellm will automatically drop the following openai keys that aren't supported by the current model/provider: {', '.join(unsupported_openai_params)}"
+        f"litellm may drop the following openai keys that it doesn't seem to recognize as being supported by the current model/provider: {', '.join(unsupported_openai_params)}"
     )
```