
litellm.drop_params error when running the openai server #63

@masaruduy

Description


I'm running the server normally with: python -m routellm.openai_server --routers mf --weak-model ollama_chat/codeqwen
and I get the following error whenever I send a prompt:

File "C:\Users\nate\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\litellm\utils.py", line 3586, in get_optional_params
_check_valid_arg(supported_params=supported_params)
File "C:\Users\nate\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\litellm\utils.py", line 3060, in _check_valid_arg
raise UnsupportedParamsError(
litellm.exceptions.UnsupportedParamsError: litellm.UnsupportedParamsError: ollama_chat does not support parameters: {'presence_penalty': 0.0}, for model=codeqwen. To drop these, set litellm.drop_params=True or for proxy:

litellm_settings:
  drop_params: true

I've tried modifying the yaml to no avail.
Please help!
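
One note that may help while this is being looked at: the litellm_settings block quoted in the error message is the config format for the litellm proxy, so editing the RouteLLM config YAML may have no effect here. A minimal sketch of a workaround, assuming the server runs in the same process as a wrapper script so the module-level litellm flag applies to its completion calls (the script name and approach below are illustrative, not an official RouteLLM fix):

# run_routellm_with_drop_params.py -- hypothetical wrapper, not part of RouteLLM
import runpy
import sys

import litellm

# Per the UnsupportedParamsError hint: silently drop parameters the provider
# does not support (here, presence_penalty for ollama_chat/codeqwen).
litellm.drop_params = True

# Re-create the usual CLI invocation and run the server module in-process.
sys.argv = [
    "routellm.openai_server",
    "--routers", "mf",
    "--weak-model", "ollama_chat/codeqwen",
]
runpy.run_module("routellm.openai_server", run_name="__main__")

You would then run python run_routellm_with_drop_params.py instead of invoking the module directly; if the server spawns separate worker processes, the flag may not propagate and this sketch would not apply.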
