vLLM 0.11.1+ compatibility #107

@evaline-ju

Description

Some of the failing nightly builds show that vLLM 0.11.1+ changed imports in ways that will affect widening or upgrading the adapter's compatible vLLM range: https://github.com/foundation-model-stack/vllm-detector-adapter/actions/runs/20047013858/job/57494654334

There are at least a couple of vLLM PRs of note:

  • https://github.com/vllm-project/vllm/pull/27188 and https://github.com/vllm-project/vllm/pull/27567: the FlexibleArgumentParser import path changed and StoreBoolean was removed. The adapter currently uses both to parse environment variables. One potential workaround is copying the StoreBoolean class into the adapter, or refactoring the entire LocalEnvVarArgumentParser; see the sketch after this list.
  • https://github.com/vllm-project/vllm/pull/26427 removed model_config from the initialization of OpenAIServingModels, OpenAIServingChat, etc. This will affect the API server and many of this adapter's tests; see the second sketch below.
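
For the first item, one way to avoid pinning is to vendor a StoreBoolean-style action and guard the FlexibleArgumentParser import. A minimal sketch: the new module path below is an assumption based on the PRs above, and the vendored action only approximates the removed vLLM class, so both should be checked against the vLLM source.

```python
import argparse

# Guarded import: the post-0.11.1 path is an assumption based on the PRs above;
# fall back to the pre-0.11.1 location if it is not present.
try:
    from vllm.utils.argparse_utils import FlexibleArgumentParser
except ImportError:
    from vllm.utils import FlexibleArgumentParser


class StoreBoolean(argparse.Action):
    """Vendored stand-in for the StoreBoolean action removed from vLLM.

    Parses a "true"/"false" string argument and stores the matching bool,
    approximating the removed class's behavior.
    """

    def __call__(self, parser, namespace, values, option_string=None):
        value = str(values).lower()
        if value == "true":
            setattr(namespace, self.dest, True)
        elif value == "false":
            setattr(namespace, self.dest, False)
        else:
            raise argparse.ArgumentError(
                self, f"invalid boolean value {values!r}; expected 'true' or 'false'"
            )
```

With something like this vendored in, LocalEnvVarArgumentParser could keep registering boolean options with action=StoreBoolean regardless of which vLLM version is installed.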

There may be enough breaking changes here to warrant a minor version bump of the adapter with non-backwards-compatible changes to support the latest vLLM.
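
If the choice is instead to bridge both ranges for a transition period, the constructor change in the second item could be handled by inspecting the installed signature at runtime. A rough sketch, not a definitive fix: the positional argument order and the base_model_paths name are assumptions that should be verified against both vLLM versions, and the same pattern would have to be repeated for OpenAIServingChat and the other OpenAIServing* classes the adapter constructs.

```python
import inspect

from vllm.entrypoints.openai.serving_models import OpenAIServingModels


def build_serving_models(engine_client, model_config, base_model_paths, **kwargs):
    """Construct OpenAIServingModels on either side of the model_config removal.

    Older vLLM accepts model_config explicitly; newer vLLM derives it from the
    engine client, so passing it there would raise a TypeError.
    """
    params = inspect.signature(OpenAIServingModels.__init__).parameters
    if "model_config" in params:
        # Pre-change signature: model_config is still a constructor argument.
        return OpenAIServingModels(engine_client, model_config, base_model_paths, **kwargs)
    # Post-change signature: model_config is no longer accepted.
    return OpenAIServingModels(engine_client, base_model_paths, **kwargs)
```

Whether carrying a shim like this is worth it versus simply raising the adapter's vLLM floor is part of the versioning question above.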
