Some of the failing nightly builds show that imports have changed in vLLM 0.11.1+, which will affect widening or upgrading the compatible vLLM range for this adapter: https://github.com/foundation-model-stack/vllm-detector-adapter/actions/runs/20047013858/job/57494654334
There are at least a couple of vLLM PRs of note:
- https://github.com/vllm-project/vllm/pull/27188 and https://github.com/vllm-project/vllm/pull/27567: the `FlexibleArgumentParser` path has changed and `StoreBoolean` was removed. The adapter currently uses those to parse environment variables. One potential workaround is copying the `StoreBoolean` class into the adapter (see the sketch after this list), or refactoring the entire `LocalEnvVarArgumentParser`.
- https://github.com/vllm-project/vllm/pull/26427 removed `model_config` from the initialization of `OpenAIServingModels`, `OpenAIServingChat`, etc. This will affect the API server and many of the tests in this adapter (see the compatibility sketch further below).
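
For the copy-in workaround, something like the following could live in the adapter. This is only a hedged sketch of a `StoreBoolean`-style `argparse.Action`, not the exact class removed upstream, so it should be compared against the last vLLM release that still ships it before adopting it:

```python
import argparse


class StoreBoolean(argparse.Action):
    """Minimal stand-in for the StoreBoolean action removed from vLLM.

    Sketch only: accepts "true"/"false"-style strings and stores a bool.
    The exact upstream behavior may differ slightly.
    """

    def __call__(self, parser, namespace, values, option_string=None):
        if isinstance(values, bool):
            setattr(namespace, self.dest, values)
            return
        value = str(values).strip().lower()
        if value in ("true", "1", "yes"):
            setattr(namespace, self.dest, True)
        elif value in ("false", "0", "no"):
            setattr(namespace, self.dest, False)
        else:
            parser.error(f"expected a boolean for {option_string}, got {values!r}")


# Example registration, e.g. inside LocalEnvVarArgumentParser setup:
# parser.add_argument("--enable-foo", action=StoreBoolean, default=False)
```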
There may be enough breaking changes here to warrant a minor version bump of the adapter and non-backwards-compatible changes to support the latest vLLM.
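
If we instead want to straddle both sides of https://github.com/vllm-project/vllm/pull/26427 for a while, one option is to gate on the constructor signature rather than on the vLLM version string. The snippet below is only a sketch: `OpenAIServingModels` is the real class, but the argument names and order shown are assumptions that would need to be checked against the pinned vLLM versions.

```python
import inspect

from vllm.entrypoints.openai.serving_models import OpenAIServingModels


def build_serving_models(engine_client, model_config, base_model_paths, **kwargs):
    """Construct OpenAIServingModels across the model_config signature change.

    Sketch only: pre-#26427 vLLM accepted a model_config argument, newer
    versions derive it internally. Argument names/order here are assumptions.
    """
    params = inspect.signature(OpenAIServingModels.__init__).parameters
    if "model_config" in params:
        # Older vLLM (pre-#26427): pass model_config explicitly.
        return OpenAIServingModels(
            engine_client, model_config, base_model_paths, **kwargs
        )
    # Newer vLLM (post-#26427): model_config is no longer accepted.
    return OpenAIServingModels(engine_client, base_model_paths, **kwargs)
```

A similar signature check (or a single version cutover) would be needed wherever the adapter and its tests construct `OpenAIServingChat` and friends.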