[Bug]: LoRA weights not taking effect #4211

@VictoryDong

Description

Your current environment

Image version: v0.11.0rc0

🐛 Describe the bug

Startup command:

vllm serve /path/to/Qwen3-4B-Instruct-2507 \
  --served-model-name Qwen3-4B-Instruct-2507 \
  --port 40900 --gpu-memory-utilization 0.85 \
  --enable-lora \
  --lora-modules '{"name":"lora1", "path":"/path/to/lora1", "base_model_name":"Qwen3-4B-Instruct-2507"}'

When requesting lora1, the output is identical to the base model's output. Others online have reported the same problem, but their issues were closed without ever being resolved.
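
One way to reproduce the comparison is to hit the OpenAI-compatible endpoint and request the base model and the adapter with the same prompt. This is a minimal sketch, assuming the server from the startup command above is reachable at http://localhost:40900 and that the adapter was registered under the name lora1 (the prompt text is just a placeholder):

```python
from openai import OpenAI

# Point the client at the locally running vLLM server (port from the startup command).
client = OpenAI(base_url="http://localhost:40900/v1", api_key="EMPTY")

# The registered LoRA adapter should appear alongside the base model here.
print([m.id for m in client.models.list().data])

messages = [{"role": "user", "content": "Reply in the style the adapter was fine-tuned for."}]

# Same prompt against the base model and the LoRA adapter.
base = client.chat.completions.create(model="Qwen3-4B-Instruct-2507", messages=messages)
lora = client.chat.completions.create(model="lora1", messages=messages)

print("base :", base.choices[0].message.content)
print("lora1:", lora.choices[0].message.content)
# Identical outputs suggest the LoRA weights are not being applied.
```

With this setup, lora1 is listed by /v1/models, yet both requests return the base model's answer.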
