Your current environment
vllm: main
vllm-ascend: main
model: Qwen2.5-VL-7B-Instruct
🐛 Describe the bug
```
(APIServer pid=15833) Traceback (most recent call last):
ValueError: Unknown attention backend: 'ASCEND'. Valid options are: FLASH_ATTN, TRITON_ATTN, XFORMERS, ROCM_ATTN, ROCM_AITER_MLA, ROCM_AITER_FA, TORCH_SDPA, FLASHINFER, FLASHINFER_MLA, TRITON_MLA, CUTLASS_MLA, FLASHMLA, FLASHMLA_SPARSE, FLASH_ATTN_MLA, PALLAS, IPEX, NO_ATTENTION, FLEX_ATTENTION, TREE_ATTN, ROCM_AITER_UNIFIED_ATTN, CPU_ATTN, TORCH_SDPA
```