Failed to load model even with eager attention: FlashAttention2 has been toggled on, but it cannot be used #66

Description

@eximius313

I have a fresh install of ComfyUI and installed ComfyUI-VibeVoice according to the provided instructions.

When I hit Run, I get the following error:

Failed to load model even with eager attention: FlashAttention2 has been toggled on, but it cannot be used due to the following error: the package flash_attn seems to be not installed. Please refer to the documentation of https://huggingface.co/docs/transformers/perf_infer_gpu_one#flashattention-2 to install Flash Attention 2.

Shouldn't flash-attn be installed automatically as a dependency of ComfyUI-VibeVoice?
It might be worth adding it to the requirements to avoid confusing new users.
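Until the dependency situation is sorted out, the node could also degrade gracefully instead of failing outright. A minimal sketch of that idea (the helper name is hypothetical and not part of the extension's actual code; it assumes the loader passes transformers' `attn_implementation` argument):

```python
import importlib.util

def pick_attn_implementation() -> str:
    """Hypothetical helper: choose an attention backend transformers can use.

    Returns "flash_attention_2" only when the flash_attn package is
    importable; otherwise falls back to "eager" so model loading does
    not fail on machines without flash-attn installed.
    """
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "eager"

# The result could then be passed through to from_pretrained, e.g.:
# model = AutoModelForCausalLM.from_pretrained(
#     model_path, attn_implementation=pick_attn_implementation()
# )
```

Alternatively, installing flash-attn directly (`pip install flash-attn --no-build-isolation`, per the flash-attn README) should make the original error go away on supported GPUs.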
