FA3 attention backend import error when FA3 is not available #592

@DefTruth

Description

WARNING 12-19 11:47:57 [_attention_dispatch.py:454] Re-registered SAGE attention backend to enable context parallelism with FP8 Attention in cache-dit. You can disable this behavior by: export CACHE_DIT_ENABLE_CUSTOM_ATTN_DISPATCH=0.
INFO 12-19 11:47:57 [_attention_dispatch.py:612] Flash Attention 3 not available, skipping _FLASH_3 backend registration.
[rank0]: Traceback (most recent call last):
[rank0]:   File "/workspace/dev/vipshop/cache-dit/src/cache_dit/parallelism/transformers/native_diffusers/context_parallelism/__init__.py", line 22, in <module>
[rank0]:     _maybe_register_custom_attn_backends()
[rank0]:   File "/workspace/dev/vipshop/cache-dit/src/cache_dit/parallelism/transformers/native_diffusers/context_parallelism/attention/__init__.py", line 14, in _maybe_register_custom_attn_backends
[rank0]:     from ._attention_dispatch import (
[rank0]: ImportError: cannot import name '_flash_attention_3' from 'cache_dit.parallelism.transformers.native_diffusers.context_parallelism.attention._attention_dispatch' (/workspace/dev/vipshop/cache-dit/src/cache_dit/parallelism/transformers/native_diffusers/context_parallelism/attention/_attention_dispatch.py). Did you mean: '_sage_attention'?

[rank0]: During handling of the above exception, another exception occurred:
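
The INFO line shows FA3 backend registration being skipped, yet the traceback still fails on an unconditional import of `_flash_attention_3` inside `_maybe_register_custom_attn_backends`. Below is a minimal sketch of a guarded import that would tolerate the missing FA3 symbol; the module path and the names `_sage_attention` / `_flash_attention_3` are taken from the traceback, while the try/except structure is a suggested fix, not the project's actual code:

```python
# Sketch of attention/__init__.py with the FA3 import made optional.
def _maybe_register_custom_attn_backends():
    # Always present: the SAGE backend re-registered in the WARNING above.
    from ._attention_dispatch import _sage_attention  # noqa: F401

    try:
        # _flash_attention_3 is only defined when Flash Attention 3 is
        # installed; importing it unconditionally is what raises the
        # ImportError shown in the traceback.
        from ._attention_dispatch import _flash_attention_3  # noqa: F401
    except ImportError:
        # Mirrors the INFO log: FA3 unavailable, skip this backend.
        pass
```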
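
As an interim workaround, the WARNING above names `CACHE_DIT_ENABLE_CUSTOM_ATTN_DISPATCH=0` to disable the custom attention dispatch. A sketch of setting it before import follows, on the untested assumption that the variable is checked before the failing import runs:

```python
# Hypothetical workaround: disable custom attention dispatch via the env
# var named in the WARNING. Must be set before cache_dit is imported,
# since registration runs at import time (see the traceback's
# context_parallelism/__init__.py frame).
import os

os.environ["CACHE_DIT_ENABLE_CUSTOM_ATTN_DISPATCH"] = "0"

import cache_dit  # noqa: E402
```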

Metadata

Assignees: none
Labels: bug (Something isn't working)
Type: no type
Projects: no projects
Milestone: no milestone
Relationships: none yet
Development: no branches or pull requests
