
Commit e35dd67

entrpny and yiyixuxu authored
Update src/diffusers/models/attention_processor.py
Co-authored-by: YiYi Xu <[email protected]>
1 parent 2d7c198 commit e35dd67

File tree

1 file changed: +1 −1 lines changed
src/diffusers/models/attention_processor.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -316,7 +316,7 @@ def set_use_xla_flash_attention(
         elif is_spmd() and is_torch_xla_version("<", "2.4"):
             raise "flash attention pallas kernel using SPMD is supported from torch_xla version 2.4"
         else:
-            if len(kwargs) > 0 and kwargs.get("is_flux", None):
+            if is_flux:
                 processor = XLAFluxFlashAttnProcessor2_0(partition_spec)
             else:
                 processor = XLAFlashAttnProcessor2_0(partition_spec)
```
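The change replaces a lookup of `is_flux` inside `**kwargs` with a direct reference to an explicit parameter. A minimal sketch of the before/after dispatch pattern, using simplified stand-in classes rather than the actual diffusers processors:

```python
# Hedged sketch: stand-in classes illustrating the dispatch pattern
# touched by this commit. These are NOT the real diffusers classes.

class XLAFlashAttnProcessor:
    """Placeholder for the generic XLA flash-attention processor."""
    def __init__(self, partition_spec):
        self.partition_spec = partition_spec

class XLAFluxFlashAttnProcessor(XLAFlashAttnProcessor):
    """Placeholder for the Flux-specific processor variant."""
    pass

def select_processor_old(partition_spec, **kwargs):
    # Before the commit: the flag was fished out of **kwargs, hiding
    # it from the function signature and from static analysis.
    if len(kwargs) > 0 and kwargs.get("is_flux", None):
        return XLAFluxFlashAttnProcessor(partition_spec)
    return XLAFlashAttnProcessor(partition_spec)

def select_processor_new(partition_spec, is_flux=False):
    # After the commit: an explicit keyword argument makes the
    # dispatch self-documenting, and the condition is a plain boolean.
    if is_flux:
        return XLAFluxFlashAttnProcessor(partition_spec)
    return XLAFlashAttnProcessor(partition_spec)
```

Both versions select the Flux variant only when the flag is truthy; the new form simply surfaces the flag as a named parameter.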

0 commit comments