
Commit fedfdd4

Author: J石页
Commit message: NPU Adaption for Sanna
1 parent: a456fb1

File tree

1 file changed: +0 -6 lines

src/diffusers/models/attention_processor.py

Lines changed: 0 additions & 6 deletions
@@ -3165,12 +3165,6 @@ def __call__(
         else:
             attention_mask = attention_mask.bool()
 
-        if attention_mask.dtype != torch.uint8:
-            if attention_mask.dtype == torch.bool:
-                attention_mask = torch.logical_not(attention_mask.bool())
-            else:
-                attention_mask = attention_mask.to(torch.uint8)
-
         if attn.group_norm is not None:
             hidden_states = attn.group_norm(hidden_states.transpose(1, 2)).transpose(1, 2)
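
For context, here is a minimal Python sketch of the mask handling around this hunk, reconstructed only from the lines visible in the diff. The helper name and the example tensor are illustrative, not part of attention_processor.py; the sketch just contrasts the conversion that remains (casting the mask to bool) with the uint8/inversion step this commit removes.

import torch

def convert_mask_removed_by_commit(attention_mask: torch.Tensor) -> torch.Tensor:
    """Reproduces the six deleted lines as a standalone helper (name is illustrative)."""
    if attention_mask.dtype != torch.uint8:
        if attention_mask.dtype == torch.bool:
            # The deleted code logically inverted a bool mask (True now marks the
            # opposite positions); the result stays a bool tensor.
            attention_mask = torch.logical_not(attention_mask.bool())
        else:
            # Any other dtype was cast to uint8.
            attention_mask = attention_mask.to(torch.uint8)
    return attention_mask

# After this commit, only the earlier conversion shown in the context lines remains,
# so the mask is kept as a plain bool tensor.
mask = torch.tensor([[True, True, False]])          # illustrative example mask
print(mask.bool().dtype)                             # torch.bool (behavior that remains)
print(convert_mask_removed_by_commit(mask))          # inverted bool mask (behavior removed)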
