Commit 8f18aae

Author: J石页
Commit message: NPU Adaption for Sanna
1 parent 70cf529 · commit 8f18aae

File tree

1 file changed: 0 additions, 5 deletions

src/diffusers/models/attention_processor.py

Lines changed: 0 additions & 5 deletions
@@ -527,11 +527,6 @@ def set_processor(self, processor: "AttnProcessor") -> None:
             processor (`AttnProcessor`):
                 The attention processor to use.
         """
-        # Set AttnProcessor to NPU if available
-        if is_torch_npu_available():
-            if isinstance(processor, AttnProcessor2_0):
-                processor = AttnProcessorNPU()
-
         # if current processor is in `self._modules` and if passed `processor` is not, we need to
         # pop `processor` from `self._modules`
         if (
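
With these lines removed, set_processor() no longer silently swaps in AttnProcessorNPU when an Ascend NPU is detected; callers who want the NPU fused-attention path would opt in explicitly. Below is a minimal sketch of such an opt-in, assuming a diffusers model that exposes set_attn_processor() (the helper name enable_npu_attention is hypothetical and not part of this commit):

    # Minimal sketch (not part of this commit): explicitly selecting the NPU
    # attention processor instead of relying on automatic switching.
    from diffusers.models.attention_processor import AttnProcessorNPU
    from diffusers.utils import is_torch_npu_available


    def enable_npu_attention(model) -> None:
        # `model` is assumed to be a diffusers model (e.g. a UNet or DiT-style
        # transformer) that exposes set_attn_processor(); this helper is hypothetical.
        if is_torch_npu_available():
            # Route all attention modules through the NPU fused-attention processor.
            model.set_attn_processor(AttnProcessorNPU())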
