Commit 82aeca0

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
1 parent b704e6c commit 82aeca0

3 files changed: 4 additions, 2 deletions

colossalai/pipeline/schedule/interleaved_pp.py

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@
 from torch.nn import Module, ModuleList
 from torch.utils._pytree import tree_map
 
-from colossalai.accelerator import get_accelerator, BaseAccelerator
+from colossalai.accelerator import BaseAccelerator, get_accelerator
 from colossalai.interface import OptimizerWrapper
 from colossalai.pipeline.p2p import PipelineP2PCommunication, create_send_metadata
 from colossalai.pipeline.stage_manager import PipelineStageManager
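As an aside, the hunk above only reorders the two imported names. A minimal illustration (not part of the commit) of why the hook settles on this order: `BaseAccelerator` sorts before `get_accelerator` under plain lexicographic sorting, whether case-sensitive (ASCII `B` < `g`) or case-insensitive (`b` < `g`):

```python
# Hypothetical illustration: reproduce the committed name order with sorted().
names = ["get_accelerator", "BaseAccelerator"]
ordered = ", ".join(sorted(names))
print(ordered)  # BaseAccelerator, get_accelerator
```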

colossalai/shardformer/layer/normalization.py

Lines changed: 1 addition & 0 deletions

@@ -17,6 +17,7 @@
 SUPPORT_NPU = False
 try:
     import torch_npu
+
     SUPPORT_NPU = True
 except Exception:
     pass
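The hunk above touches a common optional-dependency guard: try the import, and flip a feature flag only if it succeeds. A self-contained sketch of the same pattern, using a hypothetical `probe_import` helper that is not part of ColossalAI:

```python
def probe_import(name: str) -> bool:
    """Return True if module `name` imports cleanly, else False."""
    try:
        __import__(name)
        return True
    except Exception:
        return False

# stdlib module: importable on any Python installation
SUPPORT_JSON = probe_import("json")
# torch_npu: importable only where the Ascend NPU stack is installed
SUPPORT_NPU = probe_import("torch_npu")
```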

extensions/pybind/flash_attention/flash_attention_npu.py

Lines changed: 2 additions & 1 deletion

@@ -1,6 +1,7 @@
-from ...base_extension import _Extension
 import math
 
+from ...base_extension import _Extension
+
 
 class FlashAttentionNpuExtension(_Extension):
     def __init__(self):
