
Commit dee2225

Apply suggestions from code review

1 parent 6f04f9c commit dee2225

File tree

1 file changed: +1 -0 lines changed

  • src/lightning/pytorch/plugins/precision/fsdp.py

src/lightning/pytorch/plugins/precision/fsdp.py

Lines changed: 1 addition & 0 deletions

@@ -86,6 +86,7 @@ def clip_grad_by_norm(self, module: Optional[Module], optimizer: Optimizer, clip
         # see https://pytorch.org/docs/stable/fsdp.html#torch.distributed.fsdp.FullyShardedDataParallel.clip_grad_norm_
         if module is None:
             return
+        assert isinstance(module.clip_grad_norm_, Module)
         module.clip_grad_norm_(clip_val)

     @property
