Commit 0df38f5

Guard for typing
1 parent bce69ca commit 0df38f5

File tree

1 file changed: +2 additions, -0 deletions
  • src/lightning/pytorch/plugins/precision/fsdp.py
src/lightning/pytorch/plugins/precision/fsdp.py

Lines changed: 2 additions & 0 deletions

```diff
@@ -84,6 +84,8 @@ def convert_module(self, module: Module) -> Module:
     @override
     def clip_grad_by_norm(self, module: Optional[Module], optimizer: Optimizer, clip_val: Union[int, float]) -> None:
         # see https://pytorch.org/docs/stable/fsdp.html#torch.distributed.fsdp.FullyShardedDataParallel.clip_grad_norm_
+        if module is None or not hasattr(module, "clip_grad_norm_") or not isinstance(module.clip_grad_norm_, Callable):
+            return
         module.clip_grad_norm_(clip_val)

     @property
```
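The added guard returns early when there is no module to delegate to, or when the module does not expose a callable `clip_grad_norm_` (only FSDP-wrapped modules do), which satisfies the type checker and avoids an `AttributeError` at runtime. A minimal, self-contained sketch of that pattern, using hypothetical stand-in classes instead of `torch.nn.Module` and a real FSDP wrapper:

```python
# Simplified sketch of the guard added in this commit. PlainModule and
# FSDPModule are hypothetical stand-ins; the real code operates on
# torch.nn.Module and an FSDP-wrapped module exposing clip_grad_norm_.
from typing import Callable, Optional, Union


class PlainModule:
    """Stand-in for a module that does NOT expose clip_grad_norm_."""


class FSDPModule:
    """Stand-in for an FSDP-wrapped module that DOES expose clip_grad_norm_."""

    def __init__(self) -> None:
        self.clipped_with: Optional[float] = None

    def clip_grad_norm_(self, clip_val: Union[int, float]) -> None:
        # The real FSDP method clips gradients collectively across shards;
        # here we just record the value for demonstration.
        self.clipped_with = float(clip_val)


def clip_grad_by_norm(module: Optional[object], clip_val: Union[int, float]) -> None:
    # The guard: bail out when there is no module, or when the module
    # lacks a callable clip_grad_norm_. isinstance(..., Callable) is a
    # runtime-checkable special case of typing.Callable.
    if module is None or not hasattr(module, "clip_grad_norm_") or not isinstance(module.clip_grad_norm_, Callable):
        return
    module.clip_grad_norm_(clip_val)


clip_grad_by_norm(None, 1.0)           # no-op: no module
clip_grad_by_norm(PlainModule(), 1.0)  # no-op: no clip_grad_norm_ attribute
fsdp = FSDPModule()
clip_grad_by_norm(fsdp, 2)
print(fsdp.clipped_with)               # → 2.0
```

Without the guard, passing a plain (non-FSDP) module or `None` would raise at the `module.clip_grad_norm_(clip_val)` call; with it, clipping silently becomes a no-op in those cases.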

0 commit comments
