Commit 2f62a0a

Simplify guard
1 parent ed2fe05 commit 2f62a0a

File tree (1 file changed: 1 addition, 1 deletion)

  • src/lightning/pytorch/plugins/precision/fsdp.py

src/lightning/pytorch/plugins/precision/fsdp.py (1 addition, 1 deletion)

@@ -84,7 +84,7 @@ def convert_module(self, module: Module) -> Module:

     @override
     def clip_grad_by_norm(self, module: Optional[Module], optimizer: Optimizer, clip_val: Union[int, float]) -> None:
         # see https://pytorch.org/docs/stable/fsdp.html#torch.distributed.fsdp.FullyShardedDataParallel.clip_grad_norm_
-        if module is None or not hasattr(module, "clip_grad_norm_") or not isinstance(module.clip_grad_norm_, callable):
+        if module is None:
             return
         module.clip_grad_norm_(clip_val)
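A minimal sketch of why the longer guard could be dropped (the class names here are hypothetical stand-ins, not Lightning's actual code): `callable` is a builtin function, not a class, so `isinstance(x, callable)` always raises `TypeError` — the removed checks could never have executed successfully. The simplified guard keeps only the `None` check and delegates clipping to the FSDP-wrapped module's `clip_grad_norm_` method.

```python
class FakeFSDPModule:
    """Stand-in for an FSDP-wrapped module that exposes clip_grad_norm_."""
    def __init__(self):
        self.clipped_with = None

    def clip_grad_norm_(self, clip_val):
        # Record the clip value instead of doing real gradient clipping.
        self.clipped_with = clip_val


def old_guard_raises(module):
    """Reproduce the removed check; True if it raises TypeError."""
    try:
        isinstance(module.clip_grad_norm_, callable)
        return False
    except TypeError:
        # isinstance() requires a type (or tuple of types) as its second
        # argument; the builtin function `callable` is neither.
        return True


def clip_grad_by_norm(module, clip_val):
    """The simplified guard from the commit: skip only when module is None."""
    if module is None:
        return
    module.clip_grad_norm_(clip_val)


m = FakeFSDPModule()
clip_grad_by_norm(m, 2.0)     # delegates to the module
clip_grad_by_norm(None, 2.0)  # safely a no-op
```

Dropping the `hasattr`/`isinstance` checks also matches the precondition of the plugin: by the time gradients are clipped, the module is expected to be FSDP-wrapped and to provide `clip_grad_norm_`, so guarding against `None` is sufficient.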
