Commit eea0a94

Update fsdp.py

1 parent 161241e

File tree: src/lightning/pytorch/plugins/precision

1 file changed: +1 −1 lines changed
src/lightning/pytorch/plugins/precision/fsdp.py

Lines changed: 1 addition & 1 deletion
@@ -82,7 +82,7 @@ def convert_module(self, module: Module) -> Module:
         return module
 
     @override
-    def clip_grad_by_norm(self, module: Optional[Module], optimizer: Optimizer, clip_val: Union[int, float]) -> None:
+    def clip_grad_by_norm(self, optimizer: Optimizer, clip_val: Union[int, float], module: Optional[Module] = None) -> None:
         # see https://pytorch.org/docs/stable/fsdp.html#torch.distributed.fsdp.FullyShardedDataParallel.clip_grad_norm_
         if module is None:
             return
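The change moves `module` to the end of the parameter list and gives it a default of `None`, so callers that pass only `(optimizer, clip_val)` positionally keep working. Below is a minimal standalone sketch of that call pattern; `FSDPPrecisionSketch` is a hypothetical stand-in for the Lightning plugin class, not the actual implementation:

```python
from typing import Optional, Union


class FSDPPrecisionSketch:
    """Hypothetical stand-in illustrating the reordered signature."""

    def clip_grad_by_norm(
        self,
        optimizer,  # optimizer comes first, as in the new signature
        clip_val: Union[int, float],
        module=None,  # `module` is now last and optional
    ) -> None:
        # Mirrors the diff: with no FSDP-wrapped module there is
        # nothing to clip, so return early.
        if module is None:
            return
        # FSDP exposes clip_grad_norm_ on the wrapped module; see the
        # URL referenced in the diff comment above.
        module.clip_grad_norm_(clip_val)


plugin = FSDPPrecisionSketch()
# Positional callers that predate the `module` parameter still work:
plugin.clip_grad_by_norm(None, 1.0)
```

With the old ordering, `module` was a required first argument, so a two-argument positional call like the one above would have bound the optimizer to `module`; the reordering avoids that.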

0 commit comments
