Commit 9d92a4a

Fix LR scheduler behaviour with AMP
1 parent a8605b4 commit 9d92a4a

File tree

1 file changed: +4 −0 lines changed


src/pytorch_lightning/core/module.py: 4 additions & 0 deletions
```diff
@@ -1649,6 +1649,10 @@ def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
                 scheduler.step(epoch=self.current_epoch)

         """
+        optimizer = self.trainer.optimizers[optimizer_idx]
+        if hasattr(optimizer, '_step_count') and optimizer._step_count <= 0:
+            return
+
         if metric is None:
             scheduler.step()  # type: ignore[call-arg]
         else:
```
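Context for the guard: under AMP, `torch.cuda.amp.GradScaler` skips `optimizer.step()` whenever gradients contain infs/NaNs, so the LR scheduler could otherwise step before the optimizer ever has (PyTorch's optimizers track this in `_step_count` and warn about that ordering). The added lines skip the scheduler step until the optimizer has stepped at least once. A minimal, torch-free sketch of that guard logic (the `DummyOptimizer`/`DummyScheduler` classes are illustrative stand-ins, not Lightning or PyTorch API):

```python
class DummyOptimizer:
    """Mimics torch.optim.Optimizer's internal _step_count bookkeeping."""

    def __init__(self):
        self._step_count = 0

    def step(self):
        self._step_count += 1


class DummyScheduler:
    """Counts how many times the LR scheduler has advanced."""

    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


def lr_scheduler_step(scheduler, optimizer):
    # The guard from the commit: do nothing until the optimizer has
    # actually stepped (AMP may skip the first optimizer steps).
    if hasattr(optimizer, "_step_count") and optimizer._step_count <= 0:
        return
    scheduler.step()


opt, sched = DummyOptimizer(), DummyScheduler()
lr_scheduler_step(sched, opt)  # optimizer never stepped -> guard makes this a no-op
opt.step()                     # e.g. the first non-skipped AMP step
lr_scheduler_step(sched, opt)  # now the scheduler advances
print(opt._step_count, sched.steps)  # -> 1 1
```

The `hasattr` check keeps the guard safe for custom optimizers that do not expose `_step_count`; for those, the scheduler steps as before.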
