
Commit e4750b3

Fix LR scheduler behaviour with AMP
1 parent 192e5c5 commit e4750b3

File tree

1 file changed: 4 additions, 0 deletions


src/pytorch_lightning/core/module.py

Lines changed: 4 additions & 0 deletions
@@ -1621,6 +1621,10 @@ def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
                 scheduler.step(epoch=self.current_epoch)
 
         """
+        optimizer = self.trainer.optimizers[optimizer_idx]
+        if hasattr(optimizer, '_step_count') and optimizer._step_count <= 0:
+            return
+
         if metric is None:
             scheduler.step()  # type: ignore[call-arg]
         else:
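
Why the guard matters: once an LR scheduler is attached, PyTorch wraps optimizer.step() to track optimizer._step_count, and under AMP torch.cuda.amp.GradScaler.step() skips the real optimizer.step() while gradients contain infs/NaNs. The counter can therefore still be 0 when the scheduler would otherwise step, which triggers PyTorch's "Detected call of lr_scheduler.step() before optimizer.step()" warning. Below is a minimal sketch of the condition the fix checks; the toy model and loop are illustrative assumptions, not code from this commit:

# Minimal sketch (illustrative, not part of this commit): show that
# optimizer._step_count stays 0 until a real optimizer.step() runs,
# and guard the scheduler step the same way the committed fix does.
import torch

model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)

print(optimizer._step_count)  # 0: no real optimizer step has happened yet

# Same guard as the fix: skip the scheduler until the optimizer has
# actually stepped at least once (e.g. while AMP keeps skipping steps).
if getattr(optimizer, "_step_count", 1) > 0:
    scheduler.step()

model(torch.randn(4, 2)).sum().backward()
optimizer.step()              # the wrapped step increments the counter
print(optimizer._step_count)  # 1: safe to step the scheduler now
scheduler.step()

The getattr default of 1 plays the same role as the hasattr check in the diff: an optimizer without the attribute (no scheduler attached, or a custom optimizer) keeps its previous behaviour and is never skipped.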
