Commit f763087

Fix LR scheduler behaviour with AMP
1 parent f9ae89f commit f763087

File tree

1 file changed: +4 -0 lines changed


src/pytorch_lightning/core/module.py

Lines changed: 4 additions & 0 deletions
@@ -1585,6 +1585,10 @@ def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
                 scheduler.step(epoch=self.current_epoch)
 
         """
+        optimizer = self.trainer.optimizers[optimizer_idx]
+        if hasattr(optimizer, '_step_count') and optimizer._step_count <= 0:
+            return
+
         if metric is None:
             scheduler.step()  # type: ignore[call-arg]
         else:
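
Note on the change: with AMP, torch.cuda.amp.GradScaler can skip optimizer.step() entirely when it finds inf/NaN gradients while calibrating the loss scale. PyTorch's LR schedulers wrap optimizer.step with a counter, so optimizer._step_count stays at 0 until a real optimizer step has run; stepping the scheduler before that point advances the LR schedule prematurely and triggers PyTorch's "lr_scheduler.step() before optimizer.step()" warning. The added guard returns early in that case. Below is a minimal sketch of the same guard in a plain PyTorch loop; the model, optimizer, scheduler, and `loader` are illustrative placeholders, not part of the commit:

import torch
from torch.cuda.amp import GradScaler, autocast

model = torch.nn.Linear(10, 1).cuda()                     # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)
scaler = GradScaler()

for inputs, targets in loader:                            # `loader` assumed to exist
    optimizer.zero_grad()
    with autocast():
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
    scaler.scale(loss).backward()
    scaler.step(optimizer)    # may skip optimizer.step() if grads had inf/NaN
    scaler.update()

    # Same idea as the commit: only step the scheduler once the optimizer
    # has actually stepped at least once. _step_count is maintained by
    # torch.optim.lr_scheduler, which wraps optimizer.step with a counter.
    if getattr(optimizer, "_step_count", 1) > 0:
        scheduler.step()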
