Commit cae8ed5

Merge branch 'master' into auto-fix-rules
2 parents 7f28527 + f58a176 · commit cae8ed5

2 files changed: +10 -3 lines


docs/source-pytorch/cli/lightning_cli_intermediate_2.rst

Lines changed: 3 additions & 3 deletions
@@ -201,9 +201,10 @@ If the scheduler you want needs other arguments, add them via the CLI (no need t
 
 .. code:: bash
 
-    python main.py fit --optimizer=Adam --lr_scheduler=ReduceLROnPlateau --lr_scheduler.monitor=epoch
+    python main.py fit --optimizer=Adam --lr_scheduler=ReduceLROnPlateau --lr_scheduler.monitor=train_loss
 
-Furthermore, any custom subclass of ``torch.optim.lr_scheduler.LRScheduler`` can be used as learning rate scheduler:
+(assuming you have a ``train_loss`` metric logged). Furthermore, any custom subclass of
+``torch.optim.lr_scheduler.LRScheduler`` can be used as learning rate scheduler:
 
 .. code:: python
 
@@ -212,7 +213,6 @@ Furthermore, any custom subclass of ``torch.optim.lr_scheduler.LRScheduler`` can
     from lightning.pytorch.cli import LightningCLI
     from lightning.pytorch.demos.boring_classes import DemoModel, BoringDataModule
 
-
     class LitLRScheduler(torch.optim.lr_scheduler.CosineAnnealingLR):
         def step(self):
             print("", "using LitLRScheduler", "")
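For context on the docs change: ``--lr_scheduler.monitor`` must name a metric that the model actually logs, which is why the example now notes the ``train_loss`` assumption. A minimal sketch of a module that satisfies it (this module is illustrative and not part of the commit):

    import torch
    import lightning.pytorch as pl


    class TinyModel(pl.LightningModule):
        """Illustrative module that logs a ``train_loss`` metric."""

        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)

        def training_step(self, batch, batch_idx):
            loss = self.layer(batch).sum()
            # This logged name is what --lr_scheduler.monitor=train_loss refers to.
            self.log("train_loss", loss)
            return loss

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters())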

src/lightning/pytorch/cli.py

Lines changed: 7 additions & 0 deletions
@@ -66,6 +66,13 @@
 
 
 class ReduceLROnPlateau(torch.optim.lr_scheduler.ReduceLROnPlateau):
+    """Custom ReduceLROnPlateau scheduler that extends PyTorch's ReduceLROnPlateau.
+
+    This class adds a `monitor` attribute to the standard PyTorch ReduceLROnPlateau to specify which metric should be
+    tracked for learning rate adjustment.
+
+    """
+
     def __init__(self, optimizer: Optimizer, monitor: str, *args: Any, **kwargs: Any) -> None:
         super().__init__(optimizer, *args, **kwargs)
         self.monitor = monitor
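As a quick illustration of the behavior the new docstring describes, here is a sketch of constructing the wrapper directly (the throwaway parameter and the ``patience`` value are arbitrary choices for the example): ``monitor`` is kept on the instance for Lightning to read, while remaining arguments pass through to PyTorch's scheduler.

    import torch
    from lightning.pytorch.cli import ReduceLROnPlateau

    # Throwaway parameter/optimizer, just enough to construct the scheduler.
    param = torch.nn.Parameter(torch.zeros(1))
    optimizer = torch.optim.SGD([param], lr=0.1)

    # `monitor` is stored on the instance; `patience` is forwarded unchanged
    # to torch.optim.lr_scheduler.ReduceLROnPlateau.
    scheduler = ReduceLROnPlateau(optimizer, monitor="train_loss", patience=5)
    print(scheduler.monitor)  # train_loss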
