
Commit 97b6b0f

add docs for toggled_optimizer to LightningModule
1 parent: bb63794


4 files changed: 5 additions, 2 deletions

4 files changed

+5
-2
lines changed

docs/source-pytorch/conf.py

Lines changed: 1 addition & 0 deletions
@@ -487,6 +487,7 @@ def _load_py_module(name: str, location: str) -> ModuleType:
     ("py:meth", "setup"),
     ("py:meth", "test_step"),
     ("py:meth", "toggle_optimizer"),
+    ("py:meth", "toggled_optimizer"),
     ("py:class", "torch.ScriptModule"),
     ("py:class", "torch.distributed.fsdp.fully_sharded_data_parallel.CPUOffload"),
     ("py:class", "torch.distributed.fsdp.fully_sharded_data_parallel.MixedPrecision"),

docs/source-pytorch/model/manual_optimization.rst

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ To manually optimize, do the following:
 * ``optimizer.zero_grad()`` to clear the gradients from the previous training step
 * ``self.manual_backward(loss)`` instead of ``loss.backward()``
 * ``optimizer.step()`` to update your model parameters
-* ``self.toggle_optimizer()`` and ``self.untoggle_optimizer()`` if needed
+* ``self.toggle_optimizer()`` and ``self.untoggle_optimizer()``, or ``self.toggled_optimizer()`` if needed
 
 Here is a minimal example of manual optimization.
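
The hunk ends just before the docs' own minimal example. As a hedged sketch of the workflow the bullets describe, assuming an illustrative model, loss, and optimizer (none of which are taken from this page):

    import torch
    from lightning.pytorch import LightningModule


    class ManualOptimizationExample(LightningModule):
        def __init__(self):
            super().__init__()
            self.automatic_optimization = False   # opt in to manual optimization
            self.layer = torch.nn.Linear(32, 1)   # illustrative model

        def training_step(self, batch, batch_idx):
            opt = self.optimizers()
            self.toggle_optimizer(opt)      # only this optimizer's params keep requires_grad=True
            opt.zero_grad()                 # clear gradients from the previous training step
            loss = self.layer(batch).sum()  # illustrative loss
            self.manual_backward(loss)      # instead of loss.backward()
            opt.step()                      # update the model parameters
            self.untoggle_optimizer(opt)    # restore requires_grad for the remaining optimizers

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)

Toggling only has an effect when the module defines several optimizers; with a single optimizer the toggle/untoggle pair changes nothing.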

src/lightning/pytorch/CHANGELOG.md

Lines changed: 2 additions & 0 deletions
@@ -11,6 +11,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Add enable_autolog_hparams argument to Trainer ([#20593](https://github.com/Lightning-AI/pytorch-lightning/pull/20593))
 
+- Add `toggled_optimizer(optimizer)` method to the LightningModule, which is a context manager version of `toggle_optimizer` and `untoggle_optimizer`
+
 
 ### Changed

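As a hedged illustration of the entry above: the new method bundles the toggle/untoggle pair into a `with` block, presumably untoggling when the block exits. The `opt_g`/`g_loss` names are assumptions for the example:

    # before: explicit pairing with the existing methods
    self.toggle_optimizer(opt_g)
    self.manual_backward(g_loss)
    opt_g.step()
    self.untoggle_optimizer(opt_g)

    # after: the context manager form added by this change
    with self.toggled_optimizer(opt_g):
        self.manual_backward(g_loss)
        opt_g.step()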

src/lightning/pytorch/core/module.py

Lines changed: 1 addition & 1 deletion
@@ -1148,7 +1148,7 @@ def toggled_optimizer(self, optimizer: Union[Optimizer, LightningOptimizer]) ->
         :meth:`untoggle_optimizer` into context manager.
 
         Args:
-            optimizer: The optimizer to untoggle.
+            optimizer: The optimizer to toggle.
 
         Example::
