
Commit 850a31a

update docs for newly introduced functions (#48)
Signed-off-by: Hao Wu <[email protected]>
1 parent: 3cc549f

File tree

4 files changed: +32 additions, -4 deletions

- docs/apidocs/index.md
- docs/apidocs/psgd.md
- docs/apidocs/riemannian-optimizers.md
- emerging_optimizers/riemannian_optimizers/normalized_optimizer.py

docs/apidocs/index.md

Lines changed: 2 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -9,5 +9,7 @@ NeMo Emerging Optimizers API reference provides comprehensive technical document
99
utils.md
1010
orthogonalized-optimizers.md
1111
soap.md
12+
riemannian-optimizers.md
13+
psgd.md
1214
scalar-optimizers.md
1315
```

docs/apidocs/psgd.md

Lines changed: 16 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,16 @@
1+
```{eval-rst}
2+
.. role:: hidden
3+
:class: hidden-section
4+
5+
emerging_optimizers.psgd
6+
========================================
7+
8+
.. automodule:: emerging_optimizers.psgd.procrustes_step
9+
:members:
10+
11+
.. automodule:: emerging_optimizers.psgd.psgd_kron_contractions
12+
:members:
13+
14+
.. automodule:: emerging_optimizers.psgd.psgd_utils
15+
:members:
16+
```
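The new page relies on Sphinx's `automodule` directive: with `:members:`, Sphinx imports each listed module and documents its public, docstringed members. A minimal sketch of roughly what that enumeration covers, assuming `emerging_optimizers` is installed and importable (no specific member names are assumed):

```python
# Sketch only: approximate the set of names that ".. automodule:: <mod>"
# with ":members:" would pick up. Module paths are taken from the diff;
# Sphinx additionally filters by docstrings and respects __all__.
import importlib

for mod_name in (
    "emerging_optimizers.psgd.procrustes_step",
    "emerging_optimizers.psgd.psgd_kron_contractions",
    "emerging_optimizers.psgd.psgd_utils",
):
    mod = importlib.import_module(mod_name)
    public = getattr(mod, "__all__", None) or [
        name for name in dir(mod) if not name.startswith("_")
    ]
    print(f"{mod_name}: {sorted(public)}")
```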
Lines changed: 10 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,10 @@
1+
```{eval-rst}
2+
.. role:: hidden
3+
:class: hidden-section
4+
5+
emerging_optimizers.riemannian_optimizers
6+
=========================================
7+
8+
.. automodule:: emerging_optimizers.riemannian_optimizers.normalized_optimizer
9+
:members:
10+
```

emerging_optimizers/riemannian_optimizers/normalized_optimizer.py

Lines changed: 4 additions & 4 deletions
Original file line numberDiff line numberDiff line change
@@ -68,9 +68,9 @@ def __init__(
6868
@torch.no_grad() # type: ignore[misc]
6969
def step(self, closure: Callable[[], float] | None = None) -> float | None:
7070
"""Performs a single optimization step.
71+
7172
Args:
72-
closure (callable, optional): A closure that reevaluates the model
73-
and returns the loss.
73+
closure: A closure that reevaluates the model and returns the loss.
7474
"""
7575
loss = closure() if closure is not None else None
7676

@@ -157,9 +157,9 @@ def __init__(
157157
@torch.no_grad() # type: ignore[misc]
158158
def step(self, closure: Callable[[], float] | None = None) -> float | None:
159159
"""Performs a single optimization step.
160+
160161
Args:
161-
closure (callable, optional): A closure that reevaluates the model
162-
and returns the loss.
162+
closure: A closure that reevaluates the model and returns the loss.
163163
"""
164164
loss = closure() if closure is not None else None
165165
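The rewritten docstring drops the `(callable, optional)` markup, since the type already lives in the signature. For readers unfamiliar with the closure pattern it describes, here is a minimal usage sketch; `torch.optim.SGD` stands in for the optimizers defined in `normalized_optimizer.py`, and the model, loss, and data are placeholders:

```python
# Minimal sketch of the closure pattern described by the updated docstring.
# torch.optim.SGD is a stand-in; the pattern is identical for any optimizer
# whose step() accepts an optional closure.
import torch

model = torch.nn.Linear(4, 2)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

inputs = torch.randn(8, 4)
targets = torch.randn(8, 2)

def closure():
    # Reevaluates the model and returns the loss.
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    return loss

# step() runs the closure when one is given and returns its loss;
# with no closure it returns None.
loss = optimizer.step(closure)
```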
