
Commit c155279

docs: Ranger21 docstring
1 parent 7d4c014

1 file changed: +13 -0 lines

pytorch_optimizer/optimizer/ranger21.py

Lines changed: 13 additions & 0 deletions
@@ -15,6 +15,19 @@
 
 class Ranger21(Optimizer, BaseOptimizer):
     r"""Integrating the latest deep learning components into a single optimizer.
+    Here are the components:
+    * uses the AdamW optimizer as its core (or, optionally, MadGrad)
+    * Adaptive gradient clipping
+    * Gradient centralization
+    * Positive-Negative momentum
+    * Norm loss
+    * Stable weight decay
+    * Linear learning rate warm-up
+    * Explore-exploit learning rate schedule
+    * Lookahead
+    * Softplus transformation
+    * Gradient Normalization
+    * Corrects the denominator (AdamD)
 
     :param params: PARAMETERS. iterable of parameters to optimize or dicts defining parameter groups.
     :param lr: float. learning rate.
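
For readers unfamiliar with the bullet items in the docstring, here is a toy sketch of one of them, gradient centralization: each multi-dimensional gradient has its mean over all but the first dimension subtracted before the update. The helper name is hypothetical, and this follows the published technique rather than this repository's exact implementation.

import torch

def centralize_gradient(grad: torch.Tensor) -> torch.Tensor:
    # Subtract the mean over every dimension except the first (the
    # output-channel dimension); 1-D gradients such as biases are left
    # unchanged, matching the usual formulation of the technique.
    if grad.dim() > 1:
        return grad - grad.mean(dim=tuple(range(1, grad.dim())), keepdim=True)
    return grad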
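And a minimal usage sketch of the optimizer the docstring describes, not part of this commit: it assumes the top-level Ranger21 export from pytorch_optimizer and a required num_iterations argument; lr follows the :param list above.

import torch
from pytorch_optimizer import Ranger21

model = torch.nn.Linear(10, 2)

# num_iterations is assumed to be required so the optimizer can size its
# linear warm-up and explore-exploit (warm-down) learning-rate schedule.
optimizer = Ranger21(model.parameters(), num_iterations=1000, lr=1e-3)

for _ in range(1000):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()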
