pytorch_optimizer/optimizer: 1 file changed, +13 -0 lines changed
class Ranger21(Optimizer, BaseOptimizer):
    r"""Integrating the latest deep learning components into a single optimizer.
+    Here are the components:
+        * uses the AdamW optimizer as its core (or, optionally, MadGrad)
+        * Adaptive gradient clipping
+        * Gradient centralization
+        * Positive-negative momentum
+        * Norm loss
+        * Stable weight decay
+        * Linear learning rate warm-up
+        * Explore-exploit learning rate schedule
+        * Lookahead
+        * Softplus transformation
+        * Gradient normalization
+        * Corrects the denominator (AdamD)

    :param params: PARAMETERS. iterable of parameters to optimize or dicts defining parameter groups.
    :param lr: float. learning rate.
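Since the excerpt cuts off after the first two parameters, here is a minimal usage sketch (not part of this diff). It assumes the class is importable from the top-level pytorch_optimizer package and that a num_iterations argument is required to size the warm-up and explore-exploit schedule; neither is confirmed by this excerpt, so check the actual signature.

```python
# Hypothetical usage sketch; the top-level import and `num_iterations`
# are assumptions, not confirmed by this diff.
import torch
from pytorch_optimizer import Ranger21

model = torch.nn.Linear(10, 2)

# num_iterations presumably drives the linear warm-up and the
# explore-exploit learning rate schedule listed in the docstring.
optimizer = Ranger21(model.parameters(), num_iterations=1000, lr=1e-3)

for _ in range(1000):
    optimizer.zero_grad()
    x = torch.randn(8, 10)
    loss = torch.nn.functional.mse_loss(model(x), torch.randn(8, 2))
    loss.backward()
    optimizer.step()
```

For one of the listed components, gradient centralization, a sketch of the general technique follows: each multi-dimensional gradient is re-centered to zero mean over all dimensions except the first. This illustrates the idea, not Ranger21's exact implementation.

```python
import torch

def centralize_gradient(grad: torch.Tensor) -> torch.Tensor:
    """Subtract the mean over all dims except dim 0 (e.g. per output
    channel/filter). A sketch of the technique, not Ranger21's code."""
    if grad.dim() > 1:
        grad = grad - grad.mean(dim=tuple(range(1, grad.dim())), keepdim=True)
    return grad
```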