Commit 6fb3a4f

update: RAdam
1 parent: f327405

File tree

1 file changed: +3 −3 lines

pytorch_optimizer/radam.py

Lines changed: 3 additions & 3 deletions
@@ -35,13 +35,13 @@ def __init__(
         adamd_debias_term: bool = False,
         eps: float = 1e-8,
     ):
-        """
+        """RAdam
         :param params: PARAMETERS. iterable of parameters to optimize or dicts defining parameter groups
-        :param lr: float. learning rate.
+        :param lr: float. learning rate
         :param betas: BETAS. coefficients used for computing running averages of gradient and the squared hessian trace
         :param weight_decay: float. weight decay (L2 penalty)
         :param n_sma_threshold: int. (recommended is 5)
-        :param degenerated_to_sgd: float.
+        :param degenerated_to_sgd: float. degenerated to SGD
         :param adamd_debias_term: bool. Only correct the denominator to avoid inflating step sizes early in training
         :param eps: float. term added to the denominator to improve numerical stability
         """
