
Commit e92b73d

docs: typo
1 parent 522ca7b commit e92b73d


1 file changed: +2 −2 lines changed


pytorch_optimizer/adamp.py

Lines changed: 2 additions & 2 deletions
@@ -38,9 +38,9 @@ def __init__(
         adamd_debias_term: bool = False,
         eps: float = 1e-8,
     ):
-        """
+        """AdamP optimizer

         :param params: PARAMETERS. iterable of parameters to optimize or dicts defining parameter groups
-        :param lr: float. learning rate.
+        :param lr: float. learning rate
         :param betas: BETAS. coefficients used for computing running averages of gradient and the squared hessian trace
         :param weight_decay: float. weight decay (L2 penalty)
         :param delta: float. threshold that determines whether a set of parameters is scale invariant or not
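The change gives the docstring a one-line summary ("AdamP optimizer") and drops the trailing period from the `lr` field so every `:param` line matches the others. A minimal sketch of that Sphinx-style convention (the class below is an illustrative stand-in, not the real `pytorch_optimizer` implementation):

```python
# Illustrative stand-in class, not the actual AdamP from pytorch_optimizer.
# It shows the docstring style the commit settles on: a short summary line
# after the opening quotes, then ":param" fields with no trailing periods.
class AdamP:
    def __init__(self, params, lr: float = 1e-3, eps: float = 1e-8):
        """AdamP optimizer

        :param params: PARAMETERS. iterable of parameters to optimize or dicts defining parameter groups
        :param lr: float. learning rate
        :param eps: float. term added to the denominator to improve numerical stability
        """
        self.params = list(params)
        self.lr = lr
        self.eps = eps
```

Documentation tooling such as Sphinx renders these `:param` fields into a parameter table, which is why keeping their punctuation consistent matters more than it would in a plain comment.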
