Commit 71da75a

docs: DAdaptAdam docstring
1 parent 59526a2 commit 71da75a

File tree

1 file changed (+1, -1)


pytorch_optimizer/optimizer/dadapt.py

Lines changed: 1 addition & 1 deletion
@@ -246,7 +246,7 @@ def step(self, closure: CLOSURE = None) -> LOSS:
 
 
 class DAdaptAdam(Optimizer, BaseOptimizer):
-    r"""Adam with D-Adaptation. Leave LR set to 1 unless you encounter instability.
+    r"""Adam with D-Adaptation. Leave LR set to 1 unless you encounter instability. This implementation is based on V3.
 
     :param params: PARAMETERS. iterable of parameters to optimize or dicts defining parameter groups.
     :param lr: float. learning rate.
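
For context on the docstring's advice: D-Adaptation estimates the step size on the fly, so lr acts as a multiplier on that estimate and should normally stay at 1. A minimal usage sketch, assuming the top-level DAdaptAdam export from pytorch_optimizer; the model, data, and training loop are placeholders, not part of this commit:

# Minimal sketch: using DAdaptAdam with the recommended lr=1.0.
# Only the lr convention comes from the docstring edited in this commit;
# everything else here is illustrative.
import torch
from pytorch_optimizer import DAdaptAdam

model = torch.nn.Linear(10, 1)
optimizer = DAdaptAdam(model.parameters(), lr=1.0)  # leave at 1 unless training is unstable

x, y = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(10):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()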
