Commit 0057cd1: Update adam_adamax.md
1 parent e5650d8

File tree: 1 file changed (+2 / -2 lines)


docs/src/algo/adam_adamax.md

@@ -1,5 +1,5 @@
 # Adam and AdaMax
-This page contains information about Adam and AdaMax.
+This page contains information about Adam and AdaMax. Notice that these algorithms do not use line search, so some tuning of `alpha` may be necessary to obtain sufficiently fast convergence on your specific problem.
 ## Constructors
 ```julia
 Adam(; alpha=0.0001,
@@ -19,4 +19,4 @@ AdaMax(; alpha=0.002,
 where `alpha` is the step length or learning parameter. `beta_mean` and `beta_var` are exponential decay parameters for the first and second moment estimates. Setting these closer to 0 makes past iterates matter less for the current step; setting them closer to 1 emphasizes past iterates more.
 
 ## References
-Kingma, Diederik P., and Jimmy Ba. "Adam: A method for stochastic optimization." arXiv preprint arXiv:1412.6980 (2014).
+Kingma, Diederik P., and Jimmy Ba. "Adam: A method for stochastic optimization." arXiv preprint arXiv:1412.6980 (2014).
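The note this commit adds (Adam and AdaMax use no line search, so `alpha` may need tuning) could be illustrated with a short usage sketch. This is a hypothetical example, assuming the Optim.jl `optimize` API and the `Adam` constructor keywords shown in the diff above; it is not part of the committed documentation.

```julia
# Sketch only: assumes Optim.jl's optimize(f, x0, method, options) interface.
using Optim

# A simple smooth test function (Rosenbrock-style).
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Adam takes fixed-size steps (no line search), so the default
# alpha = 0.0001 can converge slowly; here we raise it and allow
# more iterations than the default.
res = optimize(f, zeros(2), Adam(alpha = 0.01),
               Optim.Options(iterations = 10_000))
```

If convergence stalls, lowering `alpha` trades speed for stability; raising it does the opposite, which is exactly the tuning the added sentence warns about.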
