
Commit 94f45e2

[skip ci] docs: AdaBound optimizer
1 parent 250da98 commit 94f45e2

File tree

1 file changed: +17 -0 lines changed


README.md

Lines changed: 17 additions & 0 deletions
@@ -33,6 +33,7 @@ for input, output in data:
 
 | Optimizer | Description | Official Code | Paper |
 | :---: | :---: | :---: | :---: |
+| AdaBound | *Adaptive Gradient Methods with Dynamic Bound of Learning Rate* | [github](https://github.com/Luolc/AdaBound) | [https://openreview.net/forum?id=Bkg3g2R9FX](https://openreview.net/forum?id=Bkg3g2R9FX) |
 | AdaHessian | *An Adaptive Second Order Optimizer for Machine Learning* | [github](https://github.com/amirgholami/adahessian) | [https://arxiv.org/abs/2006.00719](https://arxiv.org/abs/2006.00719) |
 | AdamP | *Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights* | [github](https://github.com/clovaai/AdamP) | [https://arxiv.org/abs/2006.08217](https://arxiv.org/abs/2006.08217) |
 | MADGRAD | *A Momentumized, Adaptive, Dual Averaged Gradient Method for Stochastic Optimization* | [github](https://github.com/facebookresearch/madgrad) | [https://arxiv.org/abs/2101.11075](https://arxiv.org/abs/2101.11075) |
@@ -336,6 +337,22 @@ Acceleration via Fractal Learning Rate Schedules
 
 </details>
 
+<details>
+
+<summary>AdaBound</summary>
+
+```
+@inproceedings{Luo2019AdaBound,
+  author    = {Luo, Liangchen and Xiong, Yuanhao and Liu, Yan and Sun, Xu},
+  title     = {Adaptive Gradient Methods with Dynamic Bound of Learning Rate},
+  booktitle = {Proceedings of the 7th International Conference on Learning Representations},
+  month     = {May},
+  year      = {2019},
+  address   = {New Orleans, Louisiana}
+}
+```
+
+</details>
 
 ## Author
 
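As background for the entry this commit documents: AdaBound (Luo et al., 2019) takes Adam's per-parameter step size and clips it between a lower and an upper bound that both converge to a single final learning rate, so training starts out Adam-like and gradually behaves like SGD. Below is a minimal NumPy sketch of one bounded update, assuming the bound schedule from the official Luolc/AdaBound code (the `final_lr` and `gamma` names come from that repo); it omits details such as weight decay and the official code's rescaling of `final_lr` by `lr / base_lr`, and it is illustrative only, not this repository's API.

```python
import numpy as np

def adabound_step(param, grad, m, v, step, lr=1e-3, beta1=0.9, beta2=0.999,
                  final_lr=0.1, gamma=1e-3, eps=1e-8):
    """One AdaBound update for a single parameter array.

    `step` is the 1-based iteration count; `m` and `v` are the Adam
    moment estimates carried between calls.
    """
    m = beta1 * m + (1 - beta1) * grad         # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2    # second moment
    # Bias-corrected Adam step size (a scalar).
    step_size = lr * np.sqrt(1 - beta2 ** step) / (1 - beta1 ** step)
    # Dynamic bounds: wide early (Adam-like), shrinking toward final_lr (SGD-like).
    lower = final_lr * (1 - 1 / (gamma * step + 1))
    upper = final_lr * (1 + 1 / (gamma * step))
    # Clip the elementwise learning rate, then take the momentum-directed step.
    eta = np.clip(step_size / (np.sqrt(v) + eps), lower, upper)
    return param - eta * m, m, v

# Toy usage: minimize ||w - target||^2 for a few steps.
w, m, v = np.zeros(3), np.zeros(3), np.zeros(3)
target = np.array([1.0, -2.0, 3.0])
for t in range(1, 101):
    g = 2 * (w - target)                       # gradient of the quadratic
    w, m, v = adabound_step(w, g, m, v, t)
```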