Commit 90085d6

docs: AdamD optimizer
1 parent e29286f commit 90085d6

File tree

1 file changed: +2 -0 lines changed

README.rst

Lines changed: 2 additions & 0 deletions
@@ -58,6 +58,8 @@ Supported Optimizers
 +--------------+----------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------+
 | AdaHessian | *An Adaptive Second Order Optimizer for Machine Learning* | `github <https://github.com/amirgholami/adahessian>`__ | `https://arxiv.org/abs/2006.00719 <https://arxiv.org/abs/2006.00719>`__ |
 +--------------+----------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------+
+| AdamD | *Improved bias-correction in Adam* | | `https://arxiv.org/abs/2110.10828 <https://arxiv.org/abs/2110.10828>`__ |
++--------------+----------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------+
 | AdamP | *Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights* | `github <https://github.com/clovaai/AdamP>`__ | `https://arxiv.org/abs/2006.08217 <https://arxiv.org/abs/2006.08217>`__ |
 +--------------+----------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------+
 | diffGrad | *An Optimization Method for Convolutional Neural Networks* | `github <https://github.com/shivram1987/diffGrad>`__ | `https://arxiv.org/abs/1909.11015v3 <https://arxiv.org/abs/1909.11015v3>`__ |
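The added row points at the AdamD paper (https://arxiv.org/abs/2110.10828). As a rough illustration of the idea the title refers to, the sketch below contrasts a plain Adam step with an AdamD-style step that keeps the second-moment bias correction but skips the first-moment one, so the earliest updates are damped (a warmup-like effect). This reading of the paper and every name in the snippet are assumptions for illustration, not this repository's implementation::

    # Minimal NumPy sketch (illustrative, not the library's code) of an
    # AdamD-style variant of the Adam update.
    import numpy as np

    def adam_like_step(param, grad, m, v, t,
                       lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8,
                       adamd_style=False):
        """One update at step t >= 1; `adamd_style` is an illustrative flag."""
        m = beta1 * m + (1.0 - beta1) * grad        # first moment (momentum)
        v = beta2 * v + (1.0 - beta2) * grad ** 2   # second moment
        v_hat = v / (1.0 - beta2 ** t)              # denominator is always debiased
        if adamd_style:
            numerator = m                           # skip first-moment debiasing
        else:
            numerator = m / (1.0 - beta1 ** t)      # classic Adam debiasing
        param = param - lr * numerator / (np.sqrt(v_hat) + eps)
        return param, m, v

With ``adamd_style=True`` the effective step scales with roughly ``1 - beta1**t`` during the first iterations, which is the bias-correction behaviour the row's title alludes to.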
