1 parent dc79a7c commit e980289
README.rst
@@ -329,7 +329,7 @@ Acceleration via Fractal Learning Rate Schedules.
 On the Convergence of Adam and Beyond
 -------------------------------------
 
-| Convergence issues can be fixed by endowing such algorithms with `long-term memory' of past gradients
+| Convergence issues can be fixed by endowing such algorithms with 'long-term memory' of past gradients.
 
 Improved bias-correction in Adam
 --------------------------------
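
For context on the edited sentence: it summarizes the AMSGrad fix proposed in "On the Convergence of Adam and Beyond", where Adam's second-moment estimate is replaced by its running maximum so that large past gradients are never forgotten. A minimal sketch of that idea follows; the function name and parameters are illustrative and not taken from this repository.

```python
import numpy as np

def amsgrad_step(param, grad, m, v, v_max, t,
                 lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad-style update (illustrative sketch).

    Adam's exponential averaging can forget rare, informative gradients;
    keeping the running maximum of the second-moment estimate (v_max)
    provides the 'long-term memory' of past gradients, so the effective
    step size is non-increasing along each coordinate.
    """
    m = beta1 * m + (1 - beta1) * grad         # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2    # second-moment estimate
    v_max = np.maximum(v_max, v)               # long-term memory of past gradients
    m_hat = m / (1 - beta1 ** t)               # bias-corrected mean
    param = param - lr * m_hat / (np.sqrt(v_max) + eps)
    return param, m, v, v_max
```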