README.md: 3 additions, 1 deletion
@@ -10,7 +10,7 @@
 **pytorch-optimizer** is a collection of optimizers & lr schedulers in PyTorch.
 I just re-implemented the algorithms (with speed & memory tweaks and plug-ins) based on the original papers. It also includes useful and practical optimization ideas.
-Currently, **79 optimizers (+ `bitsandbytes`, `qgalore`, `torchao`)**, **16 lr schedulers**, and **13 loss functions** are supported!
+Currently, **80 optimizers (+ `bitsandbytes`, `qgalore`, `torchao`)**, **16 lr schedulers**, and **13 loss functions** are supported!

 Highly inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer).
| ADOPT |*Modified Adam Can Converge with Any β2 with the Optimal Rate*|[github](https://github.com/iShohei220/adopt)|<https://arxiv.org/abs/2411.02853>|[cite](https://github.com/iShohei220/adopt?tab=readme-ov-file#citation)|
| FTRL |*Follow The Regularized Leader*||<https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/41159.pdf>||
| Cautious |*Improving Training with One Line of Code*|[github](https://github.com/kyleliang919/C-Optim)|<https://arxiv.org/pdf/2411.16085v1>|[cite](https://github.com/kyleliang919/C-Optim?tab=readme-ov-file#citation)|
| MicroAdam |*Accurate Adaptive Optimization with Low Space Overhead and Provable Convergence*|[github](https://github.com/IST-DASLab/MicroAdam)|<https://arxiv.org/abs/2405.15593>|[cite](https://github.com/IST-DASLab/MicroAdam?tab=readme-ov-file#citing)|
docs/index.md: 3 additions, 1 deletion
@@ -10,7 +10,7 @@
 **pytorch-optimizer** is a collection of optimizers & lr schedulers in PyTorch.
 I just re-implemented the algorithms (with speed & memory tweaks and plug-ins) based on the original papers. It also includes useful and practical optimization ideas.
-Currently, **79 optimizers (+ `bitsandbytes`, `qgalore`, `torchao`)**, **16 lr schedulers**, and **13 loss functions** are supported!
+Currently, **80 optimizers (+ `bitsandbytes`, `qgalore`, `torchao`)**, **16 lr schedulers**, and **13 loss functions** are supported!

 Highly inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer).
| ADOPT |*Modified Adam Can Converge with Any β2 with the Optimal Rate*|[github](https://github.com/iShohei220/adopt)|<https://arxiv.org/abs/2411.02853>|[cite](https://github.com/iShohei220/adopt?tab=readme-ov-file#citation)|
| FTRL |*Follow The Regularized Leader*||<https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/41159.pdf>||
| Cautious |*Improving Training with One Line of Code*|[github](https://github.com/kyleliang919/C-Optim)|<https://arxiv.org/pdf/2411.16085v1>|[cite](https://github.com/kyleliang919/C-Optim?tab=readme-ov-file#citation)|
| MicroAdam |*Accurate Adaptive Optimization with Low Space Overhead and Provable Convergence*|[github](https://github.com/IST-DASLab/MicroAdam)|<https://arxiv.org/abs/2405.15593>|[cite](https://github.com/IST-DASLab/MicroAdam?tab=readme-ov-file#citing)|
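For context on the supported-optimizer count this diff updates: the library exposes its optimizers as drop-in replacements for `torch.optim` classes. A minimal usage sketch follows, assuming the direct-import API shown in the project's README; `AdamP` is one of the supported optimizers, and the model, data, and hyperparameters here are purely illustrative:

```python
import torch
from pytorch_optimizer import AdamP  # one of the ~80 supported optimizers

# Toy setup; any torch.nn.Module works the same way.
model = torch.nn.Linear(10, 2)
optimizer = AdamP(model.parameters(), lr=1e-3)

# One illustrative training step on random data.
x, y = torch.randn(8, 10), torch.randn(8, 2)
loss = torch.nn.functional.mse_loss(model(x), y)

loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Swapping in a different optimizer from the table above would only change the import and constructor line, which is the point of keeping them API-compatible with `torch.optim`.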