Commit 58f923f

Merge pull request #265 from kozistr/feature/adamg-optimizer

[Feature] Implement AdamG optimizer

2 parents 1054960 + d728f9e

File tree: 13 files changed, +307 −201 lines

README.md

Lines changed: 2 additions & 1 deletion

```diff
@@ -10,7 +10,7 @@
 
 **pytorch-optimizer** is optimizer & lr scheduler collections in PyTorch.
 I just re-implemented (speed & memory tweaks, plug-ins) the algorithm while based on the original paper. Also, It includes useful and practical optimization ideas.
-Currently, **74 optimizers (+ `bitsandbytes`, `qgalore`)**, **16 lr schedulers**, and **13 loss functions** are supported!
+Currently, **75 optimizers (+ `bitsandbytes`, `qgalore`)**, **16 lr schedulers**, and **13 loss functions** are supported!
 
 Highly inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer).
 
@@ -172,6 +172,7 @@ supported_optimizers = get_supported_optimizers()
 | StableAdamW | *Stable and low-precision training for large-scale vision-language models* | | <https://arxiv.org/abs/2304.13013> | [cite](https://ui.adsabs.harvard.edu/abs/2023arXiv230413013W/exportcitation) |
 | AdamMini | *Use Fewer Learning Rates To Gain More* | [github](https://github.com/zyushun/Adam-mini) | <https://arxiv.org/abs/2406.16793> | [cite](https://github.com/zyushun/Adam-mini?tab=readme-ov-file#citation) |
 | TRAC | *Adaptive Parameter-free Optimization* | [github](https://github.com/ComputationalRobotics/TRAC) | <https://arxiv.org/abs/2405.16642> | [cite](https://ui.adsabs.harvard.edu/abs/2024arXiv240516642M/exportcitation) |
+| AdamG | *Towards Stability of Parameter-free Optimization* | | <https://arxiv.org/abs/2405.04376> | [cite](https://ui.adsabs.harvard.edu/abs/2024arXiv240504376P/exportcitation) |
 
 ## Supported LR Scheduler
```
docs/changelogs/v3.1.1.md

Lines changed: 2 additions & 0 deletions

```diff
@@ -5,6 +5,8 @@
 * Implement `TRAC` optimizer. (#263)
     * [Fast TRAC: A Parameter-Free Optimizer for Lifelong Reinforcement Learning](https://arxiv.org/abs/2405.16642)
 * Support `AdamW` optimizer via `create_optimizer()`. (#263)
+* Implement `AdamG` optimizer. (#264, #265)
+    * [Towards Stability of Parameter-free Optimization](https://arxiv.org/abs/2405.04376)
 
 ### Bug
```
docs/index.md

Lines changed: 2 additions & 1 deletion

```diff
@@ -10,7 +10,7 @@
 
 **pytorch-optimizer** is optimizer & lr scheduler collections in PyTorch.
 I just re-implemented (speed & memory tweaks, plug-ins) the algorithm while based on the original paper. Also, It includes useful and practical optimization ideas.
-Currently, **74 optimizers (+ `bitsandbytes`, `qgalore`)**, **16 lr schedulers**, and **13 loss functions** are supported!
+Currently, **75 optimizers (+ `bitsandbytes`, `qgalore`)**, **16 lr schedulers**, and **13 loss functions** are supported!
 
 Highly inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer).
 
@@ -172,6 +172,7 @@ supported_optimizers = get_supported_optimizers()
 | StableAdamW | *Stable and low-precision training for large-scale vision-language models* | | <https://arxiv.org/abs/2304.13013> | [cite](https://ui.adsabs.harvard.edu/abs/2023arXiv230413013W/exportcitation) |
 | AdamMini | *Use Fewer Learning Rates To Gain More* | [github](https://github.com/zyushun/Adam-mini) | <https://arxiv.org/abs/2406.16793> | [cite](https://github.com/zyushun/Adam-mini?tab=readme-ov-file#citation) |
 | TRAC | *Adaptive Parameter-free Optimization* | [github](https://github.com/ComputationalRobotics/TRAC) | <https://arxiv.org/abs/2405.16642> | [cite](https://ui.adsabs.harvard.edu/abs/2024arXiv240516642M/exportcitation) |
+| AdamG | *Towards Stability of Parameter-free Optimization* | | <https://arxiv.org/abs/2405.04376> | [cite](https://ui.adsabs.harvard.edu/abs/2024arXiv240504376P/exportcitation) |
 
 ## Supported LR Scheduler
```
docs/optimizer.md

Lines changed: 4 additions & 0 deletions

```diff
@@ -44,6 +44,10 @@
     :docstring:
     :members:
 
+::: pytorch_optimizer.AdamG
+    :docstring:
+    :members:
+
 ::: pytorch_optimizer.AdaMod
     :docstring:
     :members:
```

0 commit comments