
Commit 4a095ae

Merge pull request #242 from kozistr/feature/fadam-optimizer
[Feature] Implement FAdam optimizer
2 parents 17893ed + 07e4a3c commit 4a095ae

File tree

12 files changed: +241 −91 lines changed


README.md

Lines changed: 2 additions & 1 deletion

```diff
@@ -10,7 +10,7 @@

 **pytorch-optimizer** is optimizer & lr scheduler collections in PyTorch.
 I just re-implemented (speed & memory tweaks, plug-ins) the algorithm while based on the original paper. Also, It includes useful and practical optimization ideas.
-Currently, **67 optimizers (+ `bitsandbytes`)**, **11 lr schedulers**, and **13 loss functions** are supported!
+Currently, **68 optimizers (+ `bitsandbytes`)**, **11 lr schedulers**, and **13 loss functions** are supported!

 Highly inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer).

@@ -164,6 +164,7 @@ supported_optimizers = get_supported_optimizers()
 | Adalite | *Adalite optimizer* | [github](https://github.com/VatsaDev/adalite) | <https://github.com/VatsaDev/adalite> | [cite](https://github.com/VatsaDev/adalite) |
 | bSAM | *SAM as an Optimal Relaxation of Bayes* | [github](https://github.com/team-approx-bayes/bayesian-sam) | <https://arxiv.org/abs/2210.01620> | [cite](https://ui.adsabs.harvard.edu/abs/2022arXiv221001620M/exportcitation) |
 | Schedule-Free | *Schedule-Free Optimizers* | [github](https://github.com/facebookresearch/schedule_free) | <https://github.com/facebookresearch/schedule_free> | [cite](https://github.com/facebookresearch/schedule_free) |
+| FAdam | *Adam is a natural gradient optimizer using diagonal empirical Fisher information* | [github](https://github.com/lessw2020/fadam_pytorch) | <https://arxiv.org/abs/2405.12807> | [cite](https://ui.adsabs.harvard.edu/abs/2024arXiv240512807H/exportcitation) |

 ## Supported LR Scheduler
```

docs/changelogs/v3.0.1.md

Lines changed: 15 additions & 0 deletions

```diff
@@ -0,0 +1,15 @@
+## Change Log
+
+### Feature
+
+* Implement `FAdam` optimizer. (#241, #242)
+    * [Adam is a natural gradient optimizer using diagonal empirical Fisher information](https://arxiv.org/abs/2405.12807)
+
+### Bug
+
+* Wrong typing of reg_noise. (#239, #240)
+* Lookahead`s param_groups attribute is not loaded from checkpoint. (#237, #238)
+
+## Contributions
+
+thanks to @michaldyczko
```

docs/index.md

Lines changed: 2 additions & 1 deletion

```diff
@@ -10,7 +10,7 @@

 **pytorch-optimizer** is optimizer & lr scheduler collections in PyTorch.
 I just re-implemented (speed & memory tweaks, plug-ins) the algorithm while based on the original paper. Also, It includes useful and practical optimization ideas.
-Currently, **67 optimizers (+ `bitsandbytes`)**, **11 lr schedulers**, and **13 loss functions** are supported!
+Currently, **68 optimizers (+ `bitsandbytes`)**, **11 lr schedulers**, and **13 loss functions** are supported!

 Highly inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer).

@@ -164,6 +164,7 @@ supported_optimizers = get_supported_optimizers()
 | Adalite | *Adalite optimizer* | [github](https://github.com/VatsaDev/adalite) | <https://github.com/VatsaDev/adalite> | [cite](https://github.com/VatsaDev/adalite) |
 | bSAM | *SAM as an Optimal Relaxation of Bayes* | [github](https://github.com/team-approx-bayes/bayesian-sam) | <https://arxiv.org/abs/2210.01620> | [cite](https://ui.adsabs.harvard.edu/abs/2022arXiv221001620M/exportcitation) |
 | Schedule-Free | *Schedule-Free Optimizers* | [github](https://github.com/facebookresearch/schedule_free) | <https://github.com/facebookresearch/schedule_free> | [cite](https://github.com/facebookresearch/schedule_free) |
+| FAdam | *Adam is a natural gradient optimizer using diagonal empirical Fisher information* | [github](https://github.com/lessw2020/fadam_pytorch) | <https://arxiv.org/abs/2405.12807> | [cite](https://ui.adsabs.harvard.edu/abs/2024arXiv240512807H/exportcitation) |

 ## Supported LR Scheduler
```

docs/optimizer.md

Lines changed: 4 additions & 0 deletions

```diff
@@ -136,6 +136,10 @@
     :docstring:
     :members:

+::: pytorch_optimizer.FAdam
+    :docstring:
+    :members:
+
 ::: pytorch_optimizer.Fromage
     :docstring:
     :members:
```
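For orientation alongside the new docs entry: the FAdam paper reads Adam's second-moment EMA as a diagonal empirical Fisher estimate and preconditions the gradient with it before applying momentum. A scalar, dependency-free sketch of one such step follows; the constants `rho` and `clip` and the simple element-wise clipping rule are illustrative simplifications, not the library's implementation:

```python
def fadam_step(theta, grad, state, lr=1e-3, betas=(0.9, 0.999),
               rho=0.5, eps=1e-8, clip=1.0):
    """One FAdam-style step over a list of scalar parameters (illustrative only)."""
    m, f = state['m'], state['f']
    state['t'] += 1
    t = state['t']
    new_theta = []
    for i, (p, g) in enumerate(zip(theta, grad)):
        # EMA of squared gradients, read as a diagonal empirical Fisher estimate.
        f[i] = betas[1] * f[i] + (1 - betas[1]) * g * g
        f_hat = f[i] / (1 - betas[1] ** t)          # bias correction
        # Precondition the gradient by the Fisher estimate ("natural" gradient).
        ng = g / (f_hat ** rho + eps)
        ng = max(-clip, min(clip, ng))              # simple clipping stand-in
        # Momentum is applied to the preconditioned gradient, not the raw one.
        m[i] = betas[0] * m[i] + (1 - betas[0]) * ng
        new_theta.append(p - lr * m[i])
    return new_theta
```

On the first step with gradient 2.0, the bias-corrected Fisher estimate is 4.0, so the preconditioned gradient is roughly 1.0 regardless of the raw gradient's scale; this scale invariance is the point of the Fisher preconditioning.
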
