1 parent 75c7945 commit 483a816
docs/changelogs/v3.8.0.md
@@ -2,6 +2,7 @@
### Feature

+* Implement `EmoNeco` and `EmoZeal` optimizers. (#407)
* Implement `Refined Schedule-Free AdamW` optimizer. (#409, #414)
  * [Through the River: Understanding the Benefit of Schedule-Free Methods for Language Model Training](https://arxiv.org/abs/2507.09846)
  * You can use this variant by setting the `decoupling_c` parameter in the `ScheduleFreeAdamW` optimizer.
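The `decoupling_c` note above is a one-line how-to, so here is a minimal sketch of what that usage might look like. The changelog only confirms that `ScheduleFreeAdamW` accepts a `decoupling_c` parameter; the value `1e-2`, the other hyperparameters, and the `train()`/`eval()` calls are assumptions based on how schedule-free optimizers are typically driven, not details taken from this commit.

```python
import torch
from pytorch_optimizer import ScheduleFreeAdamW

model = torch.nn.Linear(10, 1)

# `decoupling_c` enables the refined schedule-free variant per the
# changelog; 1e-2 is an illustrative value, not a documented default.
optimizer = ScheduleFreeAdamW(model.parameters(), lr=1e-3, decoupling_c=1e-2)

# Schedule-free optimizers typically expose train()/eval() to switch
# between the training and evaluation parameter sequences (assumed here).
optimizer.train()
for _ in range(3):
    loss = model(torch.randn(8, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
optimizer.eval()
```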
pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "pytorch_optimizer"
-version = "3.7.0"
+version = "3.8.0"
description = "optimizer & lr scheduler & objective function collections in PyTorch"
license = "Apache-2.0"
authors = ["kozistr <[email protected]>"]