
Commit 6008a00

docs: v2.10.0 changelog
1 parent 30b7a25

File tree

1 file changed: +5 -3 lines changed


docs/changelogs/v.2.10.0.md

Lines changed: 5 additions & 3 deletions
@@ -4,13 +4,15 @@
 
 * Implement Amos optimizer (#174)
     * [An Adam-style Optimizer with Adaptive Weight Decay towards Model-Oriented Scale](https://arxiv.org/abs/2210.11693)
-* Implement SingSGD optimizer (#176) (thanks to @i404788)
+* Implement SignSGD optimizer (#176) (thanks to @i404788)
     * [Compressed Optimisation for Non-Convex Problems](https://arxiv.org/abs/1802.04434)
 * Implement AdaHessian optimizer (#176) (thanks to @i404788)
     * [An Adaptive Second Order Optimizer for Machine Learning](https://arxiv.org/abs/2006.00719)
-* Implement SophiaH optimizer (#176) (thanks to @i404788)
+* Implement SophiaH optimizer (#173, #176) (thanks to @i404788)
     * [A Scalable Stochastic Second-order Optimizer for Language Model Pre-training](https://arxiv.org/abs/2305.14342)
-* Implement re-usable tools to compute hessian in `BaseOptimizer` (#176)
+* Implement re-usable functions to compute hessian in `BaseOptimizer` (#176, #177) (thanks to @i404788)
+    * two types of distribution are supported (`gaussian`, `rademacher`).
+* Support `AdamD` variant for AdaHessian optimizer (#177)
 
 ### Diff
 
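The re-usable Hessian functions mentioned in this diff (#176, #177) refer to Hutchinson-style stochastic estimation of the Hessian diagonal, the quantity AdaHessian and SophiaH both build on, with the two sampling distributions noted above (`gaussian`, `rademacher`). The snippet below is a minimal PyTorch sketch of that technique only; the name `hutchinson_hessian_diag` and its signature are illustrative, not the actual `BaseOptimizer` API.

```python
import torch


def hutchinson_hessian_diag(loss, params, num_samples=1, distribution="rademacher"):
    """Estimate diag(H) as E[z * (H @ z)] for random z with E[z z^T] = I.

    Illustrative helper only; not the library's actual API.
    """
    # First-order gradients, with create_graph=True so we can differentiate again.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    estimates = [torch.zeros_like(p) for p in params]
    for _ in range(num_samples):
        if distribution == "rademacher":
            # Random +/-1 entries.
            zs = [torch.randint_like(p, 0, 2) * 2.0 - 1.0 for p in params]
        elif distribution == "gaussian":
            zs = [torch.randn_like(p) for p in params]
        else:
            raise ValueError(f"unknown distribution: {distribution}")
        # Hessian-vector products H @ z via double backward.
        hzs = torch.autograd.grad(grads, params, grad_outputs=zs, retain_graph=True)
        for est, z, hz in zip(estimates, zs, hzs):
            est.add_(z * hz / num_samples)
    return estimates


# Sanity check: for loss = sum(w_i^2), the Hessian diagonal is exactly 2.
w = torch.randn(3, requires_grad=True)
loss = (w ** 2).sum()
print(hutchinson_hessian_diag(loss, [w])[0])  # tensor([2., 2., 2.])
```

For the diagonal entries, Rademacher vectors contribute `z_i ** 2 == 1` exactly, so the estimate has no variance when the Hessian is diagonal, as the sanity check at the end illustrates; Gaussian sampling trades that for smoother behavior on dense Hessians.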