
Commit c003442: Add paramax
1 parent 36f8116

File tree: 2 files changed (+3 −1 lines)

README.md (2 additions & 1 deletion)

```diff
@@ -89,7 +89,8 @@ If you found this library to be useful in academic work, then please cite: ([arX
 **Deep learning**
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 [Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
-[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[paramax](https://github.com/danielward27/paramax): parameterizations and constraints for PyTrees.
 
 **Scientific computing**
 [Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
```

docs/index.md (1 addition & 0 deletions)

```diff
@@ -79,6 +79,7 @@ If this quick start has got you interested, then have a read of [All of Equinox]
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 [Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 [Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[paramax](https://github.com/danielward27/paramax): parameterizations and constraints for PyTrees.
 
 **Scientific computing**
 [Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
```

0 commit comments
