diff --git a/README.md b/README.md
index 65feb39..671f8a6 100644
--- a/README.md
+++ b/README.md
@@ -48,6 +48,7 @@ Available at [https://docs.kidger.site/jaxtyping](https://docs.kidger.site/jaxty
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 [Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 [Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[paramax](https://github.com/danielward27/paramax): parameterizations and constraints for PyTrees.
 
 **Scientific computing**
 [Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
diff --git a/docs/index.md b/docs/index.md
index 854518b..5c3f06d 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -50,6 +50,7 @@ Have a read of the [Array annotations](./api/array.md) documentation on the left
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 [Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 [Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[paramax](https://github.com/danielward27/paramax): parameterizations and constraints for PyTrees.
 
 **Scientific computing**
 [Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.