A library providing type annotations and runtime type-checking for the shape and dtype of JAX/PyTorch/NumPy/MLX/TensorFlow arrays and tensors.
The name 'jaxtyping' is now historical: the library supports all of the above and has no JAX dependency!
from jaxtyping import Float
from torch import Tensor

# Accepts floating-point 2D arrays with matching axes
def matrix_multiply(x: Float[Tensor, "dim1 dim2"],
                    y: Float[Tensor, "dim2 dim3"]
                    ) -> Float[Tensor, "dim1 dim3"]:
    ...

Installation

pip install jaxtyping

Requires Python 3.10+.
The annotations provided by jaxtyping are compatible with runtime type-checking packages, so it is common to also install one of these. The two most popular are typeguard (which exhaustively checks every argument) and beartype (which checks random pieces of arguments).
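As a minimal sketch of how the two fit together, the example below pairs jaxtyping annotations with beartype via jaxtyping's jaxtyped decorator, so shapes and dtypes are validated at call time (the function body here is illustrative):

# A minimal sketch: jaxtyping annotations checked at runtime by beartype.
from beartype import beartype
from jaxtyping import Float, jaxtyped
from torch import Tensor

@jaxtyped(typechecker=beartype)
def matrix_multiply(x: Float[Tensor, "dim1 dim2"],
                    y: Float[Tensor, "dim2 dim3"]
                    ) -> Float[Tensor, "dim1 dim3"]:
    return x @ y

# Calling this with mismatched shapes, e.g. a (3, 4) and a (5, 6) tensor,
# raises a type-checking error instead of failing silently downstream.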
Documentation is available at https://docs.kidger.site/jaxtyping.
See also: other libraries in the JAX ecosystem

Always useful
Equinox: neural networks and everything not already in core JAX!
Deep learning
Optax: first-order gradient (SGD, Adam, ...) optimisers.
Orbax: checkpointing (async/multi-host/multi-device).
Levanter: scalable+reliable training of foundation models (e.g. LLMs).
paramax: parameterizations and constraints for PyTrees.
Scientific computing
Diffrax: numerical differential equation solvers.
Optimistix: root finding, minimisation, fixed points, and least squares.
Lineax: linear solvers.
BlackJAX: probabilistic+Bayesian sampling.
sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.
PySR: symbolic regression. (Non-JAX honourable mention!)
Awesome JAX
Awesome JAX: a longer list of other JAX projects.