Getting started

jaxtyping is a library providing type annotations and runtime type-checking for:

1. the shape and dtype of JAX arrays; and
2. PyTrees.

(Now also supports PyTorch, NumPy, and TensorFlow!)
Installation

```
pip install jaxtyping
```
Requires Python 3.9+.
JAX is an optional dependency, required for a few JAX-specific types. If JAX is not installed then these will not be available, but you may still use jaxtyping to provide shape/dtype annotations for PyTorch/NumPy/TensorFlow/etc.
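For example (a minimal sketch, assuming NumPy is installed; `normalise` is just an illustrative name), a NumPy array can be annotated directly without JAX:

```python
import numpy as np
from jaxtyping import Float

def normalise(x: Float[np.ndarray, "channels"]) -> Float[np.ndarray, "channels"]:
    # The annotation documents a 1D floating-point array; pair it with a
    # runtime type-checker (see below) to have it enforced at call time.
    return x / np.linalg.norm(x)
```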
The annotations provided by jaxtyping are compatible with runtime type-checking packages, so it is common to also install one of these. The two most popular are typeguard (which exhaustively checks every part of each argument) and beartype (which performs fast checks by sampling random pieces of each argument).
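For instance (a minimal sketch assuming beartype is installed; `standardise` is just an illustrative name), annotations can be enforced at call time by combining a type-checker with jaxtyping's `jaxtyped` decorator:

```python
import jax.numpy as jnp
from beartype import beartype
from jaxtyping import Array, Float, jaxtyped

@jaxtyped(typechecker=beartype)
def standardise(x: Float[Array, "batch dim"]) -> Float[Array, "batch dim"]:
    # Both the argument and the return value are checked against their
    # shape/dtype annotations at call time; a mismatch raises an error.
    return (x - x.mean()) / x.std()

standardise(jnp.ones((4, 3)))   # OK: a floating-point 2D array
# standardise(jnp.ones(3))      # would raise: missing the "batch" axis
```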
Example
```python
from jaxtyping import Array, Float, PyTree

# Accepts floating-point 2D arrays with matching axes
def matrix_multiply(x: Float[Array, "dim1 dim2"],
                    y: Float[Array, "dim2 dim3"]
                    ) -> Float[Array, "dim1 dim3"]:
    ...

def accepts_pytree_of_ints(x: PyTree[int]):
    ...

def accepts_pytree_of_arrays(x: PyTree[Float[Array, "batch c1 c2"]]):
    ...
```
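Here, `PyTree[int]` matches any pytree whose leaves are all Python ints, so (hypothetical calls, purely for illustration) either of the following would satisfy the annotation:

```python
accepts_pytree_of_ints(3)                      # a single leaf
accepts_pytree_of_ints({"a": 1, "b": (2, 3)})  # a nested dict/tuple of ints
```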
Next steps
Have a read of the Array annotations documentation, linked in the left-hand sidebar!
See also: other libraries in the JAX ecosystem

Always useful

- Equinox: neural networks and everything not already in core JAX!

Deep learning

- Optax: first-order gradient (SGD, Adam, ...) optimisers.
- Orbax: checkpointing (async/multi-host/multi-device).
- Levanter: scalable+reliable training of foundation models (e.g. LLMs).

Scientific computing

- Diffrax: numerical differential equation solvers.
- Optimistix: root finding, minimisation, fixed points, and least squares.
- Lineax: linear solvers.
- BlackJAX: probabilistic+Bayesian sampling.
- sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.
- PySR: symbolic regression. (Non-JAX honourable mention!)

Awesome JAX

- Awesome JAX: a longer list of other JAX projects.