JAX can make ML programming more organized, methodical, and logical. Although fundamentally different in architecture, it could eventually take the place of PyTorch and TensorFlow.
Machine learning researchers are enthusiastic about JAX because it keeps machine learning code simple, organized, and tidy, and because it provides a set of composable function transformations.
Let’s look at a few fresh JAX libraries today:
NumPyro
NumPyro is a lightweight probabilistic programming library that gives Pyro a NumPy backend. It relies on JAX for automatic differentiation and JIT compilation to GPU/CPU. NumPyro is still under development, so there may be rough edges and API changes as the design evolves.
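As a quick illustration, here is a minimal sketch of a NumPyro model with hypothetical toy data: a normal prior over an unknown mean, with posterior samples drawn via the NUTS sampler.

```python
# Minimal NumPyro sketch (toy data): infer the mean of noisy observations.
import jax.numpy as jnp
from jax import random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(obs):
    # Prior over the unknown mean, likelihood over the observations.
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    numpyro.sample("obs", dist.Normal(mu, 1.0), obs=obs)

obs = jnp.array([0.9, 1.1, 1.3, 0.7])
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(random.PRNGKey(0), obs=obs)
mcmc.print_summary()
```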
Jax-unirep
The UniRep model was created in George Church’s lab; the original paper is available as a bioRxiv preprint and in Nature Methods.
This repository is a flexible and self-contained reimplementation of the UniRep model. It also provides other useful APIs that support protein engineering workflows.
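Below is a hedged usage sketch, assuming the `get_reps` helper described in the project’s README, which embeds amino-acid sequences into UniRep representations; the toy sequences are made up for illustration.

```python
# Hedged sketch: embedding protein sequences with jax-unirep's get_reps helper
# (function name and return values assumed from the project's documented API).
from jax_unirep import get_reps

sequences = ["MKVLAT", "MRPLIT"]  # toy amino-acid sequences

# h_avg: mean hidden state per sequence (the usual UniRep embedding),
# h_final / c_final: final hidden and cell states of the mLSTM.
h_avg, h_final, c_final = get_reps(sequences)
print(h_avg.shape)
```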
TensorLy
TensorLy is a Python package that aims to make working with tensors simple. It makes tensor decomposition, tensor learning, and tensor algebra easy, and its backend system lets you run methods and perform computations at scale on CPU or GPU using NumPy, PyTorch, JAX, MXNet, TensorFlow, or CuPy.
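Here is a minimal sketch of that backend system in action, assuming a recent TensorLy version where `parafac` returns a `(weights, factors)` CP tensor: switch to the JAX backend and run a small CP decomposition.

```python
# Minimal TensorLy sketch: use the JAX backend and run a CP (PARAFAC) decomposition.
import tensorly as tl
from tensorly.decomposition import parafac

tl.set_backend("jax")  # computations now run on JAX arrays

tensor = tl.tensor([[[1.0, 2.0], [3.0, 4.0]],
                    [[5.0, 6.0], [7.0, 8.0]]])

# Rank-2 CP decomposition; returns weights and a list of factor matrices.
weights, factors = parafac(tensor, rank=2)
reconstruction = tl.cp_to_tensor((weights, factors))
```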
Fortuna
Fortuna is a library for uncertainty quantification that makes it simple for users to run benchmarks and deploy systems in real-world applications. Calibration and conformal methods in Fortuna can be applied to models that have already been trained in any framework, and it also offers several Bayesian inference techniques for deep learning models written in Flax. The library is easy to pick up for those new to uncertainty quantification and can be configured in a variety of ways.
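To give a sense of what conformal methods provide, here is an illustrative sketch in plain JAX (not Fortuna’s API): split conformal prediction, which turns held-out calibration errors into distribution-free prediction intervals.

```python
# Illustrative sketch (not Fortuna's API): split conformal intervals in plain JAX.
import jax.numpy as jnp

def conformal_interval(cal_preds, cal_targets, test_preds, alpha=0.1):
    # Nonconformity scores on a held-out calibration set.
    scores = jnp.abs(cal_targets - cal_preds)
    n = scores.shape[0]
    # Quantile level with the standard finite-sample correction.
    q_level = jnp.clip(jnp.ceil((n + 1) * (1 - alpha)) / n, 0.0, 1.0)
    q = jnp.quantile(scores, q_level)
    return test_preds - q, test_preds + q

lower, upper = conformal_interval(
    cal_preds=jnp.array([0.9, 1.8, 3.1]),
    cal_targets=jnp.array([1.0, 2.0, 3.0]),
    test_preds=jnp.array([2.5]),
)
```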
Cvxpylayers
cvxpylayers is a Python package for building differentiable convex optimization layers in PyTorch, JAX, and TensorFlow using CVXPY. In the forward pass, a convex optimization layer solves a parameterized convex optimization problem; in the backward pass, it computes the derivative of the solution with respect to the parameters.
The library accompanies the authors’ NeurIPS 2019 work on differentiable convex optimization layers.
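Here is a hedged sketch, assuming the `cvxpylayers.jax` interface: a small parameterized problem is solved in the forward pass, and the solution is differentiated with respect to the parameters A and b.

```python
# Hedged sketch, assuming the cvxpylayers.jax interface.
import cvxpy as cp
import jax
import jax.numpy as jnp
from cvxpylayers.jax import CvxpyLayer

n, m = 2, 3
x = cp.Variable(n)
A = cp.Parameter((m, n))
b = cp.Parameter(m)
problem = cp.Problem(cp.Minimize(cp.pnorm(A @ x - b, p=1)), [x >= 0])
assert problem.is_dpp()  # cvxpylayers requires disciplined parametrized programs

layer = CvxpyLayer(problem, parameters=[A, b], variables=[x])

key_a, key_b = jax.random.split(jax.random.PRNGKey(0))
A_val = jax.random.normal(key_a, (m, n))
b_val = jax.random.normal(key_b, (m,))

# Forward pass: solve the problem at the given parameter values.
solution, = layer(A_val, b_val)

# Backward pass: gradients of a scalar function of the solution w.r.t. A and b.
grad_A, grad_b = jax.grad(
    lambda A_, b_: jnp.sum(layer(A_, b_)[0]), argnums=(0, 1)
)(A_val, b_val)
```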
Optax
Optax is a JAX library for gradient processing and optimization. It is designed to facilitate research by providing building blocks that can be quickly and easily recombined in a variety of ways; a minimal usage sketch follows the list of objectives below.
Their objectives are to:
Deliver straightforward, tried-and-true implementations of the fundamental components.
Improve research productivity by making it possible to combine these low-level components into custom optimizers (or other gradient processing components).
Make it simple for anyone to contribute, so that new ideas spread more quickly.
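The sketch below shows this composability on a toy quadratic loss: gradient clipping chained with Adam, then a single optimization step.

```python
# Minimal Optax sketch: compose gradient transformations and take one step.
import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
    return jnp.sum((params - 3.0) ** 2)

params = jnp.zeros(5)

# Chain gradient clipping with Adam; each piece is a composable transformation.
optimizer = optax.chain(optax.clip_by_global_norm(1.0), optax.adam(learning_rate=0.1))
opt_state = optimizer.init(params)

grads = jax.grad(loss_fn)(params)
updates, opt_state = optimizer.update(grads, opt_state, params)
params = optax.apply_updates(params, updates)
```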
Equivariant MLP
Equivariant MLP (EMLP) is a practical method for constructing equivariant multilayer perceptrons for arbitrary matrix groups. EMLP is a JAX library for deep learning that automatically generates equivariant layers.
It does the following:
Computes equivariant linear layers between finite-dimensional representations. You specify the symmetry group (discrete, continuous, non-compact, complex) and the representations (tensors, irreducible representations, induced representations, etc.).
Generates complete equivariant models automatically from little specification: if your inputs and outputs (and desired features) are a modest collection of elements such as scalars, vectors, tensors, and irreps with a total dimension of less than 1000, you may be able to use EMLP as a turnkey solution for generating the model, or at the very least as a reliable baseline, as in the sketch below.
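The following is a hedged sketch, assuming the `emlp` package’s documented objects (`V`, `T`, `SO`, and the `EMLP` constructor); the input dimension and layer sizes here are illustrative, not prescribed by the library.

```python
# Hedged sketch, assuming the emlp package's documented API:
# an SO(3)-equivariant MLP mapping four 3-vectors to a rank-2 tensor output.
import numpy as np
from emlp.reps import V, T
from emlp.groups import SO
import emlp.nn as nn

G = SO(3)
rep_in = 4 * V    # four 3-vectors as input (12 numbers total)
rep_out = T(2)    # a rank-2 tensor (3x3, flattened) as output

model = nn.EMLP(rep_in, rep_out, group=G, num_layers=3, ch=256)

x = np.random.randn(5, 12)  # hypothetical batch of 5 inputs of dimension 12
y = model(x)
```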