Google JAX

Google JAX is a machine learning framework for transforming numerical functions in Python. It is described as bringing together a modified version of autograd (automatic differentiation of Python functions to obtain their gradient functions) and TensorFlow's XLA (Accelerated Linear Algebra) compiler. It is designed to follow the structure and workflow of NumPy as closely as possible and works with various existing frameworks such as TensorFlow and PyTorch. The primary functions of JAX, each illustrated by a short sketch after the list, are:

  1. grad: automatic differentiation
  2. jit: just-in-time compilation
  3. vmap: automatic vectorization
  4. pmap: SPMD (single-program, multiple-data) parallelization across devices
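
A minimal sketch of grad, assuming JAX is installed; the sum-of-squares loss is an illustrative choice, not taken from the article:

```python
import jax
import jax.numpy as jnp

def loss(x):
    # scalar-valued function of a vector input
    return jnp.sum(x ** 2)

grad_loss = jax.grad(loss)  # returns the gradient function, here x -> 2x
print(grad_loss(jnp.array([1.0, 2.0, 3.0])))  # [2. 4. 6.]
```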
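A sketch of jit, which compiles a function with XLA on its first call for the given input shapes and dtypes; the SELU activation below is just an example workload:

```python
import jax
import jax.numpy as jnp

@jax.jit
def selu(x, alpha=1.67, lam=1.05):
    # traced and compiled once per input shape/dtype, then reused
    return lam * jnp.where(x > 0, x, alpha * (jnp.exp(x) - 1))

print(selu(jnp.arange(5.0)))
```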
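A sketch of vmap, which maps a function written for single examples over a batch axis without an explicit Python loop; the dot-product function and batch size are illustrative:

```python
import jax
import jax.numpy as jnp

def dot(a, b):
    return jnp.dot(a, b)  # scalar result for two vectors

# vectorize over the leading (batch) axis of both arguments
batched_dot = jax.vmap(dot)

a = jnp.ones((8, 3))
b = jnp.ones((8, 3))
print(batched_dot(a, b).shape)  # (8,)
```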
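A sketch of pmap, which replicates the same program across the available accelerator devices and maps it over the leading axis of the input (SPMD); on a machine with a single device this still runs, with one shard:

```python
import jax
import jax.numpy as jnp

n = jax.device_count()                    # number of available devices
xs = jnp.arange(n * 4.0).reshape(n, 4)    # one shard of data per device

per_device_sum = jax.pmap(lambda x: jnp.sum(x))
print(per_device_sum(xs))                 # one partial sum per device
```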
JAX
Developer(s): Google
Stable release: 0.4.24 (6 February 2024)
Repository: github.com/google/jax
Written in: Python, C++
Operating system: Linux, macOS, Windows
Platform: Python, NumPy
Size: 9.0 MB
Type: Machine learning
License: Apache 2.0
Website: jax.readthedocs.io/en/latest/