JAX (software)

JAX
Developer(s): Google, Nvidia[1]
Preview release: v0.4.31 / 30 July 2024 (2024-07-30)
Repository: jax on GitHub
Written in: Python, C++
Operating system: Linux, macOS, Windows
Platform: Python, NumPy
Size: 9.0 MB
Type: Machine learning
License: Apache 2.0
Website: jax.readthedocs.io/en/latest/

JAX is a Python library for accelerator-oriented array computation and program transformation, designed for high-performance numerical computing and large-scale machine learning. It is developed by Google with contributions from Nvidia and other community contributors.[2][3][4]

It is described as bringing together a modified version of Autograd (automatic differentiation of Python functions to obtain their gradient functions) and OpenXLA's XLA (Accelerated Linear Algebra) compiler. It is designed to follow the structure and workflow of NumPy as closely as possible and works with existing frameworks such as TensorFlow and PyTorch.[5][6] The primary features of JAX are:[7]

  1. Providing a unified NumPy-like interface to computations that run on CPU, GPU, or TPU, in local or distributed settings.
  2. Built-in Just-In-Time (JIT) compilation via OpenXLA, an open-source machine learning compiler ecosystem.
  3. Efficient evaluation of gradients via its automatic differentiation transformations.
  4. Automatic vectorization of functions, efficiently mapping them over arrays representing batches of inputs (illustrated in the sketch after this list).
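
The following is a minimal sketch of these features using the public JAX API (jax.numpy, jax.jit, jax.grad and jax.vmap); the function predict, the loss function and the sample data are hypothetical illustrations rather than examples from the JAX documentation.

  import jax
  import jax.numpy as jnp

  def predict(w, x):
      # NumPy-like code; the same expression runs on CPU, GPU, or TPU.
      return jnp.tanh(jnp.dot(x, w))

  def loss(w, x, y):
      return jnp.mean((predict(w, x) - y) ** 2)

  # Automatic differentiation: gradient of the loss with respect to w.
  grad_loss = jax.grad(loss)

  # JIT compilation: the gradient computation is compiled via XLA on first call.
  fast_grad = jax.jit(grad_loss)

  # Automatic vectorization: map predict over a batch of inputs without explicit loops.
  batched_predict = jax.vmap(predict, in_axes=(None, 0))

  # Hypothetical sample data for illustration.
  w = jnp.ones((3,))
  x = jnp.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
  y = jnp.array([1.0, 0.0])

  print(fast_grad(w, x, y))      # gradient of the loss with respect to w
  print(batched_predict(w, x))   # per-example predictions for the batch

The transformations compose: for example, jax.jit(jax.vmap(predict, in_axes=(None, 0))) would both vectorize and compile the same function.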
  1. ^ "jax/AUTHORS at main · jax-ml/jax". GitHub. Retrieved December 21, 2024.
  2. ^ Bradbury, James; Frostig, Roy; Hawkins, Peter; Johnson, Matthew James; Leary, Chris; MacLaurin, Dougal; Necula, George; Paszke, Adam; Vanderplas, Jake; Wanderman-Milne, Skye; Zhang, Qiao (2022-06-18), "JAX: Autograd and XLA", Astrophysics Source Code Library, Google, Bibcode:2021ascl.soft11002B, archived from the original on 2022-06-18, retrieved 2022-06-18
  3. ^ Frostig, Roy; Johnson, Matthew James; Leary, Chris (2018-02-02). "Compiling machine learning programs via high-level tracing" (PDF). MLsys: 1–3. Archived (PDF) from the original on 2022-06-21.
  4. ^ "Using JAX to accelerate our research". www.deepmind.com. Archived from the original on 2022-06-18. Retrieved 2022-06-18.
  5. ^ Lynley, Matthew. "Google is quietly replacing the backbone of its AI product strategy after its last big push for dominance got overshadowed by Meta". Business Insider. Archived from the original on 2022-06-21. Retrieved 2022-06-21.
  6. ^ "Why is Google's JAX so popular?". Analytics India Magazine. 2022-04-25. Archived from the original on 2022-06-18. Retrieved 2022-06-18.
  7. ^ "Quickstart — JAX documentation".