JAX | |
---|---|
JAX logo | |
Developer(s) | Google, Nvidia[1] |
Preview release | v0.4.31 / 30 July 2024
Repository | jax on GitHub |
Written in | Python, C++ |
Operating system | Linux, macOS, Windows |
Platform | Python, NumPy |
Size | 9.0 MB |
Type | Machine learning |
License | Apache 2.0 |
Website | jax.readthedocs.io
JAX is a Python library for accelerator-oriented array computation and program transformation, designed for high-performance numerical computing and large-scale machine learning. It is developed by Google with contributions from Nvidia and other community contributors.[2][3][4]
It is described as bringing together a modified version of autograd (automatic differentiation of native Python and NumPy functions to obtain gradient functions) and OpenXLA's XLA (Accelerated Linear Algebra). It is designed to follow the structure and workflow of NumPy as closely as possible and works with various existing frameworks such as TensorFlow and PyTorch.[5][6] The primary features of JAX, illustrated in the sketch after this list, are:[7]

* grad: automatic differentiation
* jit: just-in-time compilation via XLA
* vmap: automatic vectorization
* pmap: parallelization of computations across multiple accelerator devices
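A minimal sketch of how the first three transformations compose on an ordinary Python function (the function names `predict` and `loss` here are illustrative examples, not part of the JAX API):

```python
import jax
import jax.numpy as jnp

# A NumPy-style function written with jax.numpy, which mirrors the NumPy API.
def predict(w, x):
    return jnp.tanh(x @ w)

def loss(w, x, y):
    return jnp.mean((predict(w, x) - y) ** 2)

# grad: differentiate loss with respect to its first argument, w.
grad_loss = jax.grad(loss)

# jit: compile the gradient computation with XLA for CPU/GPU/TPU.
fast_grad = jax.jit(grad_loss)

# vmap: vectorize the per-example prediction over the leading axis of x.
batched_predict = jax.vmap(predict, in_axes=(None, 0))

w = jnp.ones(3)
x = jnp.arange(12.0).reshape(4, 3)
y = jnp.ones(4)

print(fast_grad(w, x, y))      # gradient of the mean-squared error w.r.t. w
print(batched_predict(w, x))   # one prediction per row of x
```

Because these transformations are composable, expressions such as `jax.jit(jax.vmap(jax.grad(loss)))` are also valid; pmap follows the same pattern but requires multiple accelerator devices to demonstrate.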