JAX: Accelerated machine learning research via composable function transformations in Python

Thu, Sep 12, 2019, 4:30 pm to 5:30 pm
Lewis Library 120
PICSciE and Research Computing

JAX is a system for high-performance machine learning research. It offers the familiarity of Python+NumPy and the speed of hardware accelerators, and it enables the definition and composition of function transformations useful for machine-learning programs. In particular, these transformations include automatic differentiation, automatic batching, end-to-end compilation (via XLA), and even parallelization over multiple accelerators. They are the key to JAX's power and to its relative simplicity.
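The transformations named above compose freely; a minimal sketch (the `loss` function here is an illustrative example, not from the talk):

```python
# Composing JAX transformations: grad (autodiff), vmap (auto-batching),
# and jit (end-to-end XLA compilation). Any combination composes.
import jax
import jax.numpy as jnp

def loss(w, x):
    # A simple quadratic loss, purely for illustration.
    return jnp.sum((jnp.dot(x, w) - 1.0) ** 2)

grad_loss = jax.grad(loss)                          # d(loss)/dw for one example
batched = jax.vmap(grad_loss, in_axes=(None, 0))    # map over a batch of x
fast = jax.jit(batched)                             # compile the whole pipeline

w = jnp.ones(3)
xs = jnp.arange(6.0).reshape(2, 3)                  # a batch of two inputs
g = fast(w, xs)
print(g.shape)                                      # (2, 3): one gradient per example
```

Because each transformation returns an ordinary Python function, they can be stacked in any order, which is what the talk means by composability.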

JAX had its initial open-source release in December 2018 (https://github.com/google/jax). It is currently being used by several groups of researchers for a wide range of advanced applications, from studying the spectra of neural networks to probabilistic programming, Monte Carlo methods, and scientific applications in physics and biology. Users appreciate JAX most of all for its ease of use and flexibility.

This talk is an introduction to JAX and a description of some of its technical aspects. It also includes a discussion of current strengths and limitations, and of our plans for the near future, which may be of interest to potential users and contributors.


Questions? Email halverson@princeton.edu
