Thursday, September 12, 4:30-5:30 pm, 120 Lewis Science Library
JAX: Accelerated machine-learning research via composable function transformations in Python
Peter Hawkins, Google AI Princeton
JAX is a system for high-performance machine learning research. It offers the familiarity of Python+NumPy and the speed of hardware accelerators, and it enables defining and composing function transformations useful for machine-learning programs. In particular, these transformations include automatic differentiation, automatic batching, end-to-end compilation (via XLA), and even parallelization over multiple accelerators. They are the key to JAX's power and to its relative simplicity.
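As a minimal sketch of what "composable transformations" means in practice, the snippet below (assuming JAX is installed; the loss function and shapes are illustrative, not from the talk) composes `jax.grad` (automatic differentiation), `jax.jit` (XLA compilation), and `jax.vmap` (automatic batching):

```python
# Minimal sketch of composing JAX transformations.
import jax
import jax.numpy as jnp

def loss(w, x):
    # A simple scalar loss: squared norm of a linear map.
    return jnp.sum((x @ w) ** 2)

# Automatic differentiation: gradient of loss with respect to w.
grad_loss = jax.grad(loss)

# End-to-end compilation via XLA, composed with differentiation.
fast_grad = jax.jit(grad_loss)

# Automatic batching: map the gradient over a batch of inputs x,
# holding w fixed (in_axes=(None, 0)).
batched_grad = jax.vmap(grad_loss, in_axes=(None, 0))

w = jnp.ones((3, 2))
x = jnp.ones((4, 3))         # a batch of 4 inputs
g = fast_grad(w, x)          # gradient with the same shape as w
gs = batched_grad(w, x)      # one gradient per batch element
```

Because each transformation returns an ordinary Python function, they can be nested freely, e.g. `jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0)))`.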
JAX had its initial open-source release in December 2018 (https://github.com/google/jax). It is currently being used by several groups of researchers for a wide range of advanced applications, from studying the spectra of neural networks to probabilistic programming, Monte Carlo methods, and scientific applications in physics and biology. Users appreciate JAX most of all for its ease of use and flexibility.
This talk is an introduction to JAX and a description of some of its technical aspects. It also includes a discussion of current strengths and limitations, and of our plans for the near future, which may be of interest to potential users and contributors.
Pizza will be served. Please RSVP for this talk by emailing Andrea Rubinstein (email@example.com).
Thursday, October 17, 4:30-5:30 pm, 138 Lewis Science Library
July 2019
Selene: A PyTorch-based Deep Learning Library for Sequence Data by Kathleen Chen
Big data of big tissues: deep neural networks to accelerate analysis of collective cell behaviors in large populations by Julienne LaChance
GPU Computing with R and Keras by Danny Simpson
Announcements and TensorFlow 2 (beta) by Jonathan Halverson
June 2019
Opportunities and challenges in self-driving cars at NVIDIA by Timur Rvachov (slides not available)
Training deep convolutional neural networks by Michael Churchill
Deep Learning Frameworks at Princeton by Jonathan Halverson
The TensorFlow & PyTorch User Group will serve as a campus-wide platform for deep learning researchers to connect with one another and discuss their work and their use of these tools. Each meeting will feature one or more talks with ample time for discussion and networking. Coding help will be available. The group is open to all members of the Princeton University research community, including undergraduate students.
The group is sponsored by the Princeton Institute for Computational Science and Engineering (PICSciE) and the Center for Statistics and Machine Learning (CSML).
For more information, please contact Jonathan Halverson (firstname.lastname@example.org).