TensorFlow and PyTorch User Group

Thursday, July 25, 2019 at 4:30 pm in Lewis Science Library 138


There will be one lightning talk and two 20-minute talks.

GPU Computing with R and Keras

Danny Simpson, Graduate Student, Lewis-Sigler Institute for Integrative Genomics (LSI), advised by Britt Adamson and Barbara Engelhardt, Princeton University

The R interface to Keras combines Keras's high-level approach to designing deep neural networks with R's straightforward data-processing capabilities. However, although Keras can run the same code on either CPU or GPU nodes, GPU computing through this interface requires changes to both the script and the working environment. In this lightning talk, I present the MNIST classification tutorial using the R interface to Keras, run on a GPU node of Princeton's Adroit cluster.
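As a preview, below is a minimal sketch of the same MNIST workflow, written in Python Keras for illustration (the R interface mirrors these calls nearly one-to-one; the layer sizes and hyperparameters here are placeholder choices, not the talk's). Note that the model code itself is device-agnostic: whether it trains on CPU or GPU is determined by the TensorFlow build and the node the job runs on.

    # Minimal Keras MNIST sketch (Python shown for illustration; the R
    # interface exposes near-identical functions). The model code is
    # device-agnostic: CPU vs. GPU is decided by the environment.
    from tensorflow import keras

    (x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
    x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
    x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

    model = keras.Sequential([
        keras.layers.Dense(256, activation="relu", input_shape=(784,)),
        keras.layers.Dropout(0.4),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="rmsprop",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, batch_size=128,
              validation_split=0.1)
    print(model.evaluate(x_test, y_test))  # [test loss, test accuracy]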

Big Data of Big Tissues: Deep Neural Networks to Accelerate Analysis of Collective Cell Behaviors

Julienne LaChance, Graduate Student, Mechanical and Aerospace Engineering (MAE), advised by Daniel Cohen, Princeton University

Coordinated cellular motion is crucial for proper tissue organization and function. The analysis of massive, living tissues can provide key insights into these behaviors but requires more versatile feature extraction approaches. Convolutional neural networks (CNNs) have been gaining in popularity among cell biologists for tasks such as object classification (as in the identification of cell phenotypes) and segmentation (for the detection of individual cells or nuclei). I will demonstrate the use of a U-Net style architecture for reconstructing UV-excited labels directly from low-magnification transmitted light images of cells. I will also present preliminary findings on the prediction of complex cell behaviors using deeper CNNs.
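As a rough illustration of this architecture family (a placeholder model, not the one from the talk), the sketch below shows a miniature U-Net-style encoder-decoder in PyTorch that maps a one-channel transmitted-light image to a one-channel predicted label image; the depth and channel widths are illustrative assumptions.

    # A miniature U-Net-style encoder-decoder in PyTorch: downsample,
    # upsample, and concatenate a skip connection from the encoder.
    # Channel sizes and depth are illustrative, not the talk's model.
    import torch
    import torch.nn as nn

    def conv_block(in_ch, out_ch):
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    class TinyUNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.enc1 = conv_block(1, 16)      # transmitted-light input
            self.enc2 = conv_block(16, 32)
            self.pool = nn.MaxPool2d(2)
            self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
            self.dec1 = conv_block(32, 16)     # 16 skip + 16 upsampled channels
            self.head = nn.Conv2d(16, 1, 1)    # predicted label image

        def forward(self, x):
            e1 = self.enc1(x)                  # full-resolution features
            e2 = self.enc2(self.pool(e1))      # half-resolution features
            d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # skip connection
            return self.head(d1)

    model = TinyUNet()
    pred = model(torch.randn(1, 1, 128, 128))  # (batch, channel, H, W)
    print(pred.shape)                          # torch.Size([1, 1, 128, 128])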


Selene: A PyTorch-based Deep Learning Library for Sequence Data

Kathleen Chen, Data Scientist, Center for Computational Biology, Flatiron Institute, Simons Foundation, New York, NY

Selene is a PyTorch-based deep learning library for fast and easy development, training, and application of deep learning model architectures for any biological sequences [1]. Selene was developed to increase the accessibility of deep learning in biology and facilitate the creation of reproducible workflows and results. The library contains modules for data sampling and training for model development, as well as prediction and visualization for analyses using a trained model. Furthermore, Selene is open-source software that will continue to be updated and expanded based on community feedback. In this talk, I will discuss how we designed Selene to support sequence-based deep learning across a broad range of biological questions and made the library accessible to users at different levels of computational proficiency. I will also explain the motivation for creating Selene and give some examples of how we currently use Selene in our research projects.

[1] K. M. Chen, E. M. Cofer, J. Zhou & O. G. Troyanskaya, "Selene: a PyTorch-based deep learning library for sequence-level data," Nature Methods 16, 315–318 (2019).
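To make the problem setting concrete, here is a self-contained PyTorch sketch of the kind of sequence-based model Selene is designed to develop and train: one-hot-encoded DNA goes in, per-sequence label scores come out. This illustrates the task only and is not Selene's own API; Selene wraps the data sampling, training, and evaluation around models like this in configurable modules.

    # Sketch of a sequence-based model of the kind Selene trains:
    # one-hot DNA in, a per-sequence score out. Illustrative only;
    # this is not Selene's API.
    import torch
    import torch.nn as nn

    BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

    def one_hot(seq):
        """Encode a DNA string as a (4, len) float tensor."""
        x = torch.zeros(4, len(seq))
        for i, base in enumerate(seq):
            x[BASES[base], i] = 1.0
        return x

    model = nn.Sequential(
        nn.Conv1d(4, 32, kernel_size=8),   # scan for short sequence motifs
        nn.ReLU(),
        nn.AdaptiveMaxPool1d(1),           # strongest match per filter
        nn.Flatten(),
        nn.Linear(32, 1),                  # score, e.g. for a regulatory feature
    )

    batch = torch.stack([one_hot("ACGTACGTACGTACGT"),
                         one_hot("TTTTCCCCGGGGAAAA")])
    print(model(batch).shape)              # torch.Size([2, 1])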


Pizza will be served. Please RSVP for this talk by emailing Andrea Rubinstein (alrubins@princeton.edu).



Upcoming meeting

Thursday, September 12, 4:30-5:30 pm, 120 Lewis Science Library 
Pizza will be served.

JAX: Accelerated Machine-Learning Research via Composable Function Transformations in Python

Peter Hawkins, Google AI Princeton

JAX is a system for high-performance machine-learning research. It offers the familiarity of Python+NumPy and the speed of hardware accelerators, and it enables the definition and composition of function transformations useful for machine-learning programs. These transformations include automatic differentiation, automatic batching, end-to-end compilation (via XLA), and even parallelization over multiple accelerators. They are the key to JAX's power and to its relative simplicity.
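As a toy illustration (not drawn from the talk), these transformations compose freely on an ordinary Python function:

    # Composing JAX transformations on a toy loss: grad for automatic
    # differentiation, vmap for automatic batching, jit for end-to-end
    # XLA compilation. pmap plays the analogous role across accelerators.
    import jax
    import jax.numpy as jnp

    def loss(w, x):
        return jnp.sum((jnp.tanh(w * x) - 1.0) ** 2)

    dloss = jax.grad(loss)                        # d(loss)/dw
    batched = jax.vmap(dloss, in_axes=(None, 0))  # batch over x, share w
    fast = jax.jit(batched)                       # compile the whole pipeline

    xs = jnp.linspace(-1.0, 1.0, 8)
    print(fast(0.5, xs))                          # eight per-example gradients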

JAX had its initial open-source release in December 2018 (https://github.com/google/jax). It is currently being used by several groups of researchers for a wide range of advanced applications, from studying the spectra of neural networks, to probabilistic programming and Monte Carlo methods, to scientific applications in physics and biology. Users appreciate JAX most of all for its ease of use and flexibility.

This talk is an introduction to JAX and a description of some of its technical aspects. It also includes a discussion of current strengths and limitations, and of our plans for the near future, which may be of interest to potential users and contributors.




Previous meetings

June 2019

Deep Learning Frameworks at Princeton by Jonathan Halverson
Training deep convolutional neural networks by Michael Churchill
Opportunities and challenges in self-driving cars at NVIDIA by Timur Rvachov (slides not available)


About

The TensorFlow & PyTorch User Group will serve as a campus-wide platform for deep learning researchers to connect with one another to discuss their work and their use of these tools. Each meeting will feature one or more talks with ample time for discussion and networking. Coding help will be available. The group is open to all members of the Princeton University research community, including undergraduate students.

The group is sponsored by the Princeton Institute for Computational Science and Engineering (PICSciE) and the Center for Statistics and Machine Learning (CSML).


Contact

For more information, please contact Jonathan Halverson (halverson@princeton.edu).