Earth’s Mysterious Interior Revealed by HPC-Powered “Cartographers”

April 19, 2018

The earliest maps of the world date to classical antiquity, an era when many still conceived of the Earth as flat. Our understanding has come a long way since then, and yet the ground beneath our feet has remained ever mysterious, until now. Using high-performance computing machines at the Princeton Institute for Computational Science and Engineering (PICSciE), as well as some of the fastest computers in the nation, including Titan at Oak Ridge National Laboratory, Professor Jeroen Tromp and his team are creating 3D, high-definition maps of the Earth’s mantle. Tromp is the Blair Professor of Geology and Professor of Geosciences and Applied and Computational Mathematics. In addition, he serves as Associate Director of PICSciE.

“We are like cartographers of the Earth’s interior,” Tromp explains. 

In part, this widening view into the Earth’s deepest reaches is powered by earthquakes and the behavior of the seismic waves they generate. New methods allow Tromp to assimilate data from every earthquake of magnitude 5.5 or greater recorded over the last ten years by the modern seismographic network.

Not only does this usher in a new era of insight into seismic activity at a global scale; it also allows Tromp to run simulations that constrain the Earth’s features based on the enormous amount of information hidden in those waves and in every structure with which they interact. Never before has such a project been possible: without supercomputers, the data is simply too vast to process.

“We use all of that data, I mean everything in the time series, to try and constrain earth’s structure,” Tromp says, his wonder evident. “It’s an astonishing time, really. We can do what’s called full waveform inversion, where every wiggle, every arrival, every wave that comes into a seismographic station, no matter how it’s traveled through the interior, can be used.”
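In a minimal mathematical sketch (the notation here is illustrative rather than drawn from Tromp’s publications), full waveform inversion seeks the Earth model m that minimizes the misfit between observed seismograms d and simulated ones u, summed over every earthquake s and receiving station r:

    \chi(\mathbf{m}) = \frac{1}{2} \sum_{s,r} \int_0^T \left\| \mathbf{u}(\mathbf{x}_r, t; \mathbf{m}) - \mathbf{d}(\mathbf{x}_r, t) \right\|^2 \, dt

Because the full time series enters the misfit, every recorded wiggle contributes. The gradient of \chi with respect to the model is obtained with the adjoint-state method, which requires only one forward and one adjoint wave simulation per earthquake rather than one per model parameter, and that economy is what makes assimilating millions of measurements computationally feasible.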

Tromp likens the emerging view to a CAT scan of the Earth’s mantle. The first maps were published in 2016.

The project found its roots in the work of Hejun Zhu, a former graduate student now working as a postdoc at UT Austin. Under Tromp’s guidance, Zhu launched an ambitious project to analyze data captured from 200 earthquakes in Europe.

“He really took these techniques that we’ve been working on to the next level,” Tromp says. “We imaged Europe, but we looked at many different things. So, wave-speed variation, but also what is called anisotropy, the directional dependence of the wave speed. Or a parameter that measures how waves are attenuated and lose energy, the quality factor.” 
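Both quantities have simple textbook characterizations. Anisotropy means the wave speed depends on the direction of propagation through the rock. For attenuation, a standard result (stated here for illustration, not quoted from the article) is that a wave of angular frequency \omega traveling for time t decays in amplitude as

    A(t) = A_0 \, \exp\!\left( -\frac{\omega t}{2Q} \right),

so a low quality factor Q means the medium absorbs energy quickly, while a high Q means waves pass through with little loss.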

The project involved 120,000 pieces of data. Similar work at an even larger scale followed, harnessing seismic data from China Array; that data set included 2.5 million pieces of information. Now, Tromp is applying the process at a global scale, which involves “massive, massive data assimilation.”

“The science has become fully numerical,” Tromp says. “We’ve made this transition from looking at very approximate methods and synthetic methods and sort of paper-and-pencil methods (very simplistic assumptions about the basic structure of the earth) to fully three-dimensional numerical methods.”

Data visualization also plays a large role. As Tromp sees it, “visualization is absolutely key for what we do. That’s how scientific discoveries are ultimately made.”

The implications of these discoveries go beyond satisfying a curiosity as old as mankind, an accomplishment in its own right. From better understanding earthquakes and modeling volcanic activity to focusing expensive drilling operations in the energy industry, many have a stake in gaining a better understanding of the world beneath us.

“Drilling a well can cost half a billion dollars,” Tromp says. “The more seismologists can do to help with that, and to image the subsurface as precisely as possible, that’s a very practical example of how these techniques are brought to bear.”

The data also provides a window into how the Earth evolved. 

“In those images, you can really see how one plate was pushed underneath another, what happened to that plate as it was pushed back into the mantle, where did it end up?” Tromp explains. “So these basic scientific questions, this curiosity-driven research: How does the Earth work as a planet, as a heat engine? Seismology is the only way to map what’s happening below the surface.”

On campus, Tromp’s team runs simulations involving several thousand cores. For larger projects, he leverages a major time award on Titan. In 2016, Tromp was granted 80 million processor hours on the national supercomputer to pursue a project titled “Global Adjoint Tomography.” Via parallel processing, Titan can run up to 20 quadrillion calculations per second.

Given his unique perspective on the revolutionary power of advanced research computing, Tromp is a tireless advocate for maintaining world-class high-performance computing resources, training, and education on campus. As Associate Director of PICSciE, he plays an active role in guiding the campus conversation on advancing groundbreaking research by providing cutting-edge computational resources.

“The facilities that we need range from what PICSciE has to offer, all the way to the largest computers in the world,” Tromp explains, noting the importance of Princeton’s support for HPC across fields. “All of my graduate students are heavily involved in HPC in one form or another,” he says. “Scientific computing is a critical part of their education.”

Learn more about PICSciE’s HPC resources, and visit the Theoretical & Computational Seismology group to further explore the research of Professor Tromp and his team.