M8 Earthquake Simulation Breaks Computational Records, Promises Better Quake Models

Nov. 18, 2010 -- A multi-disciplinary team of researchers presented the world’s most advanced earthquake shaking simulation at the Supercomputing 2010 (SC10) conference this week in New Orleans. The research was selected as a finalist for the Gordon Bell Prize, awarded at the annual conference for outstanding achievement in high-performance computing applications.

The "M8” simulation models how the ground will shake in a magnitude 8.0 earthquake on the southern San Andreas Fault. The simulation covers a larger area, in greater detail, than previously possible. Perhaps most importantly, the development of the M8 simulation advances the state-of-the-art in terms of the speed and efficiency at which such calculations can be performed.

The Southern California Earthquake Center (SCEC) at the University of Southern California (USC) was the lead coordinator in the project. San Diego Supercomputer Center (SDSC) researchers provided the high-performance computing and scientific visualization expertise for the simulation. Scientific details of the earthquake were developed by scientists at San Diego State University (SDSU). Ohio State University (OSU) researchers were also part of the collaborative effort to improve the efficiency of the software involved.

While an earthquake of magnitude 8.0 or larger in southern California is unlikely to occur (about a 2 percent chance in the next 30 years), the technological improvements required to produce this simulation now allow scientists to simulate other, more likely earthquake scenarios in much less time than previously required. Because such simulations are currently the most important and widespread application of high-performance computing for seismic hazard estimation, the SCEC team has focused on optimizing the technologies and codes needed to create them.

Funded through a number of National Science Foundation (NSF) grants, the M8 simulation was performed using supercomputing resources including NSF’s Kraken supercomputer at the National Institute for Computational Sciences (NICS) and the Department of Energy (DOE)-sponsored Jaguar supercomputer at the National Center for Computational Sciences (NCCS). The SCEC M8 simulation represents the latest in earthquake science and in computation at the petascale level, which refers to supercomputers capable of more than one quadrillion floating-point operations (calculations) per second.

"Petascale simulations such as this one are needed to understand the rupture and wave dynamics of the largest earthquakes, at shaking frequencies required to engineer safe structures,” said Thomas Jordan, director of SCEC and Principal Investigator for the project. Unlike previous simulations, which were useful only for modeling how tall structures will behave in earthquakes, the new simulation can be used to understand how a broader range of buildings will respond.

Animation of ground motion after the simulated magnitude 8 rupture has initiated on the San Andreas Fault near Parkfield. The waves are still shaking the Ventura, Los Angeles, and San Bernardino areas (due to reverberations in their underlying soft sedimentary basins) while the rupture is still in progress, and the wave fronts are rapidly approaching San Diego. Visualization by Amit Chourasia, San Diego Supercomputer Center, UC San Diego.

"The scientific results of this massive simulation are very interesting, and its level of detail has allowed us to observe things that we were not able to see in the past,” said Kim Olsen, professor of geological sciences at SDSU, and lead seismologist of the study. Olsen noted that high-rise buildings are more susceptible to the low-frequency, or a roller-coaster-like motion, while smaller structures usually suffer more damage from the higher-frequency shaking, which feels more like a series of sudden jolts.

"The problem we face in earthquake science is that we actually don’t have recordings of very large earthquakes needed to predict the ground motions of the next one,” Jordan said. The best option is then to create scenarios of such earthquakes based on as many details of the earth and the physics involved as possible, and for a broader a range of frequencies, as was done in the M8 simulation. The results are both broadly useful data as well basic research insights into earthquake processes.

"Earthquakes have a lot of variability,” Jordan said. "M8 is one particular guess of what an earthquake would look like. We are developing the technologies and resources to model many such scenarios.”

Given the massive number of calculations required, only the most advanced supercomputers are capable of producing such simulations in a reasonable time period. “This M8 simulation represents a milestone calculation, a breakthrough in seismology both in terms of computational size and scalability,” said Yifeng Cui, a computational scientist at SDSC. “It’s also the largest and most detailed simulation of a major earthquake ever performed in terms of floating point operations, and opens up new territory for earthquake science and engineering with the goal of reducing the potential for loss of life and property.”

Specifically, the M8 simulation is the largest yet in terms of the duration of shaking modeled (six minutes) and the geographical area covered: a rectangular volume approximately 500 miles (810 km) long by 250 miles (405 km) wide by 50 miles (85 km) deep. The team’s latest research also set a new record in the number of computer processor cores used, with 223,074 cores sustaining a performance of 220 trillion calculations per second for 24 hours on the Jaguar Cray XT5 supercomputer at Oak Ridge National Laboratory (ORNL) in Tennessee. By comparison, the 2004 TeraShake simulation of a smaller San Andreas Fault earthquake used only 240 cores and took over four days to run. The new simulation also computed shaking at more than 436 billion mesh points, versus only 1.8 billion mesh points in the TeraShake simulations. The total data produced is approximately 500 terabytes (1 terabyte equals 1,000 gigabytes).
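As a rough consistency check, the quoted volume and mesh-point count imply a grid spacing of about 40 meters, and the sustained rate over 24 hours amounts to roughly 19 quintillion (1.9e19) operations. The short Python sketch below is illustrative arithmetic only; every input number is taken from this article:

    # Back-of-envelope checks on the figures quoted above. All inputs come
    # from this article; only the arithmetic is new.
    vol_m3 = 810e3 * 405e3 * 85e3      # simulated volume in cubic meters
    points = 436e9                     # mesh points quoted above
    dx = (vol_m3 / points) ** (1 / 3)  # implied spacing of a cubic mesh cell
    print(f"implied grid spacing: ~{dx:.0f} m")   # ~40 m

    ops = 220e12 * 24 * 3600           # 220 trillion ops/s sustained for 24 hours
    print(f"total operations: ~{ops:.1e}")        # ~1.9e19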

To produce the M8 simulation, researchers within the SCEC Community Modeling Environment (SCEC/CME) project developed a highly scalable, parallel earthquake wave propagation code called AWP-ODC. Previous groundbreaking science results from AWP-ODC include TeraShake; multiple ShakeOut simulations (used as the basis for a Southern California earthquake drill that involved more than 5.4 million people); and a Pacific Northwest megathrust scenario (forecasting up to five minutes of shaking in Seattle for such an event).
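AWP-ODC is a finite-difference code of the staggered-grid, velocity-stress family. As a minimal illustration of that general scheme (a one-dimensional sketch only, not the project’s code, with invented parameter values), the core update loop can be written in a few lines of Python:

    # Minimal 1-D staggered-grid, velocity-stress finite-difference sketch.
    # It illustrates the family of schemes AWP-ODC belongs to; it is NOT the
    # project's code, and every parameter value here is invented.
    import numpy as np

    nx, nt = 1000, 2000           # grid points and time steps (illustrative)
    dx = 40.0                     # grid spacing (m)
    rho, vs = 2700.0, 3000.0      # density (kg/m^3) and shear-wave speed (m/s)
    mu = rho * vs**2              # shear modulus
    dt = 0.5 * dx / vs            # time step within the CFL stability limit

    v = np.zeros(nx)              # particle velocity on integer grid points
    s = np.zeros(nx - 1)          # shear stress on staggered (half) points

    for it in range(nt):
        # Leapfrog update: stress from the velocity gradient, then
        # velocity from the stress gradient.
        s += dt * mu * np.diff(v) / dx
        v[1:-1] += dt * np.diff(s) / (rho * dx)
        # Inject a 2 Hz Ricker wavelet as a simple point source.
        arg = (np.pi * 2.0 * (it * dt - 0.6)) ** 2
        v[nx // 2] += (1.0 - 2.0 * arg) * np.exp(-arg) * dt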

"We have come a long way in just six years, doubling the seismic frequencies modeled by our simulations every two to three years, from 0.5 Hertz (or cycles per second) in the TeraShake simulations, to 1.0 Hertz in the ShakeOut simulations, and now to 2.0 Hertz in this latest project,” said Phil Maechling, SCEC’s associate director for Information Technology.

In terms of earthquake science, these simulations can be used to study how earthquake waves travel through structures in the earth’s crust and to improve three-dimensional models of such structures.

"Based on our calculations, we are finding that deep sedimentary basins, such as those in the Los Angeles area, are getting larger shaking than are predicted by the standard methods,” Jordan said. "By improving the predictions, making them more realistic, we can help engineers make new buildings safer.” The simulations are also useful in developing better seismic hazard policies and for improving scenarios used in emergency planning.

As a follow-on to the record-setting simulation, Olsen said the research team plans later this year to analyze potential damage to buildings, including Los Angeles high-rises, due to the simulated ground motions.

In addition to Cui, Olsen, Jordan, and Maechling, other researchers on the Scalable Earthquake Simulation on Petascale Supercomputers project (which resulted in the M8 simulation) include Amit Chourasia, Kwangyoon Lee, and Jun Zhou from SDSC; Daniel Roten and Steven M. Day from SDSU; Geoffrey Ely and Patrick Small from USC; D.K. Panda and his team from OSU; and John Levesque, from Cray Inc.

About SCEC
The Southern California Earthquake Center (SCEC) is a community of over 600 scientists, students, and others at more than 60 institutions worldwide, headquartered at the University of Southern California. SCEC is funded by the National Science Foundation and the U.S. Geological Survey to develop a comprehensive understanding of earthquakes in Southern California and elsewhere, and to communicate knowledge for reducing earthquake risk. SDSU is one of several core institutions involved with SCEC.

About SDSC
As an organized research unit of UC San Diego, the San Diego Supercomputer Center (SDSC) is a national leader in creating and providing cyberinfrastructure for data-intensive research. Cyberinfrastructure refers to an accessible and integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC plans to build Gordon, the high-performance computing community’s first flash memory-based supercomputer, slated to enter operation in 2011. SDSC is a founding member of TeraGrid, the nation’s largest open-access scientific discovery infrastructure.

Comments:
Yifeng Cui, SDSC
(858) 822-0916 or yfcui@sdsc.edu
Kim Olsen, SDSU
(619) 804-9253 or kbolsen@sciences.sdsu.edu
Philip Maechling, SCEC
(213) 821-2491 or maechlin@usc.edu

Media Contacts:
Mark Benthien, SCEC Communications
(213) 740-0323 or benthien@usc.edu
Jan Zverina, SDSC Communications
(858) 534-5111 or jzverina@sdsc.edu
Warren R. Froelich, SDSC Communications
(858) 822-3622 or froelich@sdsc.edu
Gina Jacobs, SDSU Communications
(619) 594-4563 or gina.jacobs@sdsu.edu