The Virtual California approach to earthquake forecasting is similar to the computer models used for weather forecasting, said John Rundle, director of the UC Davis Computational Science and Engineering Center, who developed the model with colleagues from the Jet Propulsion Laboratory and other institutions. An earlier forecast of earthquake hazards, by the Working Group on California Earthquake Probabilities, instead used records of past earthquakes to calculate the probability of future ones.
The Virtual California model includes 650 segments representing the major fault systems in California, including the San Andreas fault responsible for the 1906 San Francisco earthquake. The simulation takes into account the gradual movement of faults and how they interact with each other.
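The ingredients described here, fault segments that load slowly under tectonic motion and interact when one slips, can be illustrated with a toy slider-block-style simulation. This is a generic sketch of that class of model, not the actual Virtual California code; the segment count, thresholds, and transfer fraction below are arbitrary placeholders:

```python
# Toy illustration of interacting fault segments: each segment loads
# gradually, fails when its stress crosses a threshold, and passes part
# of the dropped stress to its neighbors, which can trigger cascades.
# NOT the Virtual California model; all parameters are hypothetical.
import random

N = 10             # number of segments (the real model uses 650)
THRESHOLD = 1.0    # failure stress, arbitrary units
TRANSFER = 0.2     # fraction of dropped stress passed to each neighbor

random.seed(1)
stress = [random.uniform(0.0, 0.5) for _ in range(N)]
load_rate = [random.uniform(0.01, 0.03) for _ in range(N)]  # tectonic loading

def step(stress):
    """Advance one time step: load every segment, then cascade failures."""
    stress = [s + r for s, r in zip(stress, load_rate)]
    failed = []
    queue = [i for i, s in enumerate(stress) if s >= THRESHOLD]
    while queue:
        i = queue.pop()
        if stress[i] < THRESHOLD:
            continue
        drop = stress[i]
        stress[i] = 0.0
        failed.append(i)
        for j in (i - 1, i + 1):          # stress transfer to neighbors
            if 0 <= j < N:
                stress[j] += TRANSFER * drop
                if stress[j] >= THRESHOLD:
                    queue.append(j)
    return stress, failed

# Run the toy model and report multi-segment "events"
for t in range(2000):
    stress, failed = step(stress)
    if len(failed) >= 3:
        print(f"t={t}: rupture of segments {sorted(set(failed))}")
```

Even this crude version reproduces the qualitative behavior the article describes: slip on one segment can load its neighbors past failure, so large events emerge from the interactions rather than from any single fault in isolation.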
The researchers used the model to simulate 40,000 years of earthquakes in California. They found almost 400 major earthquakes (magnitude 7 or greater), at an average interval of 101 years. The simulation data indicate a 25 percent chance of another such earthquake in the next 20 years, a 50 percent chance in the next 45 years and a 75 percent chance by 2086.
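Probabilities like these can be read directly off the distribution of simulated interevent times, conditioned on how long it has been since the last great earthquake. A minimal sketch of that calculation, where the interval list and the elapsed time are hypothetical stand-ins rather than actual Virtual California output:

```python
# Sketch: conditional probability of the next magnitude-7+ quake within
# a given horizon, from an empirical catalog of simulated interevent
# times. The numbers below are hypothetical, NOT the paper's data.

def conditional_prob(intervals, elapsed, horizon):
    """P(next event within `horizon` years | `elapsed` years since the
    last event), from the empirical interevent-time distribution."""
    at_risk = [dt for dt in intervals if dt > elapsed]
    if not at_risk:
        return 1.0  # every simulated interval has already been exceeded
    hits = sum(1 for dt in at_risk if dt <= elapsed + horizon)
    return hits / len(at_risk)

# Hypothetical simulated intervals (years) between major quakes
simulated_intervals = [60, 75, 88, 95, 102, 108, 115, 125, 140, 160]

elapsed = 100  # e.g. years since the last great event on the fault
for horizon in (20, 45, 80):
    p = conditional_prob(simulated_intervals, elapsed, horizon)
    print(f"P(major quake within {horizon} yr | {elapsed} yr elapsed) = {p:.2f}")
```

Conditioning matters here: a memoryless (Poisson) model with a 101-year mean would give only about an 18 percent chance in 20 years, so the quoted forecast reflects the shape of the simulated interval distribution, not just its mean.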
The latest work is published in Proceedings of the National Academy of Sciences of the USA. Other authors on the paper are Paul Rundle, Donald Turcotte, Robert Shcherbakov and Gleb Yakovlev at UC Davis; Andrea Donnellan, Peggy Li and Jay Parker, Jet Propulsion Laboratory; Bruce Malamud, King's College London; Lisa Grant, UC Irvine; Geoffrey Fox, Indiana University, Bloomington; Dennis McLeod, University of Southern California; Bill Klein, Boston University; and Kristy Tiampo, University of Western Ontario, Canada.