
The Prediction Problems of Earthquake System Science

Editor's note: The following is the text of the SSA Presidential Address presented at the Annual Luncheon of the Seismological Society of America (SSA) Annual Meeting on 30 April 2014.

The Seismological Society of America (SSA) has always been dedicated to understanding and reducing the earthquake threat. The Society was founded in 1906 for the acquisition and diffusion of knowledge concerning earthquakes and allied phenomena. According to our new strategic plan, approved by the Board in 2012, the core purpose of SSA is to advance seismology and the understanding of earthquakes for the benefit of society. This plan lays out the vision for SSA to be the primary forum for the assembly, exchange, and dissemination of scientific knowledge essential for an earthquake-aware and safer world.

In the past twenty years or so, the study of earthquakes has become a true system science, offering new pathways for the advancement of seismology. Today I would like to explore what the rise of earthquake system science might imply for the future of our field and for SSA's mission in earthquake research.

System science seeks to explain phenomena that emerge from nature at the system scale, such as global climate change or earthquake activity in California or Alaska. The system is not a physical reality, but a hypothetical representation of nature, typically a numerical model that replicates an emergent behavior and predicts its future course.

The choice of target behavior determines the system model, as can be illustrated by two representations of earthquake activity in California. One is UCERF3, the latest uniform California earthquake rupture forecast of the Working Group on California Earthquake Probabilities, which represents future earthquake activity in terms of time-dependent fault-rupture probabilities. Another is the Southern California Earthquake Center (SCEC)'s CyberShake ground-motion model, which uses simulations to represent the probability of future earthquake shaking at geographic sites, conditional on the fault rupture. These two system-level models can be combined to generate site-specific hazard curves, the main forecasting tool of probabilistic seismic-hazard analysis (PSHA).
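The way these two kinds of model combine into a hazard-curve point can be sketched as a toy calculation. All numbers below are hypothetical illustrations, not values from UCERF3 or CyberShake: each rupture contributes its occurrence probability times the conditional probability that shaking exceeds a target level, and the contributions combine under an independence assumption.

```python
# Toy PSHA hazard-curve point: probability that ground motion exceeds a
# target intensity at one site, combining a rupture forecast (UCERF3's role)
# with conditional shaking probabilities (CyberShake's role).
# All numbers are hypothetical illustrations, not real model values.

ruptures = [
    # (probability the rupture occurs in the forecast window,
    #  probability shaking exceeds the target level given that rupture)
    (0.02, 0.50),  # nearby large rupture: rare but damaging
    (0.10, 0.05),  # frequent moderate rupture: usually below the target
]

# Assuming independent ruptures, exceedance is the complement of
# "no rupture causes exceedance".
p_no_exceed = 1.0
for p_occur, p_exceed_given in ruptures:
    p_no_exceed *= 1.0 - p_occur * p_exceed_given

p_exceed = 1.0 - p_no_exceed
print(f"P(exceedance at this level) = {p_exceed:.4f}")
```

Repeating this over a range of target intensity levels traces out the site-specific hazard curve.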

The first point to emphasize is that earthquake system science is all about forecasting and prediction. For many years now, earthquake prediction has remained an awkward topic in polite seismological company, primarily because it has been defined in the public mind by something we cannot do, which is to predict with high probability the regional occurrence of large earthquakes over the short term. Yet the P-word is too central to our science to be banned from our working vocabulary. From a practical perspective, we must be able to predict earthquake hazards in order to lower seismic risk. From the basic-research perspective of system science, testing a model's predictions against new data is the principal means by which we can gain confidence in the hypotheses and theories on which the model is built.

For example, many interesting problems of contingent predictability can be posed as physics questions in a system-specific context. What will be the shaking intensity in the Los Angeles basin from a magnitude 7.8 earthquake on the southern San Andreas fault? By how much will the strong shaking be amplified by the coupling of source directivity to basin effects? Will deep injection of waste fluids cause felt earthquakes near a newly drilled well in Oklahoma? How intense will the shaking be during the next minute of an ongoing earthquake in Seattle? SSA should stake its claim as the central forum for the physics-based study of earthquake predictability, and its publications should be the place where progress in understanding predictability is most rigorously documented.

My second point is that forecasting and prediction are all about probabilities. The deep uncertainties intrinsic to earthquake forecasting are most coherently expressed in terms of two distinct types of probability: the aleatory variability that describes the randomness of the system, and the epistemic uncertainty that characterizes our lack of knowledge about the system. In UCERF3, the former is cast as the time-dependent probabilities of fault ruptures, of which there are over 250,000, whereas the latter is expressed as a logic tree with 5760 alternative branches. Similarly, CyberShake represents the aleatory variability in wave excitation through conditional hypocenter distributions and conditional slip distributions, and it characterizes the epistemic uncertainty in the wavefield calculations in terms of alternative 3D seismic-velocity models.
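The role of the logic tree can be illustrated with a toy calculation, using hypothetical weights and rates far smaller in number than UCERF3's 5760 branches: each epistemic branch carries a weight and implies its own exceedance probability, the weighted average gives a point estimate of the hazard, and the spread across branches expresses our lack of knowledge about the system.

```python
# Toy epistemic logic tree: each branch is (weight, annual exceedance
# probability implied by that branch's model). Hypothetical numbers only.
branches = [
    (0.5, 0.010),
    (0.3, 0.020),
    (0.2, 0.040),
]

# Branch weights are degrees of belief and must sum to 1.
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12

# Weighted mean over branches: the usual point estimate of the hazard.
mean_hazard = sum(w * p for w, p in branches)

# The spread across branches expresses the epistemic uncertainty.
lo = min(p for _, p in branches)
hi = max(p for _, p in branches)
print(f"mean hazard = {mean_hazard:.3f}, branch range = [{lo:.3f}, {hi:.3f}]")
```

A full PSHA logic tree reports fractiles of the branch distribution rather than just the range, but the bookkeeping is the same weighted average.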

"System science offers a brick-by-brick approach to building up our understanding of earthquake predictability."

doi: 10.1785/0220140088 Seismological Research Letters Volume 85, Number 4 July/August 2014 767

The full-3D treatment of seismic-wave propagation has the potential to improve our PSHA models considerably. A variance-decomposition analysis of the recent CyberShake results indicates that more accurate earthquake simulations could reduce the aleatory variance of the strong-motion predictions by at least a factor of 2 relative to the empirical ground-motion prediction equations in current use; other factors being equal, this would lower the exceedance probabilities at high-hazard levels by an order of magnitude. The practical ramifications of this probability gain for the formulation of risk-reduction strategies could be substantial.
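The order-of-magnitude claim can be checked with a back-of-the-envelope lognormal calculation, using illustrative numbers rather than actual CyberShake results: if ground motion is lognormally distributed about the median prediction, halving the aleatory variance shrinks sigma by a factor of the square root of 2, and at intensity levels a couple of standard deviations above the median the exceedance probability falls by roughly a factor of ten.

```python
import math

def exceedance(z: float) -> float:
    """P(standard normal variable > z), via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# A high-hazard intensity level: 2 aleatory standard deviations above the
# median ground-motion prediction (an illustrative choice).
z = 2.0

p_before = exceedance(z)                  # original aleatory sigma
p_after = exceedance(z * math.sqrt(2.0))  # variance halved -> sigma / sqrt(2),
                                          # so the same intensity level now sits
                                          # sqrt(2) * z sigmas above the median

print(f"exceedance before: {p_before:.4f}, after: {p_after:.4f}, "
      f"reduction: {p_before / p_after:.1f}x")
```

The reduction factor here is close to 10, consistent with the "order of magnitude" figure; the exact factor depends on how far out on the tail the hazard level of interest sits.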

The coherent representation of aleatory variability and epistemic uncertainty in physics-based hazard models involves massive forward and inverse calculations, typically requiring very large ensembles of deterministic simulations. For example, a CyberShake hazard model for the Los Angeles region involves the computation of about 240 million synthetic seismograms. These calculations have been made feasible by the development of clever algorithms based on seismic reciprocity and highly optimized anelastic wave propagation codes, but they still strain the capabilities of the world's fastest supercomputers, which are currently operating at petascale (10^15 floating-point operations per second).

It is important to realize that our community's needs for computation are growing more rapidly than our nation's supercomputer resources. In this year alone, for example, SCEC simulations will consume almost 200 million core-hours on National Science Foundation (NSF) supercomputers such as Blue Waters and Department of Energy (DOE) supercomputers such as Titan. As we move towards exascale computing, the machine architectures will become more heterogeneous and difficult to code, and the workflows will increase in complexity. To an ever-increasing degree, progress in earthquake system science will depend on deep, sustained collaborations among the seismologists and computational scientists focused on extreme-scale computing. SSA should think carefully about how to accommodate such interdisciplinary collaborations within its structure, and it will need to work with NSF, DOE, and other government agencies to make sure our computational capabilities are sufficient for the demands of physics-based PSHA.

PSHA occupies a central position in the universe of seismic-risk reduction. However, recent earthquake disasters have reinvigorated a long-standing debate about PSHA methodology. Many practical deficiencies have been noted, not the least of which is the paucity of data for retrospective calibration and prospective testing of long-term PSHA models. But some critics have raised the more fundamental question of whether PSHA is misguided because it cannot capture the aleatory variability of large-magnitude earthquakes produced by complex fault systems. Moreover, the pervasive role of subjective probabilities and expert opinion in specifying the epistemic uncertainties in PSHA has made this methodology a target for scientists who adhere to a strictly frequentist view of probabilities. According to some of these critics, PSHA should be replaced by neodeterministic hazard estimates based on a maximum credible earthquake.

As Warner Marzocchi pointed out in an Eos article last July, neodeterministic SHA is not an adequate replacement for probabilistic SHA. The choice of a maximum credible earthquake requires uncertain assumptions, such as choosing a return period, which essentially fix the level of acceptable risk. This black-and-white approach is fundamentally flawed because it conflates the role of scientific advisor with that of a decision maker, mixing scientific judgments with political and economic choices that lie outside the domain of science. Fully probabilistic descriptions, such as those given by PSHA, are needed for two reasons: first, to avoid unintended and often uninformed decision making in the tendering of scientific forecasts, and second, to provide decision makers, including the public, with a complete rendering of the scientific information they need to balance the costs and benefits of risk-mitigation actions.

We may never be able to predict the impending occurrence of extreme earthquakes with any certainty, but we do know that earthquakes cluster in space and time, and that earthquake probabilities can locally increase a thousandfold during episodes of seismicity. The lessons of L'Aquila and Christchurch make clear that this information must be delivered to the public quickly, transparently, authoritatively, and on a continuing basis. Systems for this type of operational earthquake forecasting (OEF) are being developed in several countries, including Italy, New Zealand, and the United States, and they raise many questions about how to inform decision making in situations where the probability of a significant earthquake may go way up in a relative sense but still remain very low.
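The arithmetic behind this tension is simple, as a hypothetical example shows: a thousandfold probability gain applied to a very small baseline still leaves a small absolute probability.

```python
# Hypothetical numbers for illustration: the weekly probability of a
# significant earthquake at some location in quiet times, and the
# probability gain during an episode of elevated seismicity.
baseline_weekly_prob = 1e-5
probability_gain = 1000.0

elevated_prob = baseline_weekly_prob * probability_gain
print(f"elevated weekly probability: {elevated_prob:.0%}")
```

A thousandfold relative increase sounds alarming, yet the elevated probability here is still only about 1 percent per week, which is the communication challenge OEF systems must confront.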

Prospective experiments conducted in the Collaboratory for the Study of Earthquake Predictability (CSEP) have validated the probability gains of short-term forecasting models that are being used, or will be used, in OEF. Moreover, the Collaboratory is capable of supporting OEF and earthquake early warning (EEW) by providing an environment for the continual testing of operational models against alternatives. However, U.S. participation in CSEP has thus far been primarily funded by a private organization, the W. M. Keck Foundation, and stable support for its long-term mission is not guaranteed.

Of course, extreme earthquakes are very rare, so it will be a while before enough instrumental data have accumulated to properly test our long-term forecasts. However, as Dave Jackson argued in a paper presented at this meeting, the earthquake hiatus in California suggests the current UCERF model inadequately represents the large-scale interactions that are modulating the earthquake activity of the San Andreas fault system. Use of paleoseismology to extend the earthquake record back into geologic time is a clear priority. The SSA should be the home for this type of historical geophysics.

It is also urgent that we increase the spatial scope of our research to compensate for our lack of time. One goal of SSA should be to join forces with CSEP and other international efforts, such as the Global Earthquake Model (GEM) project, in fostering comparative studies of fault systems around the world. The issue is not whether to focus on the prediction problems of earthquake system science, but how to accomplish this research in a socially responsible way according to the most rigorous scientific standards.

I call upon a new generation of seismologists, the students and early-career scientists in this room, to take on the challenges of earthquake system science. You are fortunate to be in a field where the basic prediction problems remain mostly unsolved and major discoveries are still possible. You are also fortunate to have access to vast new datasets and tremendous computational capabilities for attacking these problems. System-level models, such as those I have described here, will no doubt become powerful devices in your scientific arsenal.

However, these models can be big and unwieldy, requiring a scale of expertise and financial resources that are rarely available to one scientist or a small research group. This raises a number of issues about how to organize the interdisciplinary, multi-institutional efforts needed to develop these models. In particular, all of us at SSA need to make sure that any research structure dominated by earthquake system science allows you, as the rising leaders in this field, to develop new ideas about how earthquake systems actually work.

Thomas H. Jordan
Southern California Earthquake Center

University of Southern California
Los Angeles, California 90089-0742 U.S.A.

    [email protected]
