
Seismic Hazard Evaluation and Beyond Design Basis Events Considering New Earthquake Science Information

Norm Abrahamson Adjunct Prof., Civil Eng., UC Berkeley & Chief Seismologist, Geosciences Dept, Pacific Gas & Electric

Reaction to 2011 Tohoku Eqk


For California, focus has been on the large magnitude of the Tohoku Eqk
Are the California nuclear plants designed for an M9 earthquake?
If the Japanese can be surprised by a large-magnitude earthquake, why do we think we won't be surprised too?

Data for Source Characterization


Instrumental Earthquakes
Very short observation periods (30-100 years)

Historical Earthquakes
Short observation periods (100-1000 years)

Geologic Data
Longer time periods
Slip-rates & Paleoseismic data (1,000-10,000 years)

Does not capture temporal variations

Geodetic Data
Current deformation rates
Short observation period; generally not robust other than for major faults

Example of SSC Uncertainty in Mmax (EUS-SSC, 2011)

Example of Hazard Uncertainty 5Hz, Switzerland (PEGASOS, 2004)

Large Uncertainty in PSHA Results


Is the uncertainty so large that the probabilities are not useful?
Use of mean hazard does not fully account for the uncertainty
Can we use a bounding approach instead?

Use Bounding Approach?


Large epistemic (scientific) uncertainties in the source characterization for rare events
Current trend in earthquake science is to include larger Max magnitude values in the source characterization model

Can we use bounding values instead?


Select a conservative estimate of the maximum magnitude
Leads to regulatory stability as earthquake science improves

Use Bounding Values for Magnitude?


Issues with Bounding Magnitudes
Structures are designed for ground motions, not earthquake magnitudes
Large aleatory variability in the ground motion for a given magnitude, distance, and site condition

Example of Aleatory Variability


2004 Parkfield Earthquake (M6, strike-slip), adjusted to VS30 = 400 m/s


Aleatory Ground Motion Variability


By tradition, select median or 84th percentile
Maximum ground motion is much higher
No statistical basis for truncating the number of standard deviations (Strasser and Bommer, 2009)
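To make the point concrete, a minimal sketch of how the lognormal aleatory model behaves at increasing numbers of standard deviations (epsilon). The median and sigma values are assumed for illustration, not taken from any specific GMPE:

```python
import math

# Assumed values for illustration only
MEDIAN_SA = 0.5  # g, median spectral acceleration from a hypothetical GMPE
SIGMA_LN = 0.6   # aleatory standard deviation in natural-log units

def sa_at_epsilon(eps, median=MEDIAN_SA, sigma=SIGMA_LN):
    """Ground motion at eps standard deviations above the median."""
    return median * math.exp(eps * sigma)

def prob_exceed_epsilon(eps):
    """P(ground motion > median * exp(eps * sigma)) under the lognormal model."""
    return 0.5 * math.erfc(eps / math.sqrt(2.0))

for eps in [0, 1, 2, 3]:
    print(f"eps={eps}: Sa={sa_at_epsilon(eps):.2f} g, "
          f"P(exceed)={prob_exceed_epsilon(eps):.4f}")
```

The 84th percentile corresponds to eps = 1; the exceedance probability never reaches zero, which is why truncating at some number of standard deviations has no statistical basis.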

Effects of Magnitude and Aleatory Variability

Dist = 20 km Strike-slip VS30=400 m/s (NGA models)

For nearby faults (California NPPs), even less magnitude dependence

Bounding Values on Ground Motion


DOE-PG&E cooperative program on extreme ground motions (2006-2010)
Yucca Mtn application (Andrews et al., 2007)
Finite-fault simulation for M6.5 at short distances using worst-case conditions
Non-linear source
Non-linear wave propagation through rock

Results for hard rock site conditions


Bounding values are very high
Sa(5 Hz) = 12 g PGV = 400 cm/s

Too large to be of much practical use for most facilities

Design Ground Motions


Bounding Ground Motions
Too large for practical use
Too rare to justify their use
Need to select a design ground motion level that is smaller than the bounding values

Find a design ground motion level that leads to a structure that is safe enough
Chance of failure is small enough that we are willing to accept it
Risk-Informed Approach
Need to consider target range of acceptable risk

Back to PSHA
Bounding case is not practical
How far to back off from bounding ground motion?

Need to estimate risk or use a simplified approach to risk
Risk studies require rate of initiating events
PSHA (probability of ground motions occurring at site)
Cannot avoid the issue of large uncertainties in PSHA

Design Ground Motions Approaches


Risk approach

PSHA approach
Simplified risk calculation considering only ground motion probabilities

Deterministic ground motion approach
Simplified PSHA considering a small number of appropriate earthquakes

One approach is not more conservative than the others

Design Ground Motions


Seismic Risk Approach
Consider a range of design/evaluation ground motions for the NPP
Model the earthquakes and ground motions (PSHA)
Model the structure response of the NPP (fragility) for a given design
What is the capacity to withstand beyond design basis ground motions?

Compute the probability of failure (risk) for each ground motion level (and structure design)
Hazard/risk ratio

Select design ground motion that leads to acceptably low probability of failure
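The risk calculation above can be sketched as a convolution of a hazard curve with a fragility curve. The power-law hazard curve and the lognormal fragility parameters below are invented for illustration, not Diablo Canyon values:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

# Assumed hazard curve: annual rate of exceeding Sa level a (illustrative
# power-law fit: rate 1e-4 at 1 g, log-log slope of 3)
def hazard_rate(a):
    return 1e-4 * (1.0 / a) ** 3

# Assumed lognormal fragility: median capacity 2 g, composite beta 0.4
def fragility(a, median=2.0, beta=0.4):
    return normal_cdf(math.log(a / median) / beta)

# Annual failure rate: integrate P(fail | a) against the hazard density
# (numerical convolution over discretized ground motion levels)
levels = [0.1 * i for i in range(1, 101)]  # 0.1 g to 10 g
risk = 0.0
for a_lo, a_hi in zip(levels[:-1], levels[1:]):
    d_rate = hazard_rate(a_lo) - hazard_rate(a_hi)  # rate falling in this bin
    risk += d_rate * fragility(0.5 * (a_lo + a_hi))
print(f"Annual probability of failure ~ {risk:.2e}")
```

Repeating this for a range of candidate designs (fragility medians) is what lets the design ground motion be chosen to hit an acceptably low failure probability.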

Design Ground Motions


Probabilistic hazard approach
Model the earthquakes and ground motions (PSHA)
Select a ground motion level that has a small probability of being exceeded
Assume that if the ground motion is rare enough (e.g. 1/10,000), then the probability of failure will be small enough
Assumes there is capacity to withstand beyond design basis ground motions
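A minimal PSHA sketch of this idea: the annual rate of exceeding a ground motion level is the sum, over earthquake scenarios, of the scenario rate times the probability that the (lognormally distributed) ground motion exceeds the level. The scenario rates, medians, and sigmas below are hypothetical; a real PSHA would also integrate over magnitude, distance, and the epistemic logic tree:

```python
import math

def p_exceed(a, median, sigma_ln):
    """P(Sa > a) for a lognormal ground-motion model."""
    eps = (math.log(a) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(eps / math.sqrt(2.0))

# Hypothetical scenarios: (annual rate, median Sa in g, sigma_ln)
scenarios = [
    (0.010, 0.10, 0.6),  # frequent moderate event
    (0.001, 0.40, 0.6),  # rare large event
]

def annual_exceedance_rate(a):
    return sum(rate * p_exceed(a, med, sig) for rate, med, sig in scenarios)

for a in [0.1, 0.5, 1.0]:
    print(f"Sa > {a:.1f} g: rate = {annual_exceedance_rate(a):.2e} /yr")
```

Reading the curve at a target rate (e.g. 1E-4/yr) gives the design ground motion level under this approach.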

Design Ground Motions


Deterministic hazard approach
Select representative rare earthquakes
For faults, select a large (not largest) magnitude at the shortest distance
For areal source zones, choose a reasonable distance

Select a ground motion level


84th percentile typically used

Assume that the ground motion from this rare earthquake will lead to a probability of failure that will be small enough
Implicit assumption about the probability of the ground motion
Implicit assumption about the capacity to withstand beyond design basis ground motions

All Three Methods Have Residual Risk


Not designed for bounding case
Safe = very small residual risk, but not zero
Up to regulator to decide what is "very small"

Beyond design basis events need to be considered at critical facilities


What do the extreme events look like?
Mitigate the effects, if possible & practical

How Well Can We Compute Hazard (and Risk)?


Epistemic Uncertainty
Less data/knowledge implies greater epistemic uncertainty
In practice, this is often not the case
Tend to consider only available (e.g. published) models
More data/studies lead to more available models, so greater epistemic uncertainty is included in the PSHA

Characterization of Epistemic Uncertainty


Regions with sparse data
Tendency to underestimate epistemic uncertainty
With little data, use simple models
Often assume that the simple model is correct, with no uncertainty
Need for minimum generic uncertainty for regions with sparse data

Regions with more data


Broader set of models
More complete characterization of epistemic uncertainty
Sometimes overestimates epistemic uncertainty

Example of Common Underestimation of Epistemic Uncertainty in SSC


If there is no information on the date of the last earthquake, many studies assume a Poisson model
This ignores the uncertainty implied by a renewal model
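A sketch of the difference, using assumed recurrence parameters. A lognormal renewal model stands in here for whatever time-dependent model a study might adopt (e.g. BPT); all numbers are illustrative:

```python
import math

def lognormal_cdf(t, mean_ri, cov):
    """CDF of a lognormal recurrence-time model with mean recurrence
    interval mean_ri and coefficient of variation cov (aperiodicity)."""
    sigma = math.sqrt(math.log(1.0 + cov**2))
    mu = math.log(mean_ri) - 0.5 * sigma**2
    z = (math.log(t) - mu) / sigma
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

# Assumed values for illustration
mean_ri = 200.0    # mean recurrence interval, years
cov = 0.5          # aperiodicity
t_elapsed = 150.0  # years since the last event
dt = 50.0          # forecast window, years

# Poisson (time-independent): probability of an event in the next dt years
p_poisson = 1.0 - math.exp(-dt / mean_ri)

# Renewal (time-dependent): P(next event in (t, t+dt] | no event by t)
F = lognormal_cdf
p_renewal = (F(t_elapsed + dt, mean_ri, cov) - F(t_elapsed, mean_ri, cov)) \
            / (1.0 - F(t_elapsed, mean_ri, cov))

print(f"Poisson 50-yr probability: {p_poisson:.3f}")
print(f"Renewal 50-yr probability: {p_renewal:.3f}")
```

Late in the earthquake cycle the renewal probability exceeds the Poisson value, so assuming Poisson understates both the hazard and its uncertainty.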

Treatment of Epistemic Uncertainty


Not addressed consistently around the world
Important to adequately quantify the uncertainty in the seismic hazard

New Seismic Information


PG&E Approach to New Seismic Information
Maintain staff of seismologists, geologists, geotechnical engineers, and structural engineers for seismic issues
Responsible for keeping up with new seismic information
Sources, ground motions, tsunami

Provide research funding to steer researchers toward topics of concern to Diablo Canyon
University researchers
Government researchers (USGS)
Consultants

Beyond Design Basis Ground Motions (Diablo Canyon Example)


Design Ground Motion 1977
Based on:
M7.5 earthquake on Hosgri fault at 5 km distance
Using 1970s ground motion models
84th percentile ground motion

Faults Near Diablo Canyon

How do we get beyond design basis ground motions?


What are the characteristics of beyond design basis ground motions?
Are they just scaled versions of regular ground motions?

What Do Beyond Design Basis Ground Motions Look Like?


Rare ground motions are more peaked over a narrow frequency band than typical ground motions
Extending the conditional mean spectrum concept to conditional spectra (Jack Baker)

Other important features of beyond design basis ground motions need to be identified
Opportunity for using numerical simulations to generate the extreme ground motion time histories

Conditional Spectra Approach


Scenario time histories
Select large set of time histories within mag-dist range that is consistent with the deaggregation for 1E-4 UHS
Also require that they are consistent with expected spectral shape (more peaked for the rarer events)

Find a subset (50-100 unique time histories), with up to 10 scaled versions of each, along with rates of occurrence that approximate the hazard over 1 Hz to 25 Hz and 1E-3 to 1E-7
Maintains appropriate representation of the peaks and troughs in the spectra

Diablo Canyon Application Scenario time histories


Unique sets: 71
Total number of time histories: 569
Each scaled time history has a rate of occurrence
Hazard is computed directly from the time histories and rates
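The last step can be sketched as follows: once each scaled time history has a spectral acceleration at the period of interest and an assigned annual rate, the hazard curve is simply the sum of the rates of the records exceeding each level. The record values below are hypothetical stand-ins for the 569 scaled time histories:

```python
# Hypothetical (Sa in g, annual rate) pairs for a handful of scaled records
records = [
    (0.15, 2.0e-3), (0.30, 8.0e-4), (0.55, 2.5e-4),
    (0.90, 6.0e-5), (1.60, 8.0e-6), (2.80, 5.0e-7),
]

def hazard_from_records(level):
    """Annual rate of exceeding 'level', computed directly from the records."""
    return sum(rate for sa, rate in records if sa > level)

for level in [0.1, 0.5, 1.0, 2.0]:
    print(f"Sa > {level} g: {hazard_from_records(level):.2e} /yr")
```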

Selected Eqk and Number of Scenario Time Histories by Eqk


1953 Imperial Vly (1), 1979 Coyote Lake (2), 1979 Imperial Vly (4), 1980 Mammoth-1 (2), 1980 Irpinia (1), 1981 Westmorland (1), 1983 Coalinga-05 (2), 1984 New Zealand (1), 1984 Morgan Hill (2), 1986 Chalfant Vly-02 (2), 1987 Whittier (6), 1987 Superstition Hills-02 (2), 1989 Loma Prieta (5), 1992 Landers (1), 1992 Big Bear (1), 1994 Northridge (3), 1995 Kozani-01 (1), 1999 Kocaeli (1), 1999 Chi-Chi (8), 1990 Upland (1), 1999 Chi-Chi-03 (3), 2000 Tottori (4), 2004 Parkfield (6), 1979 Montenegro (1), 2008 Wenchuan (1), 2007 Chuetsu-Oki (6), 2010 Darfield (2), 2011 Christchurch (1)

Scenario Time History Comparison to UHS for average Horiz


Horizontal
Target (dashed) vs Computed (solid)
[Figure: Sa (g) vs Period (sec) hazard spectra for annual rates from 1.0E-02 down to 1.0E-07]

Use of Ground Motion Data Recorded at NPP site


Three earthquakes recorded at Diablo Canyon
2003 Deer Canyon (M3.5, 8 km)
2003 San Simeon (M6.5, 35 km)
2004 Parkfield (M6.0, 85 km)

DCPP is a (CA) hard rock site


VS30 = 1200 m/s
CA ground motion models are poorly constrained for this VS30 range

Use the recorded data to develop site-specific rock site factors for ground motion models

Deer Canyon Eqk (M3.5, Rhypo=8 km)
Provides constraints on kappa

Compare model with data (Note: event terms also needed)
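A sketch of how a site term can be extracted from recorded data once the event terms are removed: the site term is the average within-event residual at the site. The residuals and event terms below are hypothetical, not the actual DCPP values:

```python
import math

# Hypothetical log residuals ln(observed) - ln(GMPE median) at the site,
# and assumed event terms (each earthquake's average misfit across the
# whole recording network)
residuals = {"Deer Canyon": 0.35, "San Simeon": 0.10, "Parkfield": 0.22}
event_terms = {"Deer Canyon": 0.15, "San Simeon": -0.05, "Parkfield": 0.02}

# Site term: mean of the within-event residuals (event term removed)
within = [residuals[e] - event_terms[e] for e in residuals]
site_term = sum(within) / len(within)
site_factor = math.exp(site_term)
print(f"Site term = {site_term:.3f} ln units -> factor {site_factor:.2f} on median")
```

The resulting factor is the site-specific adjustment applied to the GMPE median for this VS30/kappa condition.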

Deviation from Average for the VS30=1200 m/s, kappa=0.04 Site

Changing Earthquake Science Ground Motion Models (CA)


PEER NGA-West2
Major update of empirical GMPEs for active regions
Hanging wall effects
Factors of 1.5 to 2 at high frequency for hanging-wall sites

Regionalization
Attenuation at R > 70 km; VS30 scaling

PEER NGA-east major update of GMPEs for stable continental regions


Kappa estimation and effects (with Swissnuclear)
Factors of 1.5 to 3 in the high frequency range (f>10 Hz) for rock sites

Changing Earthquake Science Ground Motion Models


Other PEER studies
Standard deviation
Single station sigma Single path sigma

Use of data recorded at DCPP site to refine GMPEs

Ground Motion Simulation & Validation (SCEC)


Extensive validation effort Clear trend to using finite-fault simulations

Limitations of Empirical-Based PSHA


Source Characterization
Empirical catalogs may never have long enough observation period to capture temporal variability

Ground Motion Models


Empirical ground motion models will suffer from a lack of data from large magnitudes at short distances for decades to come

Physics-Based PSHA
Source Characterization
Replace historical earthquake catalog with simulated catalogs from earthquake generators
Span 1000s of years
Evaluate temporal variability
Where are we now in the earthquake cycle?

Ground Motion Characterization


Replace the empirical ground motion models with numerical simulations based on finite-fault simulations with region-specific wave propagation
For each magnitude, generate ground motions for 100s of earthquakes with well laid out station locations

Note: common in EUS but using point source models

Use of Physics-Based Hazard


Why are finite-fault simulation models not used more in engineering applications in the US?
Currently, results are not robust
Different simulation methods, each validated against past earthquake recordings, give inconsistent results for forward modeling of future earthquakes

Need different methods to give consistent results before application


Others have selected a single simulation method for stability

Simulation Methods Used in NGA


Three Kinematic Finite-Fault Simulations
Pacific Engineering and Analysis (Silva)
University of Nevada, Reno (Zeng)
URS (Somerville / Graves)

Each model was validated against empirical data


1979 Imperial Valley
1989 Loma Prieta
1992 Landers
1994 Northridge
1995 Kobe
1999 Chi-Chi

Multiple Realizations for each scenario, sampling range of source inputs


PEA: 15 or 30 realizations
UNR: 12 realizations
URS: 12 realizations

Strike-Slip: M7.8, SS (T=1 sec)

Strike-Slip: M6.5, SS (T=1 sec)

Improved Validation Needed


Southern California Earthquake Center
Testing 6 finite-fault (kinematic) simulation methods
Complete validation in 2013

Part A: Comparison with past Earthquakes


20 active crustal region earthquakes
3 Eastern U.S. (stable region) earthquakes
Provides a check of how well the simulation method works compared to data

Part B: Comparison with Empirical GMPEs


2-4 scenarios (e.g. M6.5, R=15 km; M6.5, R=40 km)
Rock site conditions
Provides a check on the methodology for generating the source parameters
Checks the average for the methodology, not the variability

Variability for Physics-Based Ground Motion


Aleatory variability of the source parameters for future earthquakes
Given magnitude and rupture geometry, what is the range of hypocenters, slip distributions, rise times, rupture velocities, etc.?
Not addressed by SCEC validation
How is the joint distribution of these parameters calibrated?
Currently, using the same limited empirical data
Can use the single-path sigma from empirical models to calibrate source parameter variability

Regulatory and Utility Challenges


Large epistemic uncertainty in hazard estimates
Significant improvements in earthquake science for ground motion models expected in the short term
Bounding approach is not practical
How to make a decision in the face of large uncertainties and changing results?
Should decisions on design values, need for backfit, or shutting down a plant be made on the mean risk (or mean hazard)?

Rethink Use of Mean Risk?


Consider using fractiles in place of the mean
Mean risk is the best estimate of the risk, but the mean may be of little value given large uncertainty

Concept of high confidence of low risk


Similar to HCLPF for seismic margins
What confidence is needed?
Different for existing (backfit) versus new plant?
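For reference, a sketch of the HCLPF (High Confidence of Low Probability of Failure) capacity calculation used in seismic margin assessments. The median capacity and beta values are assumed for illustration; the separate-variable and composite-beta forms shown are the commonly cited approximations:

```python
import math

# Assumed fragility parameters for illustration
Am = 2.0       # median seismic capacity, g
beta_r = 0.25  # aleatory (randomness) log-standard deviation
beta_u = 0.35  # epistemic (uncertainty) log-standard deviation

# Separate-variable form: ~1% failure probability at 95% confidence
hclpf_sep = Am * math.exp(-1.65 * (beta_r + beta_u))

# Composite form: capacity at ~1% probability of failure on the
# composite (mean) fragility curve
beta_c = math.sqrt(beta_r**2 + beta_u**2)
hclpf_comp = Am * math.exp(-2.33 * beta_c)

print(f"HCLPF (separate betas): {hclpf_sep:.2f} g")
print(f"HCLPF (composite beta): {hclpf_comp:.2f} g")
```

The two forms give similar capacities; the "high confidence of low risk" concept in the slide generalizes this fractile-based thinking from fragility to the full risk estimate.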

Regulatory Stability and Changing Earthquake Science Information


Regulatory stability can be achieved by freezing science or specifying a single model
Example: a California regulator used a single ground motion model from 1981 to 2007
Allowed consistent comparisons of seismic evaluations over the years
But the model became outdated and greatly underestimates the uncertainty

Regulatory inaction can result from continuing to chase the latest science


Never get done

Regulatory Stability and Changing Earthquake Science Information


New Earthquake Science Information
Significant improvements/changes in 5 year time periods

Regulatory Stability
For sake of stability and to allow time to evaluate new models & data, appropriate to use a given set of earthquake science models for up to 10 years

Evaluating New Seismic Information


Comparison of an updated design spectrum with the current design spectrum is not sufficient in most cases
Does not address beyond design basis ground motions

Need for an evaluation of the risk of the power plant, not just a comparison of spectra
Consider the beyond design basis ground motions
What are their characteristics?
Is there a cost-effective way to mitigate the effects of beyond design basis ground motions?

Current NRC Guidance for Tsunami


Probable Maximum Tsunami (PMT)
Use best available scientific information to arrive at a set of scenarios reasonably expected to affect the nuclear power plant site
Identify tsunami sources
Conduct numerical simulations to estimate tsunami waves
Much improved over previous methods based on historical observations (too short a time period)

Tsunami (cont)
Probable Maximum Tsunami (PMT)
For determination of the PMT, conservative values and ranges of source parameters should be specified, to ensure that the design bases of the nuclear power plant will not be exceeded
Gives the appearance of worst-case, but it is not worst-case
Does not specifically address aleatory variability

Residual Risk from Tsunamis


Like ground motion simulations, tsunami numerical simulation methods will have limitations
Leads to aleatory variability about the simulated values

If included, this aleatory variability can dominate the hazard (similar to ground motion)
Residual risk from above design basis tsunamis needs to be considered
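A sketch of why the aleatory term matters: adding lognormal variability about the simulated median wave heights can raise the rate of exceeding a high level from zero to something non-negligible. The source rates, heights, and sigma values are hypothetical:

```python
import math

def p_exceed(level, median, sigma_ln):
    """P(wave height > level) for a lognormal model; sigma_ln = 0 means
    the simulated median is treated as exact."""
    if sigma_ln == 0.0:
        return 1.0 if median > level else 0.0
    eps = (math.log(level) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(eps / math.sqrt(2.0))

# Hypothetical tsunami sources: (annual rate, simulated median wave height in m)
sources = [(1e-3, 1.5), (2e-3, 0.8), (1e-4, 6.0)]

def rate_of_exceedance(level, sigma_ln):
    return sum(r * p_exceed(level, h, sigma_ln) for r, h in sources)

level = 10.0  # hypothetical design wave height, m
for sigma in [0.0, 0.4, 0.8]:
    print(f"sigma={sigma}: rate(h > {level} m) = "
          f"{rate_of_exceedance(level, sigma):.2e} /yr")
```

With no aleatory term, no source "reaches" the design level and the computed residual risk is zero; with it, the rate is finite and grows with sigma, mirroring the ground motion case.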

Tsunami Hazard at CA NPPs


Tsunami Sources
Distant subduction zone earthquakes
1-2 m (sensitive to the magnitude)

Local offshore earthquakes


0.5 - 1 m

Local offshore landslides


4-8 m

Key Issues
What is the aleatory variability of the wave heights?
What is the size and rate of offshore landslides?
Geologic-based rates and sea level changes

Tsunami Hazard at DCPP

Power block at 25 m
Design Basis (1975)
Snorkel Height

Engineering Issues for Tsunamis at NPPs


No ductility for flooding
Need for lower probability levels for flooding events
Not 1/10,000 as used for ground motion

Goal should be for similar risk from different natural hazards, not similar hazard levels

Summary
If addressed comprehensively, there are large epistemic uncertainties in seismic hazard
Uncertainty in computed risks could span acceptably low risk and unacceptably high risk

Rapid growth of new earthquake information will lead to significant changes in hazard estimates
Short term: next 1-2 yrs (for ground motion models)
Long term: next 5-10 yrs

A simple bounding approach to ground motions is not practical


Ground motions too high to be practical
Too rare to justify their use

Summary (cont)
Utilities and regulators need to address uncertainties in decision making
Use of the weighted average (mean) captures some effects of uncertainty, but only partially

In light of Fukushima, simple use of mean risk (or mean hazard) may not be adequate
What confidence do we want that the risks are acceptably low?
