
State of Practice of Seismic Hazard Analysis: From the Good to the Bad

EERI Distinguished Lecture Series 2009

Norm Abrahamson, Seismologist


Pacific Gas & Electric Company

Seismic Hazard Analysis


Approaches to design ground motion
Deterministic
Probabilistic (PSHA)
Continuing debate in the literature about PSHA

Time Histories
Scaling
Spectrum compatible

Seismic Hazard Approaches


Deterministic approach
Rare earthquake selected
Median or 84th percentile ground motion

Probabilistic approach
Probability of ground motion selected
Return period defines rare

Performance approach
Probability of damage states of structure
Structural fragility needed

Risk approach
Probability of consequence
Loss of life
Dollars

Deterministic vs Probabilistic
Deterministic
Consider a small number of scenarios (magnitude, distance, number of standard deviations of ground motion)
Choose the largest ground motion from the cases considered

Probabilistic
Consider all possible scenarios (all magnitudes, distances, and numbers of standard deviations)
Compute the rate of each scenario
Combine the rates of scenarios with ground motion above a threshold to determine the probability of exceedance
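
In equation form, the probabilistic combination is: the annual rate of exceeding a ground-motion level z is the sum over all scenarios of (rate of the scenario) x (probability that the scenario produces ground motion above z), with the conditional probability coming from the lognormal ground-motion model. A minimal sketch of this calculation is given below; the scenario rates, medians, and standard deviation are made-up illustration values, not values from the lecture.

```python
import math

def rate_of_exceedance(z, scenarios):
    """Annual rate of exceeding ground-motion level z (g).

    Each scenario is (annual_rate, median_gm, sigma_ln): the rate of the
    (M, R) scenario, the median ground motion from a GMPE, and the
    lognormal standard deviation.
    """
    nu = 0.0
    for annual_rate, median, sigma_ln in scenarios:
        # Number of standard deviations between z and the scenario median
        eps = (math.log(z) - math.log(median)) / sigma_ln
        # P(GM > z | scenario) from the standard normal survival function
        p_exceed = 0.5 * math.erfc(eps / math.sqrt(2.0))
        nu += annual_rate * p_exceed
    return nu

# Hypothetical scenarios: (rate per yr, median PGA in g, sigma_ln)
scenarios = [
    (0.01,  0.35, 0.6),   # nearby moderate earthquake
    (0.002, 0.50, 0.6),   # rare large earthquake
    (0.05,  0.10, 0.6),   # frequent small earthquake
]

for z in (0.1, 0.3, 0.5):
    nu = rate_of_exceedance(z, scenarios)
    print(f"PGA > {z:.1f} g: rate = {nu:.2e}/yr, return period = {1/nu:.0f} yr")
```

The deterministic approach corresponds to picking one or two entries from this list and a fixed number of standard deviations, rather than integrating over all of them.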


Deterministic Approach

Select a specific magnitude and distance (location)


For dams, typically the worst-case earthquake (MCE)

Design for ground motion, not earthquakes


Ground motion has large variability for a given magnitude, distance, and site condition
Key issue: What ground motion level do we select?

2004 Parkfield Near Fault PGA Values


[Map figure: recorded peak ground accelerations (g) near the fault from the 2004 Parkfield earthquake; values range from about 0.08 g to more than 1.3 g]

Worst-Case Ground Motion is Not Selected in Deterministic Approach


Combining the largest earthquake with the worst-case ground motion is too unlikely a case
The occurrence of the maximum earthquake is already rare, so it is not reasonable to also use a worst-case ground motion for this earthquake
Choose something smaller than the worst-case ground motion that is reasonable

What is Reasonable
The same number of standard deviations of ground motion may not be reasonable for all sources
The median may be reasonable for low-activity sources, but a higher value may be needed for high-activity sources

Need to consider both the rate of the earthquake and the chance of the ground motion


Components of PSHA
Source Characterization
Size, location, mechanism, and rates of earthquakes

Ground motion characterization


Ground motion for a given earthquake

Site Response
Amplification of ground motion at a site

Hazard Analysis
Hazard calculation
Select representative scenarios
Earthquake scenario and ground motion


Selected Issues in Current Practice


(Less) common problems with current practice
Max magnitude
VS30
Spatial smoothing of seismicity
Double counting some aspects of ground motion variability
Epistemic uncertainties
Mixing of epistemic and aleatory on the logic tree
Underestimation of epistemic uncertainties
Overestimation of epistemic uncertainties

Hazard reports / hand off of information


UHS and Scenario Spectra


Common Misunderstandings
Distance Measures
Different distance metrics for ground motion models often used interchangeably
Rupture distance
JB distance
Rx (new for the NGA models)
Hypocentral distance
Epicentral distance

Return Period and Recurrence Interval used interchangeably


Recurrence interval: used for earthquakes on a source
Return period: used for ground motion at a site

Common Misunderstandings
Standard ground motion models are often thought to give the larger horizontal component
Most ground motion models give the average horizontal component
Average is more robust for regression
Scale factors have been available to compute the larger component

Different definitions of what is the larger component


Larger for a random orientation
Larger for all orientations
Sa(T) corresponding to the larger PGA
Can be lower than the average!


Use and Misuse of VS30


VS30
Not the fundamental physical parameter
For typical sites, VS30 is correlated with the deeper Vs profile
Most soil sites are in alluvial basins (deep soils)
California empirical models are not applicable to shallow soil sites

Proper Use
Clear hand-off between ground motion and site response
Consistent definition of rock

Use for deep soil sites that have typical profiles

Misuse
Replacing site-specific analysis for any profile, including profiles not typical of those in the ground-motion database
Using ground-motion models with VS30 for shallow soil sites (California models)
Need to select a deeper layer and conduct a site response study
Or use models that include both soil depth and VS30

Sloppy Use of Terms: Mmax


Most hazard reports list a maximum magnitude for each source
Has different meanings for different types of sources

Zones
Maximum magnitude, usually applied to exponential model

Faults
Mean magnitude for full rupture, usually applied to characteristic-type models
Allows for earthquakes larger than Mmax
Called the mean characteristic earthquake (Mchar)

Issue
Some analyses use the exponential model for faults or characteristic models for regions
Not clear how to interpret Mmax

Improve practice
Define both Mmax and Mchar in hazard reports


Terminology
Aleatory Variability (random)
Randomness in magnitude, location, and ground motion (ε)
Incorporated directly in the hazard calculation
Refined as knowledge improves

Epistemic Uncertainty (scientific)


Due to lack of information
Incorporated in PSHA using logic trees (leads to alternative hazard curves)
Reduced as knowledge improves


Aleatory and Epistemic


For the mean hazard, it is not important to keep them separate
Good practice
Keep aleatory and epistemic separate
Not always easy
Allows identification of key uncertainties and guides additional studies and future research

Source characterization
Common to see some aleatory variability in the logic tree (treated as epistemic uncertainty)
Rupture behavior (segmentation, clustering)

Ground motion characterization


Standard practice uses ergodic assumption
Some epistemic uncertainty is treated as aleatory variability

Example: Unknown Die


Observed outcome of four rolls of a die
3, 4, 4, 5

What is the model of the die?


Probabilities for future rolls (aleatory)

How well do we know the model of the die?


Develop alternative models (epistemic)


Unknown Die Example


Roll   Model 1 (global analog)   Model 2 (region-specific)   Model 3 (region-specific)
1      1/6                       0                           0.05
2      1/6                       0                           0.09
3      1/6                       0.25                        0.18
4      1/6                       0.50                        0.36
5      1/6                       0.25                        0.18
6      1/6                       0                           0.09
7      0                         0                           0.05
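
To make the aleatory/epistemic split concrete, the short sketch below combines the three die models using illustrative epistemic weights (the weights are an assumption for illustration, not values from the lecture): within each model the face probabilities are aleatory, the choice among models is epistemic, and the weighted mean collapses the epistemic uncertainty in the same way a mean hazard curve does.

```python
# Aleatory: probability of each face under a given model.
# Epistemic: which model is the true one (expressed as weights).
models = {
    "Model 1 (global analog)":   {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6, 7: 0.0},
    "Model 2 (region-specific)": {1: 0, 2: 0, 3: 0.25, 4: 0.50, 5: 0.25, 6: 0, 7: 0},
    "Model 3 (region-specific)": {1: 0.05, 2: 0.09, 3: 0.18, 4: 0.36,
                                  5: 0.18, 6: 0.09, 7: 0.05},
}
weights = {"Model 1 (global analog)": 0.2,     # illustrative epistemic weights
           "Model 2 (region-specific)": 0.4,
           "Model 3 (region-specific)": 0.4}

# Probability of rolling 5 or larger under each model (aleatory),
# and the weighted mean over models (collapsing the epistemic uncertainty).
p_by_model = {name: sum(p for face, p in probs.items() if face >= 5)
              for name, probs in models.items()}
p_mean = sum(weights[name] * p for name, p in p_by_model.items())

for name, p in p_by_model.items():
    print(f"{name}: P(roll >= 5) = {p:.3f}")
print(f"Weighted mean over models: {p_mean:.3f}")
```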


Epistemic Uncertainty
Less data and knowledge should imply greater epistemic uncertainty
In practice, this is often not the case
Tend to consider only available (e.g. published) models
More data and studies lead to more available models, and therefore to greater epistemic uncertainty included in the PSHA


Characterization of Epistemic Uncertainty


Regions with little data
Tendency to underestimate epistemic
With little data, use simple models
Often assume that the simple model is correct with no uncertainty

Regions with more data


Broader set of models
More complete characterization of epistemic uncertainty
Sometimes overestimates the epistemic uncertainty


Underestimation of Epistemic Uncertainty

Standard practice: if there are no data on the time of the last earthquake, assume a Poisson model only
Good practice: scale the Poisson rates to capture the range of rates from the renewal model

Overestimate of Epistemic Uncertainty


Rate constrained by paleo-earthquake recurrence: 600 years for full rupture
Mean characteristic magnitude = 9.0
Alternative magnitude distributions considered as epistemic uncertainty
The exponential model is brought along with low weight, but it leads to an overestimation of the uncertainty

Epistemic Uncertainty
Good Practice
Consider alternative credible models
Use minimum uncertainty for regions with few available models

Check that observations are not inconsistent with each alternative model

Poor Practice
Models are included because they were used in the past
Trouble comes from applying models in ways not consistent with their original development
E.g. the exponential model is intended to fit observed rates of earthquakes, not to be scaled to fit paleoseismic recurrence intervals

Ground Motion Models


Aleatory
Standard practice to use published standard deviations
Ergodic assumption: the ground-motion median and variability at the site are the same as for the full data set used to develop the ground-motion model
Standard deviation applies to a single site / single path

Epistemic
Standard practice is to use the alternative available models (median and standard deviation)
Do the available models cover the epistemic uncertainty?
Issue with use of NGA models

Problems with Current Practice


Major problems have been related to the ground motion variability
Ignoring the ground motion variability
Assumes σ = 0 for the ground motion (uses only the median)
Considers including the ground-motion variability as a conservative option
This is simply wrong

Applying severe truncation to the ground motion distribution


e.g. distribution truncated at +1σ
Ground motions more than 1σ above the median are considered unreasonable
No empirical basis for truncation at less than 3σ
Physical limits of the materials will eventually truncate the distribution
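
A quick numerical check shows why truncation at +1σ is so consequential while truncation at +3σ barely matters. The sketch below (plain Python, standard normal via the error function) compares exceedance probabilities for the untruncated and truncated lognormal ground-motion distributions; the epsilon values are arbitrary illustration points.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_exceed(eps, trunc=None):
    """P(ground motion more than eps standard deviations above the median),
    optionally with the lognormal distribution truncated at +trunc sigma."""
    if trunc is None:
        return 1.0 - phi(eps)
    if eps >= trunc:
        return 0.0
    # Renormalize the remaining distribution after truncation
    return (phi(trunc) - phi(eps)) / phi(trunc)

for eps in (0.5, 1.0, 2.0):
    untr = p_exceed(eps)
    t3 = p_exceed(eps, trunc=3.0)
    t1 = p_exceed(eps, trunc=1.0)
    print(f"eps={eps:.1f}: no truncation {untr:.4f}, "
          f"truncate at 3 sigma {t3:.4f}, truncate at 1 sigma {t1:.4f}")
```

Truncation at 3σ changes the exceedance probabilities only in the fourth decimal place, while truncation at 1σ removes the upper tail entirely and strongly distorts the rest of the distribution.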


Example of GM Variability


GM Variability Example


GM Truncation Effects (Bay Bridge)


2004 Parkfield
[Figure: peak acceleration (g), average horizontal component, versus rupture distance (km) for the 2004 Parkfield earthquake; ShakeMap, NSMP, and CSMIP station recordings compared with the median prediction (Vs = 380 m/s) and the 16th and 84th percentile intra-event bounds]

Ergodic Assumption
Trade space for time


Mixing epistemic and aleatory (in Aleatory)


Standard Deviations for LN PGA


Study                    Region        Total sigma   Single-site sigma
Chen & Tsai (2002)       Taiwan        0.73          0.63
Atkinson (2006)          Southern CA   0.71          0.62
Morikawa et al. (2008)   Japan         0.78          -
Lin et al. (2009)        Taiwan        0.73          0.62

Single Ray Path


Standard Deviations for LN PGA


Study                    Region        Total sigma   Single-site sigma   Single-path-and-site sigma
Chen & Tsai (2002)       Taiwan        0.73          0.63                -
Atkinson (2006)          Southern CA   0.71          0.62                0.41
Morikawa et al. (2008)   Japan         0.78          -                   0.36
Lin et al. (2009)        Taiwan        0.73          0.62                0.37

Removing the Ergodic Assumption


Significant reduction in the aleatory variability of ground motion
40-50% reduction for single path - single site
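
Taking the Taiwan row of the table above at face value (total 0.73, single site 0.62, single path and site 0.37), the reduction can be computed directly. This is only a restatement of the tabulated sigmas; the pairings are as reconstructed from the slide table.

```python
# Illustrative reduction in aleatory standard deviation when the site and
# path terms are treated as repeatable (values from the table above).
cases = {
    "total (ergodic)":      0.73,
    "single site":          0.62,
    "single path and site": 0.37,
}
total = cases["total (ergodic)"]
for name, sigma in cases.items():
    print(f"{name}: sigma = {sigma:.2f}  "
          f"({100 * (1 - sigma / total):.0f}% below ergodic)")
```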


Hazard Example


Die: combine rolls (ergodic)


Non-Ergodic: Reduced Aleatory


Removing the Ergodic Assumption


Penalty: must include increased epistemic uncertainty
Requires a model for the median ground motion for the specific path and site
Benefits come with constraints on the median:
Data
Numerical simulations

Current State of Practice


Most studies use ergodic assumption
Mean hazard is OK, given no site/path specific information

Some use of reduced standard deviations (reduced aleatory), but without the increased epistemic
Underestimates the mean hazard
Bad practice


Non-Ergodic: Increased Epistemic


Standard Deviations for Surface Fault Rupture


Quantity                                  Std dev (log10)
Global model (average displacement)       0.28
Global model, variability along strike    0.27
Total global                              0.39
Single site                               0.17

Removing the Ergodic Assumption


Single site aleatory variability
Much smaller than global variability

Value of even a small number of site-specific observations


Number of site-specific observations (N)   Epistemic std dev of median (log10)
0                                          0.35
1                                          0.17
2                                          0.12
3                                          0.10
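
As a rough consistency check only (an assumption for illustration, not the method behind the table): if the site-specific median is treated like a Bayesian update of a global prior with σ = 0.35 and each observation carries the single-site aleatory σ of 0.17 from the previous table, the epistemic standard deviation shrinks with N in roughly the same way as the tabulated values, though not exactly.

```python
import math

# Rough consistency check: combine a prior on the median with N observations.
sigma_prior = 0.35    # epistemic std dev with no site-specific observations
sigma_within = 0.17   # single-site aleatory std dev (log10), from table above
for n in range(4):
    sigma_epi = math.sqrt(1.0 / (1.0 / sigma_prior**2 + n / sigma_within**2))
    print(f"N={n}: epistemic std dev of median ~ {sigma_epi:.2f} (log10)")
```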


Large Impacts on Hazard


Keeping Track of Epistemic and Aleatory
If no new data:


Broader fractiles
No impact on the mean hazard

Provides a framework for incorporation of new data as it becomes available


Identifies key sources of uncertainty
Candidates for additional studies

Shows clear benefits of collecting new data


Hazard Reports
Uniform Hazard Spectra
The UHS is an envelope of the spectra from a suite of earthquakes

Standard practice hazard report includes:


UHS at a range of return periods gives the level of the ground motion
Deaggregation at several spectral periods for each return period identifies the controlling magnitude and distance

Good practice hazard report includes:


UHS
Deaggregation
Representative scenario spectra that make up the UHS
Conditional Mean Spectra (CMS)


Crane Valley Dam Example


Controlling scenarios from deaggregation
For a return period of 1,500 years:
Sa(T=0.2 s): M = 5.5-6.0, R = 20-30 km
Sa(T=2 s): M = 7.5-8.0, R = 170 km


Scenario Ground Motions


Baker and Cornell approach: Conditional Mean Spectrum (CMS)
Find the number of standard deviations (ε) needed to reach the UHS at the period of interest
Then construct the rest of the spectrum


Correlation of Epsilons
[Figure: correlation of ε between spectral periods, with example panels for T = 1.5 s and T = 0.3 s]


Correlation of Variability

Correlation decreases away from the reference period
The increase at short periods results from the nature of Sa
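
A minimal sketch of the CMS construction is given below, under the assumptions that the controlling scenario's median and σ come from a GMPE and that a period-to-period correlation model ρ(T, T*) is available. The spectral values and the correlation function are placeholders for illustration only; they are not the published correlation model.

```python
import math

def conditional_mean_spectrum(periods, mu_ln, sigma_ln, t_star, ln_sa_uhs_tstar, rho):
    """Sketch of the conditional mean spectrum.

    mu_ln, sigma_ln: dicts of median (ln Sa) and sigma from a GMPE for the
      controlling scenario at each period.
    t_star: conditioning period; ln_sa_uhs_tstar: ln of the UHS value there.
    rho(t, t_star): correlation of epsilons between periods (user-supplied).
    """
    # Epsilon needed at T* for the scenario to reach the UHS
    eps_star = (ln_sa_uhs_tstar - mu_ln[t_star]) / sigma_ln[t_star]
    cms = {}
    for t in periods:
        # At other periods, only the correlated fraction of eps_star applies
        cms[t] = math.exp(mu_ln[t] + rho(t, t_star) * eps_star * sigma_ln[t])
    return eps_star, cms

# Illustrative use with made-up spectral values and a placeholder correlation
periods = [0.2, 0.5, 1.0, 2.0]
mu_ln = {0.2: math.log(0.40), 0.5: math.log(0.30),
         1.0: math.log(0.18), 2.0: math.log(0.08)}
sigma_ln = {t: 0.6 for t in periods}
rho = lambda t, ts: math.exp(-0.7 * abs(math.log(t / ts)))  # placeholder only

eps_star, cms = conditional_mean_spectrum(
    periods, mu_ln, sigma_ln, t_star=1.0,
    ln_sa_uhs_tstar=math.log(0.30), rho=rho)
print(f"epsilon at T* = 1.0 s: {eps_star:.2f}")
for t in periods:
    print(f"T = {t:.1f} s: CMS = {cms[t]:.2f} g")
```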

Scenario Spectra for UHS


Develop a suite of deterministic scenarios that together comprise the UHS
Time histories should be matched to the scenarios individually, not to the entire UHS


Improvements to PSHA Practice


At the seismology/engineering interface, we need to pass spectra for realistic scenarios that correspond to the hazard level
This will require suites of scenarios, even if there is a single controlling earthquake

The decision to envelop the scenarios, in order to reduce the number of engineering analyses required, should be made on the structural-analysis side based on the structure, not on the hazard-analysis side.


Spatial Smoothing of Seismicity


Zone boundaries
Based on tectonic regions
Based on seismicity rates

Activity rate
Usually from observed seismicity

Smoothing Approaches
Uniform within a zone
Zoneless, based on a smoothing distance

Key Issue
Smoothing for the host zone (R < 50 km)
In most cases, too much smoothing is applied

Most PSHAs do not check the amount of smoothing


Is it consistent with observations?

Example: Crane Valley Dam

[Map figure: site location and the San Andreas Fault]

Site-Specific Checks of Smoothing


Assume Poisson with uniform rate within Sierra Nevada zone
M > 3, R < 50 km, 24 years: expect 20 earthquakes

Observation
M > 3, R < 50 km: 40 earthquakes
M > 3, R < 17 km: 0 earthquakes

Simple Tests
If Poisson, what is the chance of observing 40 or more earthquakes?
P < 0.0001
For the R < 50 km region, the modeled rate is too low
Too much smoothing
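
The Poisson check on this slide can be reproduced in a few lines; the only inputs are the modeled expectation of 20 earthquakes and the 40 observed.

```python
import math

# Values from the slide: the zone model predicts about 20 earthquakes with
# M > 3 within 50 km over 24 years, but 40 were observed.
expected, observed = 20, 40

# P(N >= 40) for a Poisson variable with mean 20
p = 1.0 - sum(math.exp(-expected) * expected**k / math.factorial(k)
              for k in range(observed))
print(f"P(N >= {observed} | mean {expected}) = {p:.2e}")
```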


Check Smoothing Near Site


Simple Tests
If uniform rate within 50 km, what is chance of observing 0 out of 40 earthquakes within 17 km?
Prob = 0.007
Indicates the rate is not uniform within the 50 km radius
Too much smoothing

Alternative method to set rate for R<17 km region


No earthquakes observed
What rate would lead to a reasonable probability of producing the observation (no earthquakes)?
P = 0.5: rate = 0.3 × average zone rate
P = 0.1: rate = 1.0 × average zone rate
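
Both numbers on this slide can be reproduced with elementary probability, assuming the 17 km and 50 km regions are concentric circles so that the area fraction is (17/50)^2. The sketch below reproduces the 0.007 probability and the 0.3 and 1.0 rate factors.

```python
import math

# Values from the slides: 40 earthquakes observed within 50 km, none within
# 17 km, and a zone model that predicts about 20 earthquakes within 50 km.
area_fraction = (17.0 / 50.0) ** 2          # fraction of the 50 km circle
n_within_50 = 40
expected_within_50 = 20

# If the 40 events were uniformly distributed over the 50 km circle,
# the chance that none of them fall inside 17 km:
p_none = (1.0 - area_fraction) ** n_within_50
print(f"P(0 of {n_within_50} inside 17 km | uniform) = {p_none:.3f}")

# Alternative: what local rate (as a fraction of the average zone rate)
# would make the observation of zero events unsurprising?
expected_within_17 = expected_within_50 * area_fraction
for target_p in (0.5, 0.1):
    # Poisson: P(0) = exp(-factor * expected_within_17) = target_p
    factor = -math.log(target_p) / expected_within_17
    print(f"P(0) = {target_p}: rate ~ {factor:.1f} x average zone rate")
```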


General Testing of Smoothing


Start with broad smoothing
Compare the statistics of the observed spatial distribution with the spatial distributions from multiple realizations of the model (see the sketch below)
Nearest-neighbor pdf
Separation-distance pdf

If rejected with high confidence (e.g. 95% or 99%), reduce the smoothing and repeat
In general, US practice leads to too much smoothing
Standard practice does not apply checks of the smoothing
Beginning to see checks in some PSHA studies
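
A simplified version of such a check is sketched below: simulate catalogs from the smoothed (here, uniform) model, compute a spatial statistic such as the mean nearest-neighbor distance, and see where the observed catalog falls. The observed value and zone dimensions are placeholders; a real test would use the actual catalog statistics and the actual smoothed model.

```python
import math
import random

def nearest_neighbor_distances(points):
    """Distance from each point to its nearest neighbor (km)."""
    out = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        out.append(d)
    return out

def simulate_uniform(n, size_km, n_runs=500):
    """Mean nearest-neighbor distance for n uniformly placed epicenters."""
    means = []
    for _ in range(n_runs):
        pts = [(random.uniform(0, size_km), random.uniform(0, size_km))
               for _ in range(n)]
        means.append(sum(nearest_neighbor_distances(pts)) / n)
    return means

# Compare the observed mean nearest-neighbor distance for the catalog
# (placeholder value) against the distribution from the smoothed model.
observed_mean_nnd = 4.0          # km, placeholder for a real catalog statistic
sims = simulate_uniform(n=40, size_km=100.0)
frac_below = sum(m <= observed_mean_nnd for m in sims) / len(sims)
print(f"Fraction of simulations with mean NND <= observed: {frac_below:.3f}")
# A very small fraction (e.g. < 0.05 or < 0.01) means the observed clustering
# is inconsistent with the smoothed model: reduce the smoothing and repeat.
```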


Double Counting of Ground Motion Variability

Site-specific site response


Compute soil amplification
Median amplification
Variability of the amplification

Double Counting Issue


Site response variability is already in the ground motion standard deviation for empirical model


Standard Deviation by VS30


[Figure: ground-motion standard deviation versus VS30 (m/s) at T = 0.2 s and T = 1.0 s]

Approaches to Site Response Variability


Common Practice
Use the variability of the amplification and live with the over-estimation of the total variability
Use only the median amplification and assume that the standard deviation used for the input rock motion is applicable to the soil

Changes to practice
Reduce the variability of the rock ground motion
Remove average variability for linear response
About 0.3 ln units

Use downhole observation (e.g. Japanese data) to estimate reduction


About 0.35 ln units
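
A sketch of the variance adjustment described above, under the assumption that "removing" the average linear site-response variability means subtracting it in quadrature. The rock σ and the site-specific amplification variability below are illustrative values; the 0.3 ln units comes from the slide.

```python
import math

sigma_rock = 0.65          # example sigma (ln units) from an empirical GMPE
sigma_amp_generic = 0.30   # average linear site-amplification variability
sigma_amp_site = 0.20      # example variability from a site-specific study

# Remove the generic site-response variability from the rock sigma, then add
# back the site-specific amplification variability from the site response study.
sigma_input = math.sqrt(max(sigma_rock**2 - sigma_amp_generic**2, 0.0))
sigma_soil = math.sqrt(sigma_input**2 + sigma_amp_site**2)
print(f"reduced input sigma: {sigma_input:.2f}, soil sigma: {sigma_soil:.2f}")
```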


Double Counting of Ground Motion Variability

Time Histories
Scaled recordings include peak-to-trough variability

Double Counting Issue


Peak-to-trough variability is already in the ground-motion standard deviation for the empirical model
The variability effects are already in the UHS
Use of spectrum-compatible time histories avoids the double counting

Summary
Large variation in the state of practice of seismic hazard analysis around the world
Poor to very good
Significant misunderstandings of hazard basics remain

Testing of models for consistency with available data is beginning for source characterization
Common mixing of aleatory variability and epistemic uncertainty makes it difficult to assess the actual epistemic part
For sources, avoid modeling aleatory variability as branches on the logic tree
Move toward removing the ergodic assumption for ground motion
Good practice currently removes the ergodic assumption for fault rupture

Improved handoff of hazard information is beginning


Scenario spectra in addition to UHS
