
Quantifying Uncertainty in Simulations of Complex Engineered Systems
Robert D. Moser
Center for Predictive Engineering & Computational Science (PECOS)
Institute for Computational Engineering and Sciences (ICES)
The University of Texas at Austin

September 20, 2011

Special thanks to: Todd Oliver, Onkar Sahni, Gabriel Terejanu



Acknowledgment: This material is based on work supported by the Department of Energy [National Nuclear
Security Administration] under Award Number [DE-FC52-08NA28615].

R. D. Moser

Quantifying Uncertainty in Simulations

1 / 43

Motivation

PECOS: a DoE PSAAP Center


Develop Validation & UQ Methodologies for Reentry Vehicles
Multi-physics, multi-scale physical modeling challenges
Numerous uncertain parameters
Models are not always reliable (e.g. turbulence)


Motivation

Prediction
"Prediction is very difficult, especially if it's about the future." (N. Bohr)

Predicting the behavior of the physical world is central to both science
and engineering
Advances in computer simulation have led to prediction of more
complicated physical phenomena
The complexity of recent simulations . . .
I Makes reliability difficult to assess
I Increases the danger of drawing false conclusions from inaccurate
predictions


Motivation

Imperfect Paths to Knowledge and Predictive Simulation


[Diagram: THE UNIVERSE of PHYSICAL REALITIES → THEORY / OBSERVATIONS (observational errors) → MATHEMATICAL MODELS (modeling errors) → COMPUTATIONAL MODELS (discretization errors) → KNOWLEDGE → DECISION, with VALIDATION and VERIFICATION closing the loops]

Predictive Simulation: the treatment of model and data uncertainties and their
propagation through a computational model to produce predictions of quantities
of interest with quantified uncertainty.

Motivation

Quantities of Interest
Simulations have a purpose: to inform a decision-making process
Quantities are predicted to inform the decision
These are the Quantities of Interest (QoIs)
Models are not (evaluated as) scientific theories
I Involve approximations, empiricisms, guesses . . . (modeling
assumptions)
I Generally embedded in an accepted theoretical framework (e.g.
conservation of mass, momentum & energy)

Acceptance of a model is conditional on:


its purpose
the QoIs to be predicted
the required accuracy


Motivation

What are Predictions?


Prediction
Purpose of predictive simulation is to predict QoIs for which
measurements are not available (otherwise predictions not needed)

Measurements may be unavailable because:


instruments unavailable
scenarios of interest inaccessible
system not yet built
ethical or legal restrictions
it's the future

How can we have confidence in the predictions?



Motivation

Sir Karl Popper


The Principle of Falsification
A hypothesis can be accepted as
a legitimate scientific theory
if it can possibly be refuted by
observational evidence
A theory can never be validated; it can
only be invalidated by (contradictory)
experimental evidence.
Corroboration of a theory (survival of
many attempts to falsify) does not
mean a theory is likely to be true.


Motivation

Willard V. Quine

The falsification of an individual


proposition by observations is not
possible, as such observations rely on
numerous auxiliary hypotheses.
Only the complete theory (including all
auxiliary hypotheses) can be falsified by
an experiment.


Motivation

Rev. Thomas Bayes

P(θ|D) = P(D|θ) P(θ) / P(D)

Genesis of the Bayesian interpretation of probability & Bayesian statistics

Theories have to be judged in terms of their probabilities in light of the evidence.

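As a minimal numerical illustration of Bayes' rule (a sketch, not from the talk; the prior, noise level, and data values are all invented), a grid discretization of θ exhibits the proportionality P(θ|D) ∝ P(D|θ) P(θ) directly:

```python
import math

# Toy grid-based Bayes update for one uncertain parameter theta:
# assumed N(0, 1) prior and Gaussian likelihood with known noise sigma.
thetas = [i * 0.01 for i in range(-200, 201)]      # candidate values on a grid
prior = [math.exp(-0.5 * t ** 2) for t in thetas]  # unnormalized N(0, 1) prior
data = [0.9, 1.1, 1.0]                             # invented observations
sigma = 0.5                                        # assumed noise level

# P(theta|D) proportional to P(D|theta) * P(theta)
post = [pr * math.exp(sum(-0.5 * ((d - t) / sigma) ** 2 for d in data))
        for t, pr in zip(thetas, prior)]
z = sum(post)                                      # proportional to the evidence P(D)
post = [p / z for p in post]

map_theta = thetas[post.index(max(post))]          # posterior mode
```

The posterior mode lands between the prior mean (0) and the data mean (1.0), pulled toward the data because three observations at noise 0.5 carry more information than the unit-variance prior.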

Principles of Validation & Uncertainty Quantification

Posing a Validation Process


Validation of physical models, uncertainty models & their calibration
Two types of validation question:
I Are there unanticipated phenomena affecting the system?
I Do modeling assumptions yield acceptable predictions of the QoIs?
Challenge the model with validation data (generally not of QoIs)
I Use observations that challenge modeling assumptions
I Use observations that are informative for the QoIs
Are discrepancies between model & data significant?
I Can they be explained by plausible errors due to modeling
assumptions?
I If so, what is their impact on the prediction of the QoIs?

Validation expectations are model dependent


Interpolation models: simple fit to data
Physics-based models: formulated from theory ⇒ extrapolatable

Principles of Validation & Uncertainty Quantification

Uncertainty
Need to Treat Uncertainty in these Processes
Mathematical representation of uncertainty (Bayesian probability)
Uncertainty models
Probabilistic calibration & validation processes (Bayesian inference)

Modeling Uncertainty
Uncertainty in data
I instrument error & noise
I inadequacy of instrument models (a la Quine)
Uncertainty due to model inadequacy
I Represent errors introduced by modeling assumptions.
I Impact of these errors within the accepted theoretical framework


Principles of Validation & Uncertainty Quantification

Stochastic Extension of Physical Models


Physics
Mathematical representation of physical phenomena of interest
At macroscale, usually deterministic (e.g., RANS)

Experimental Uncertainty
Model for uncertainty introduced by imperfections in observations used to
set model parameters (calibrate)

Model Uncertainty
Model for uncertainty introduced by imperfections in physical model

Prior Information
Any relevant information not encoded in above models

Principles of Validation & Uncertainty Quantification

Model Likelihood

p(D|θ) = ∫ p(D|D_true, θ) p(D_true|θ) dD_true
           (experimental uncertainty) (prediction model)

p(D_true|θ) = ∫ p(D_true|D_phys, θ) p(D_phys|θ) dD_phys
                (model uncertainty) (physical model)

Physics + model uncertainty = Prediction model


Prediction model + experimental uncertainty = Likelihood
Different/further decomposition possible depending on available
information
Models coupled with prior form a stochastic model class


Principles of Validation & Uncertainty Quantification

UQ Using Stochastic Model Classes


Processes
Single Model Class, M
I Calibration: p(θ|D) ∝ p(θ) p(D|θ)
I Prediction: p(q|D) = ∫ p(q|θ, D) p(θ|D) dθ
I Experimental design
Multiple Model Classes, M = {M1, . . . , MN}
I Calibration, prediction, and experimental design with each model class
I Model comparison/selection: P(Mi|D, M) ∝ P(Mi|M) p(D|Mi)
I Prediction averaging: p(q|D, M) = Σi p(q|D, Mi) P(Mi|D, M)

Software used at PECOS


DAKOTA: Forward propagation
QUESO: Calibration, model comparison
I Metropolis-Hastings, DRAM, Adaptive Multi-Level Sampling
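The Metropolis-Hastings step is simple to sketch. The toy sampler below is an illustrative stand-in, not QUESO's implementation; the prior, likelihood, and data are invented. It draws from p(θ|D) ∝ p(θ) L(θ; D) using only the unnormalized log posterior:

```python
import math
import random

# Toy random-walk Metropolis-Hastings sampler for p(theta|D);
# assumed N(0, 1) prior and Gaussian likelihood, invented data.
def log_post(theta, data, sigma=0.5):
    lp = -0.5 * theta ** 2                                      # N(0, 1) prior
    lp += sum(-0.5 * ((d - theta) / sigma) ** 2 for d in data)  # Gaussian likelihood
    return lp

def metropolis_hastings(data, n_steps=20000, step=0.3, seed=1):
    rng = random.Random(seed)
    theta, chain = 0.0, []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)   # symmetric random-walk proposal
        # accept with probability min(1, p(prop|D) / p(theta|D))
        if math.log(rng.random()) < log_post(prop, data) - log_post(theta, data):
            theta = prop
        chain.append(theta)
    return chain

chain = metropolis_hastings([0.9, 1.1, 1.0])
post_mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
```

Production samplers such as DRAM or Adaptive Multi-Level Sampling refine exactly this accept/reject kernel with delayed rejection, adaptive proposals, and tempering.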


Principles of Validation & Uncertainty Quantification

Information Theoretic Interpretation


Rearranging Bayes' formula:

P(d|M1) = P(θ|M1) P(d|θ, M1) / P(θ|d, M1)

Taking the log and integrating against the posterior P(θ|d, M1), which integrates to 1:

ln[P(d|M1)] = ∫ ln[P(d|θ, M1)] P(θ|d, M1) dθ − ∫ ln[P(θ|d, M1)/P(θ|M1)] P(θ|d, M1) dθ

Then:

ln[P(d|M1)] = E[ln P(d|θ, M1)] − E[ln(P(θ|d, M1)/P(θ|M1))]

where ln[P(d|M1)] is the log evidence, E[ln P(d|θ, M1)] measures how well the model class fits the data, and E[ln(P(θ|d, M1)/P(θ|M1))] measures how much the model class learns from the data (expected information gain, EIG)
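The decomposition above is an exact identity, which can be checked in closed form for a conjugate Normal-Normal model (a sketch under invented values; the function and variable names are not from the talk):

```python
import math

def gauss_logpdf(x, mu, var):
    return -0.5 * math.log(2.0 * math.pi * var) - (x - mu) ** 2 / (2.0 * var)

def evidence_decomposition(d, var0, var_noise):
    # Conjugate update: prior theta ~ N(0, var0), datum d ~ N(theta, var_noise)
    var_post = 1.0 / (1.0 / var0 + 1.0 / var_noise)
    mu_post = var_post * d / var_noise
    # log evidence: marginally d ~ N(0, var0 + var_noise)
    log_evidence = gauss_logpdf(d, 0.0, var0 + var_noise)
    # E_post[ln p(d|theta)]: "how well the model class fits the data";
    # under the posterior, E[(d - theta)^2] = (d - mu_post)^2 + var_post
    fit = -0.5 * math.log(2.0 * math.pi * var_noise) \
        - ((d - mu_post) ** 2 + var_post) / (2.0 * var_noise)
    # KL(posterior || prior) between Gaussians: the expected information gain
    eig = 0.5 * (math.log(var0 / var_post) + (var_post + mu_post ** 2) / var0 - 1.0)
    return log_evidence, fit, eig

le, fit, eig = evidence_decomposition(d=1.5, var0=4.0, var_noise=1.0)
```

With the chosen (arbitrary) numbers, the log evidence equals the fit term minus the EIG to machine precision, and the EIG is strictly positive, as a KL divergence must be.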

Principles of Validation & Uncertainty Quantification

Summary: Four Stage Bayesian Framework


Stochastic Model Development
Generate extension of physical model to enable probabilistic analysis
Closure parameters viewed as random variables
Stochastic representations of model and experimental errors

Calibration
Bayesian update for parameters: p(θ|D) ∝ p(θ) L(θ; D)

Prediction
Forward propagation of uncertainty using stochastic model

Model Comparison
Bayesian update for plausibility: P(Mj|D, M) ∝ P(Mj|M) E(Mj; D)

Data Reduction Modeling

Data Reduction Modeling


Why is it needed?
Very likely that quantities we wish to measure are not directly
measurable in an experiment
Have to infer the values from other measurements using a
mathematical model
Estimate/recover uncertainties in legacy experimental data

Impact on Validation and UQ


All mathematical models must be validated
Must incorporate uncertainty of both the measurements and the data
reduction model into the final uncertainty quantification of the data


Data Reduction Modeling

Data Reduction Modeling


Traditional calibration schematic:
[Schematic: Data → Inversion Process → Model → Validation Process]

Data Reduction Modeling

Data Reduction Modeling


Incorporation of data reduction model:

[Schematic: DATA and an Instrument model feed calibration and validation of the Data Reduction Model; the reduced data then feed calibration and validation of the physics Model]

Data Reduction Modeling

DRM Example: Surface/Wall Catalysis


Motivation
Surface/wall catalysis plays a critical role in surface heat flux on a
re-entry vehicle
Reported estimates of surface reaction efficiency vary by a few orders
of magnitude¹ (often a supercatalytic wall is assumed²)

¹ Zhang et al., AIAA-2009-4251
² Wright et al., AIAA-2004-2455

Data Reduction Modeling

Experimental Setup

[Figure: experimental apparatus, from Zhang et al., AIAA-2009-4251]



Data Reduction Modeling

Plug Flow Reactor Model (1-D): PFRM

d(vC_N)/dx = −γ_N v_N^th C_N/d_eff − 2 k_NN C_N² C_N2

m_C = d_s t M_C ∫₀^Ls R_s(x) dx

Two recombination mechanisms:

Recombination at tube surface: γ_N = γ_N^a T^n_N

Recombination in gas phase: k_NN = A_NN e^(−E^a_NN/(RT))

Gas-surface mechanism:
Reaction flux: R_s(x) = γ v_N^th C_N(x)/4
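A forward solve of the species balance is a simple march in x. The sketch below uses a forward-Euler step; every parameter value (γ_N, v_N^th, d_eff, reactor length, etc.) is an invented placeholder, not a calibrated value from the experiment:

```python
import math

# Forward-Euler march of the 1-D plug-flow species balance. All parameter
# values here are invented placeholders, not calibrated ones from the talk.
def pfr_profile(c0, v=100.0, gamma=1e-3, v_th=600.0, d_eff=0.02,
                k_nn=0.0, c_n2=0.0, length=0.3, n=3000):
    dx = length / n
    c, profile = c0, [c0]
    for _ in range(n):
        # d(v C_N)/dx = -gamma v_th C_N / d_eff - 2 k_NN C_N^2 C_N2
        dcdx = (-gamma * v_th * c / d_eff - 2.0 * k_nn * c * c * c_n2) / v
        c = max(c + dcdx * dx, 0.0)     # concentrations cannot go negative
        profile.append(c)
    return profile

profile = pfr_profile(c0=1.0)   # surface recombination only (k_nn = 0)
```

With gas-phase recombination switched off, the balance reduces to linear first-order decay, so the numerical profile can be checked against the exponential solution exp(−γ v_th x / (d_eff v)).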

Data Reduction Modeling

Synthetic Data : Joint PDFs of Parameters

[Figure: joint posterior PDFs of the PFRM parameters, calibrated to synthetic data generated with
d(vC_N)/dx = −γ_N T^n_N v_N^th C_N/d_eff − 2 A_NN e^(−E^a_NN/(RT)) C_N² C_N2,
m_C = d_s t M_C (v_N^th/4) ∫₀^Ls C_N(x) dx]

Data Reduction Modeling

Observed Data (35 runs) : Posterior PDFs of Parameters


[Figure: joint and marginal posterior PDFs of the PFRM parameters (normalized as log10 ratios and differences from nominal values) calibrated to the observed data]

Multiple model formulation: physics model & a stochastic extension for model
inadequacy (multiplicative error on m)

Physics model w/o inadequacy model has null plausibility


One or more modeling assumptions invalid

Data Reduction Modeling

A Look Back at Modeling Assumptions

Assumptions:
no radial gradients in species concentration: C_N(x)
bulk flow velocity: v(x)
effective diameter: d_eff
concentration at sample surface is same as bulk: C_N,ws(x)

All these assumptions are consequences of the 1-D plug flow approximation
⇒ need at least a 2-D (axisymmetric) model

Model Inadequacy Modeling

Model Inadequacy Models

Stochastic model of errors in physical model


I Model form uncertainties
I Incomplete dependencies
I Poor functional forms
Important for at least two reasons:
I Invalidate physics model in Bayesian multi-model formulation
I Uncertainty in predictions due to model inadequacy
For prediction, extrapolate uncertainty model
I Uncertainty of unreliable modeled quantity, not model output
I Uncertainty model formulated consistent with physical characteristics
of modeled quantity


Model Inadequacy Modeling

Example: RANS Turbulence Models


Motivation
Majority of engineering simulations of turbulent flows use
Reynolds-averaged Navier-Stokes (RANS) models
RANS models for the Reynolds stress (RS) are well known to be imperfect and unreliable
RS models embedded in reliable mean momentum equation
Uncertainty due to uncertain parameters
I Model parameters are not constants of nature
Uncertainty due to model inadequacy
I Closure models are not physical laws
Must quantify effects of these uncertainties on model predictions

Approach
Formulate stochastic models to represent uncertainty
Use Bayesian probabilistic approach to calibrate and compare models

Model Inadequacy Modeling

Stochastic Model Overview


Models
Four competing RANS turbulence (physics) models
I Baldwin-Lomax; Spalart-Allmaras; Chien low-Re k-ε; Durbin's v²-f
Five competing model uncertainty representations
I Denial model (no model inadequacy)
I Three velocity-based with different spatial correlation assumptions
I One Reynolds stress-based
Twenty total stochastic models

Calibration Data and Prediction QoI


Fully-developed, incompressible channel flow
Calibrate using DNS data for Re_τ ≈ 1000, 2000 (Re_τ = u_τ δ/ν)
Predict centerline velocity at Re_τ = 5000


Model Inadequacy Modeling

Physical Modeling Approach


RANS
Decompose flow into mean and fluctuating parts: u = ū + u′.
Average the Navier-Stokes equations. For channel flow,

d/dη [ (1/Re_τ) dū⁺/dη + τ⁺ ] = −1,

where ū⁺ = ū/u_τ, τ⁺ = −⟨u′v′⟩/u_τ², u_τ² = ν dū/dy|wall, η = y/δ.
Model the Reynolds stress using an eddy viscosity: ν_t dū/dy
Make up or choose a turbulence model for ν_t (Baldwin-Lomax,
Spalart-Allmaras, k-ε, k-ω, ...)

Key Point
Model inadequacy introduced by closure modeli.e., combination of eddy
viscosity assumption and model for t

Model Inadequacy Modeling

Stochastic Models: Basic Ideas


Goal
Create model that produces distribution over mean velocity fields
Incorporate information from RANS solution
Model the fact that RANS solution represents incomplete knowledge

Simple Example
⟨u⟩(y; θ, ω) = ū(y; θ) + ε(y; ω)
ū is the RANS mean velocity
ε is a random field representing uncertainty due to RANS infidelity
⟨u⟩ is the stochastic prediction of the true mean velocity

Issues
Where to introduce the uncertainty model representing model inadequacy
Details of that model (e.g., distribution for ε, dependence on scenario)

Model Inadequacy Modeling

Velocity-Based Stochastic Models


Multiplicative Gaussian Error
⟨u⟩⁺(η; θ, ω) = (1 + ε(η; ω)) ū⁺(η; θ)
where ε is a zero-mean Gaussian random field.

Covariance Structures
Independent: cov(ε(η), ε(η′)) = σ² δ(η − η′)
Correlated (homogeneous): cov(ε(η), ε(η′)) = σ² exp(−(η − η′)²/(2ℓ²))
Correlated (inhomogeneous):

cov(ε(η), ε(η′)) = σ² [2 ℓ(η) ℓ(η′)/(ℓ²(η) + ℓ²(η′))]^(1/2) exp(−(η − η′)²/(ℓ²(η) + ℓ²(η′)))

where
ℓ(η) = ℓ_in for η < η_in
     = ℓ_in + (ℓ_out − ℓ_in)(η − η_in)/(η_out − η_in) for η_in ≤ η ≤ η_out
     = ℓ_out for η > η_out

Model Inadequacy Modeling

Inhomogeneous Model Details

ℓ(η) = ℓ_in for η < η_in
     = ℓ_in + (ℓ_out − ℓ_in)(η − η_in)/(η_out − η_in) for η_in ≤ η ≤ η_out
     = ℓ_out for η > η_out

All length scales non-dimensionalized by channel height
Inner lengths scale with viscous length, not channel height
Rewrite inner variables using viscous scales: ℓ_in = ℓ⁺_in/Re_τ, η_in = η⁺_in/Re_τ
Length scales (ℓ⁺_in, ℓ_out), blend points (η_in, η_out), and variance σ² are
calibration parameters
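The inhomogeneous covariance and its piecewise length scale are straightforward to assemble. The sketch below uses made-up values for σ², ℓ_in, ℓ_out, η_in, and η_out (not the calibrated ones); it builds the kernel matrix on a coarse grid and draws one realization of ε via a Cholesky factor:

```python
import math
import random

# Piecewise-linear length scale blending inner and outer layers;
# all numerical values are invented for illustration.
def ell(eta, l_in=0.01, l_out=0.2, eta_in=0.1, eta_out=0.4):
    if eta < eta_in:
        return l_in
    if eta > eta_out:
        return l_out
    return l_in + (l_out - l_in) * (eta - eta_in) / (eta_out - eta_in)

# Inhomogeneous (Gibbs-type) covariance kernel
def cov(eta1, eta2, sigma2=0.01):
    l1, l2 = ell(eta1), ell(eta2)
    s = l1 * l1 + l2 * l2
    return sigma2 * math.sqrt(2.0 * l1 * l2 / s) * math.exp(-(eta1 - eta2) ** 2 / s)

grid = [i / 20.0 for i in range(21)]
K = [[cov(a, b) for b in grid] for a in grid]
for i in range(21):
    K[i][i] += 1e-8                       # small nugget for numerical stability

def cholesky(A):                          # dense lower-triangular factor
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

rng = random.Random(0)
Lf = cholesky(K)
z = [rng.gauss(0.0, 1.0) for _ in grid]
eps = [sum(Lf[i][k] * z[k] for k in range(i + 1)) for i in range(21)]
```

By construction the sampled field is rough near the wall (short ℓ) and smooth in the core (long ℓ), which is exactly what the two-layer structure of channel flow calls for.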


Model Inadequacy Modeling

Covariance Models

[Figure: sample realizations of the velocity uncertainty model under the inhomogeneous and homogeneous covariance structures]

Inhomogeneous covariance enables better representation of the two-layer structure of channel flow


Model Inadequacy Modeling

Reynolds Stress-Based Stochastic Models


Motivation
Structure of the RANS equations is not uncertain
Only the closure (i.e., Reynolds stress tensor field) is uncertain
Formulate to be more generally applicable

Additive Model
⟨u′v′⟩⁺(η; θ, ω) = T⁺(η; θ) + ε(η; ω), where T⁺ is obtained by solving
RANS + turbulence model
Find ⟨u⟩ by forward propagation through the mean momentum equation

d/dη [ (1/Re_τ) d⟨u⟩⁺/dη − ⟨u′v′⟩⁺ ] = −1,

ε chosen to be a zero-mean Gaussian random field with

cov(ε(η), ε(η′)) = k_in(η, η′) + k_out(η, η′),
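Given any sampled Reynolds-stress profile, ⟨u⟩⁺ follows by a single quadrature: integrating the mean momentum balance from the wall gives (1/Re_τ) d⟨u⟩⁺/dη = (1 − η) + ⟨u′v′⟩⁺. The sketch below uses an invented deterministic damping profile for −⟨u′v′⟩⁺ purely to exercise the quadrature; it is not one of the talk's calibrated models and is not quantitatively accurate:

```python
import math

# One quadrature of the mean momentum balance: integrating
# d/deta[(1/Re) d<u>+/deta - <u'v'>+] = -1 from the wall gives
# (1/Re) d<u>+/deta = (1 - eta) + <u'v'>+.
def centerline_velocity(re_tau, n=20000):
    deta = 1.0 / n
    u = 0.0
    for i in range(n):
        eta = (i + 0.5) * deta
        yplus = eta * re_tau
        # toy closure: -<u'v'>+ rises toward the total stress away from the wall
        tau = (1.0 - eta) * (1.0 - math.exp(-yplus / 26.0)) ** 2
        u += re_tau * ((1.0 - eta) - tau) * deta   # midpoint-rule quadrature
    return u                                       # q = <u>+(1)

u_cl = centerline_velocity(5000.0)
```

In the stochastic setting, the same quadrature is applied to each sample of ε, so the distribution over Reynolds-stress profiles maps directly to a distribution over the centerline velocity QoI.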



Model Inadequacy Modeling

Reynolds Stress-Based Stochastic Models


[Figure: uncertainty bands for the velocity (left) and Reynolds stress (right) profiles under the Reynolds stress-based model]

Large Reynolds stress uncertainty in inner layer


Model Inadequacy Modeling

Results Overview

Calibration:
I Joint posterior PDFs for model parameters for each model class
Model comparison:
I Posterior plausibility for each stochastic model class
I Examine joint and conditional plausibilities
QoI Prediction:
I Compare predictions (PDF for QoI) of each model class


Model Inadequacy Modeling

Sample Parameter Posterior PDFs


[Figure: marginal posterior PDFs of turbulence model parameters, e.g., p(c_v1) for Spalart-Allmaras and p(κ) for Chien k-ε]
Parameter joint posterior PDFs computed using the Adaptive Multi-Level
Sampling algorithm implemented in the QUESO library


Model Inadequacy Modeling

Joint Model Plausibility

Uncertainty Model    Baldwin   Spalart        Chien    Durbin
Denial               0         0              0        0
Independent          0         0              0        0
Homogeneous          0         0              0        0
Inhomogeneous        0         0              0.995    3.24 × 10⁻³
Reynolds Stress      0         1.36 × 10⁻³    0        0
Chien k-ε coupled with the inhomogeneous velocity uncertainty model is
strongly preferred


Model Inadequacy Modeling

Uncertainty Model Plausibility


Conditioned on turbulence model, which uncertainty model is preferred?
Uncertainty Model    Baldwin   Spalart        Chien    Durbin
Denial               0         0              0        0
Independent          0         0              0        0
Homogeneous          0         0              0        0
Inhomogeneous        1         6.69 × 10⁻³    1        0.9998
Reynolds Stress      0         0.993          0        1.86 × 10⁻⁵

Observations
Data prefers the inhomogeneous correlation structure for all models
Makes sense given the two-layer structure of the mean velocity profile


Model Inadequacy Modeling

Turbulence Model Plausibility

Conditioned on uncertainty model, which turbulence model is preferred?


Turb Model   Indep   Homog   Inhomog        Rey Stress
Baldwin      1       0.779   0              0
Spalart      0       0       1.01 × 10⁻⁵    0.99995
Chien        0       0       0.996          1.01 × 10⁻⁵
Durbin       0       0.221   3.33 × 10⁻³    4.11 × 10⁻⁵

Observation
Preferred turbulence model depends on uncertainty model


Model Inadequacy Modeling

QoI Predictions

[Figure: predicted PDFs p(q) of the QoI q = ⟨u⟩⁺(1), comparing uncertainty models (IND, SE, VLSE, ARSM) and turbulence models (BL, SA, Chien, v2f)]

Observations
Different stochastic model extensions lead to significantly different
uncertainty predictions
With the same stochastic extension, the turbulence models give similar
predictions for this QoI

Challenges

Some Important Challenges

Uncertainty modeling
I Data & model inadequacy models
I Correlation structure
Validation
I Posing appropriate validation questions
I Validating physics & uncertainty models
I Validating for use in prediction
Algorithms
I Usual problems: curse of dimensionality, expensive physics models
I With complex inadequacy models, likelihood evaluation is a
high-dimensional probability integral


Challenges

Thank you!
Questions?

