
Modern Applications of Experimental

Uncertainty Analysis

Glenn Steele
Department of Mechanical Engineering
Mississippi State University

and

Hugh Coleman
Propulsion Research Center
Department of Mechanical and Aerospace
Engineering
University of Alabama in Huntsville

www.uncertainty-analysis.com
Outline of Presentation

History
Current Standards
Regression Uncertainty
Code Verification and Validation
Uncertainty in the Design Process
Methodology Used in Kline and McClintock

$r = r(X_1, X_2, \ldots, X_J)$

Total Uncertainty

$$U_r^2 = \left(\frac{\partial r}{\partial X_1}\right)^2 U_{X_1}^2 + \left(\frac{\partial r}{\partial X_2}\right)^2 U_{X_2}^2 + \cdots + \left(\frac{\partial r}{\partial X_J}\right)^2 U_{X_J}^2$$
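As an illustration (not part of the original presentation), the root-sum-square propagation above can be sketched in Python, with the partial derivatives approximated by central finite differences; the function name is ours:

```python
import math

def propagate_uncertainty(r, X, U, h=1e-6):
    """Root-sum-square (Kline-McClintock) propagation of the uncertainties U
    in the variables X through the data reduction equation r, with the
    partial derivatives dr/dXi approximated by central finite differences."""
    total = 0.0
    for i in range(len(X)):
        step = h * max(abs(X[i]), 1.0)
        Xp, Xm = list(X), list(X)
        Xp[i] += step
        Xm[i] -= step
        dr_dXi = (r(Xp) - r(Xm)) / (2.0 * step)
        total += (dr_dXi * U[i]) ** 2
    return math.sqrt(total)

# Example: r = X1*X2 at (2, 3) with U = (0.1, 0.2):
# Ur = sqrt((3*0.1)^2 + (2*0.2)^2) = 0.5
Ur = propagate_uncertainty(lambda x: x[0] * x[1], [2.0, 3.0], [0.1, 0.2])
```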
Methodology Used in PTC 19.1-1985

95% Confidence Uncertainty

$$U_{RSS} = \left[ B^2 + (tS)^2 \right]^{1/2}$$

99% Confidence Uncertainty

$$U_{ADD} = B + tS$$
THE ISO GUM

The de facto international standard
Methodology Used in GUM

$r = r(X_1, X_2, \ldots, X_J)$

Combined Standard Uncertainty


$$u_c^2(r) = \sum_{i=1}^{J} \left(\frac{\partial r}{\partial X_i}\right)^2 u^2(X_i) + 2 \sum_{i=1}^{J-1} \sum_{k=i+1}^{J} \frac{\partial r}{\partial X_i}\,\frac{\partial r}{\partial X_k}\, u(X_i, X_k)$$

Expanded Uncertainty (At a Given % Confidence)

$$U_{\%} = k_{\%}\, u_c(r)$$
The GUM expresses uncertainty estimates, u(Xi), based on their source:

TYPE A evaluation (of uncertainty): method of evaluation of uncertainty by the statistical analysis of series of observations.

TYPE B evaluation (of uncertainty): method of evaluation of uncertainty by means other than the statistical analysis of series of observations.
Book Number D04598
Price $95.00
ASME Customer Service Dept
Box 2900
Fairfield NJ 07007-2900
800-843-2763
infocentral@asme.org
AIAA S-071A-1999

www.aiaa.org
Methodology Used in Engineering Standards

For
$r = r(X_1, X_2, \ldots, X_J)$
then
$$S_r^2 = \sum_{i=1}^{J} \left(\frac{\partial r}{\partial X_i}\right)^2 S_{X_i}^2$$
and
$$B_r^2 = \sum_{i=1}^{J} \left(\frac{\partial r}{\partial X_i}\right)^2 B_{X_i}^2 + 2 \sum_{i=1}^{J-1} \sum_{k=i+1}^{J} \frac{\partial r}{\partial X_i}\,\frac{\partial r}{\partial X_k}\, B_{X_i X_k}$$
and
$$U_{95} = 2\left[\left(\frac{B_r}{2}\right)^2 + S_r^2\right]^{1/2}$$
or

$$U_{95} = \left[ B_r^2 + P_r^2 \right]^{1/2}$$
where

$$P_r = 2 S_r$$
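A minimal numerical sketch of the two equivalent U95 forms above (the helper name is ours, not from the standard):

```python
import math

def u95(Br, Sr):
    """Large-sample 95% uncertainty from the engineering standards:
    U95 = 2*[(Br/2)^2 + Sr^2]^(1/2), equivalently [Br^2 + Pr^2]^(1/2)
    with Pr = 2*Sr."""
    form1 = 2.0 * math.sqrt((Br / 2.0) ** 2 + Sr ** 2)
    form2 = math.sqrt(Br ** 2 + (2.0 * Sr) ** 2)
    assert abs(form1 - form2) < 1e-12  # the two forms are algebraically identical
    return form1

U = u95(0.3, 0.2)  # Br = 0.3, Sr = 0.2 -> U95 = 0.5
```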
SYSTEMATIC ERROR (β) AND SYSTEMATIC UNCERTAINTY (B)

A useful approach to estimating the magnitude of a systematic error is to assume that the systematic error for a given case is a single realization drawn from some statistical parent distribution of possible systematic errors.

If the parent distribution is Gaussian, then the systematic uncertainty B corresponds to the 2S_B limit of a 95% confidence interval.

In the 1985 version of the ASME Standard, the 1st edition of Coleman & Steele, and the current AIAA Standard, this is called the Bias Limit (hence the symbol B).

Since the mid-1990s, it has been generally agreed to call the 95% confidence limit B_X the systematic uncertainty. This is the usage in the 2nd edition of Coleman & Steele and in the 1998 version of the ASME Standard.

Thus, a 95% estimate is the systematic uncertainty, B_X; a 68% estimate is the standard deviation of the bias error distribution, S_{B_X}; and so for a Gaussian bias error distribution

$$S_{B_X} = B_X / 2$$
In keeping with the nomenclature of the ISO Guide, uc is
called the combined standard uncertainty. For the data
reduction expression
$$r = r(X_1, X_2, \ldots, X_J)$$

uc is given by

$$u_c^2 = \sum_{i=1}^{J} \theta_i^2 S_{B_i}^2 + 2 \sum_{i=1}^{J-1} \sum_{k=i+1}^{J} \theta_i \theta_k S_{B_{ik}} + \sum_{i=1}^{J} \theta_i^2 S_i^2$$

where

$$\theta_i = \frac{\partial r}{\partial X_i}$$
To obtain an uncertainty Ur (termed the
expanded uncertainty in the ISO Guide) at some
specified confidence level (95%, 99%, etc), the
combined standard uncertainty uc must be multiplied
by a coverage factor, k%:

$$U_{\%} = k_{\%}\, u_c$$

The ISO Guide recommends that the appropriate value for k% is the t value for the specified confidence level at the degrees of freedom for the result, r.
The effective number of degrees of freedom ν_r for determining the t-value is given (approximately) by the so-called Welch-Satterthwaite formula as
$$\nu_r = \frac{\left[\sum_{i=1}^{J}\left(\theta_i^2 S_i^2 + \theta_i^2 S_{B_i}^2\right)\right]^2}{\sum_{i=1}^{J}\left[\frac{(\theta_i S_i)^4}{\nu_{S_i}} + \frac{(\theta_i S_{B_i})^4}{\nu_{B_i}}\right]}$$

where

$$\nu_{S_i} = N_i - 1, \qquad \nu_{B_i} \approx \frac{1}{2}\left(\frac{\Delta S_{B_i}}{S_{B_i}}\right)^{-2}$$
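The Welch-Satterthwaite estimate translates directly into code; this sketch (with our own naming) takes lists of sensitivity coefficients, standard uncertainties, and their degrees of freedom:

```python
def welch_satterthwaite(theta, S, SB, nuS, nuB):
    """Effective degrees of freedom nu_r for the result.

    theta    : sensitivity coefficients dr/dXi
    S, SB    : random and systematic standard uncertainties of each Xi
    nuS, nuB : degrees of freedom associated with each S and SB estimate
    """
    num = sum(t ** 2 * (s ** 2 + sb ** 2)
              for t, s, sb in zip(theta, S, SB)) ** 2
    den = sum((t * s) ** 4 / ns + (t * sb) ** 4 / nb
              for t, s, sb, ns, nb in zip(theta, S, SB, nuS, nuB))
    return num / den

# Sanity check: one variable, theta=1, S=1, SB=0, nu_S = 9 -> nu_r reduces to 9.
nu_r = welch_satterthwaite([1.0], [1.0], [0.0], [9.0], [1000.0])
```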
If the large sample assumption is made so that t = 2,
then the 95% confidence expression for Ur becomes
$$U_r^2 = \sum_{i=1}^{J} \theta_i^2 (2 S_{B_i})^2 + 2 \sum_{i=1}^{J-1} \sum_{k=i+1}^{J} \theta_i \theta_k (2)^2 S_{B_{ik}} + \sum_{i=1}^{J} \theta_i^2 (2 S_i)^2$$

Recalling the definition of systematic uncertainty, the (2S_{B_i}) factors are equal to the 95% confidence systematic uncertainties B_i.

The (2S_i) factors correspond to the 95% confidence random uncertainties P_i when all N_i ≥ 10.
The 95% confidence random uncertainty interval
around a single reading of a variable X.
For the large sample case with ν_r ≥ 9 and all N_i ≥ 10, we can define the systematic uncertainty (bias limit) of the result as
$$B_r^2 = \sum_{i=1}^{J} \theta_i^2 B_i^2 + 2 \sum_{i=1}^{J-1} \sum_{k=i+1}^{J} \theta_i \theta_k B_{ik}$$

and the random uncertainty (precision limit) of the result as

$$P_r^2 = \sum_{i=1}^{J} \theta_i^2 P_i^2$$

so that

$$U_r^2 = B_r^2 + P_r^2$$
EXAMPLE

$$Q = m c_p (T_2 - T_1)$$

$$\begin{aligned}
U_Q^2 = {}& \left(\frac{\partial Q}{\partial m}\right)^2 B_m^2 + \left(\frac{\partial Q}{\partial c_p}\right)^2 B_{c_p}^2 + \left(\frac{\partial Q}{\partial T_2}\right)^2 B_{T_2}^2 + \left(\frac{\partial Q}{\partial T_1}\right)^2 B_{T_1}^2 \\
& + 2\,\frac{\partial Q}{\partial T_1}\frac{\partial Q}{\partial T_2}\, B_{T_1 T_2} + \left(\frac{\partial Q}{\partial m}\right)^2 P_m^2 + \left(\frac{\partial Q}{\partial c_p}\right)^2 P_{c_p}^2 \\
& + \left(\frac{\partial Q}{\partial T_2}\right)^2 P_{T_2}^2 + \left(\frac{\partial Q}{\partial T_1}\right)^2 P_{T_1}^2
\end{aligned}$$
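A numerical sketch of this example (hypothetical values; `uq_heat_rate` is our own name). It also shows why the correlated term matters: when the two temperature probes share a calibration bias (so that B_{T1T2} = B_{T1} B_{T2}), that bias cancels out of the temperature difference:

```python
import math

def uq_heat_rate(m, cp, T1, T2, Bm, Bcp, BT1, BT2, BT1T2, Pm, Pcp, PT1, PT2):
    """95% uncertainty in Q = m*cp*(T2 - T1) per the propagation equation
    above, including the correlated systematic term B_T1T2."""
    dQdm, dQdcp = cp * (T2 - T1), m * (T2 - T1)
    dQdT2, dQdT1 = m * cp, -m * cp
    UQ2 = ((dQdm * Bm) ** 2 + (dQdcp * Bcp) ** 2
           + (dQdT2 * BT2) ** 2 + (dQdT1 * BT1) ** 2
           + 2.0 * dQdT1 * dQdT2 * BT1T2   # correlated term: note it subtracts
           + (dQdm * Pm) ** 2 + (dQdcp * Pcp) ** 2
           + (dQdT2 * PT2) ** 2 + (dQdT1 * PT1) ** 2)
    return math.sqrt(max(UQ2, 0.0))  # guard round-off when terms cancel exactly

# Identical probes sharing a 0.1-unit calibration bias (BT1T2 = 0.1*0.1):
# the correlated term cancels the temperature bias completely -> UQ ~ 0.
UQ = uq_heat_rate(1.0, 1.0, 300.0, 350.0,
                  Bm=0.0, Bcp=0.0, BT1=0.1, BT2=0.1, BT1T2=0.01,
                  Pm=0.0, Pcp=0.0, PT1=0.0, PT2=0.0)
```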
The GUM and Engineering Standards Uncertainty
Analysis Methodologies are Identical

GUM
Considers source of uncertainty: Type A or Type B uncertainties

PTC 19.1-1998, etc.
Considers effect of uncertainty on variable Xi: Systematic or Random uncertainties

International Presentation of Uncertainty
Show source and effect for uncertainties:
B_{X,A}, B_{X,B}, P_{X,A}, P_{X,B} or S_{B_{X,A}}, S_{B_{X,B}}, S_{X,A}, S_{X,B}
Uncertainties
and
Regressions
Introduction

When experimental information is represented using a regression, what is the uncertainty that should be associated with the use of that regression?

[Figure: straight-line fit Y = mX + c through data points (X1,Y1), (X2,Y2), (X3,Y3), with the value Y(Xnew) read from the line at Xnew]

Consider a 1st order least squares regression

$$Y = m X_{new} + c$$

$$m = \frac{N \sum_{i=1}^{N} X_i Y_i - \sum_{i=1}^{N} X_i \sum_{i=1}^{N} Y_i}{N \sum_{i=1}^{N} X_i^2 - \left(\sum_{i=1}^{N} X_i\right)^2}$$

$$c = \frac{\sum_{i=1}^{N} X_i^2 \sum_{i=1}^{N} Y_i - \sum_{i=1}^{N} X_i \sum_{i=1}^{N} X_i Y_i}{N \sum_{i=1}^{N} X_i^2 - \left(\sum_{i=1}^{N} X_i\right)^2}$$
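The closed-form sums above map directly to code; a small sketch (the helper name is ours):

```python
def linear_fit(X, Y):
    """First-order least-squares slope m and intercept c
    from the closed-form sums."""
    N = len(X)
    Sx = sum(X)
    Sy = sum(Y)
    Sxx = sum(x * x for x in X)
    Sxy = sum(x * y for x, y in zip(X, Y))
    denom = N * Sxx - Sx ** 2
    m = (N * Sxy - Sx * Sy) / denom
    c = (Sxx * Sy - Sx * Sxy) / denom
    return m, c

# Points lying exactly on Y = 2X + 1:
m, c = linear_fit([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
```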
Classical Random Uncertainty
The statistic that defines the standard deviation for a
straight-line curvefit is the standard error of
regression defined as
$$S_Y = \left[\frac{1}{N-2} \sum_{i=1}^{N} (Y_i - m X_i - c)^2\right]^{1/2}$$

For a value of Xnew, the (large sample) random uncertainty associated with the Ynew obtained from the curvefit is

$$P_Y = 2 S_Y \left[\frac{1}{N} + \frac{(X_{new} - \bar{X})^2}{S_{XX}}\right]^{1/2}$$

where

$$S_{XX} = \sum_{i=1}^{N} X_i^2 - \frac{\left(\sum_{i=1}^{N} X_i\right)^2}{N}$$
Key assumptions:
the random uncertainty in Y is constant over the range of the
curvefit
there is no random uncertainty in the X variable
there is no systematic uncertainty in either variable

In many (if not most) instances, these assumptions


do not hold and there is uncertainty in Ynew that is not
accounted for.
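Under the classical assumptions listed above, the P_Y computation can be sketched as follows (the function name is ours):

```python
import math

def regression_random_uncertainty(X, Y, m, c, x_new):
    """Large-sample 95% random uncertainty P_Y of the curvefit value
    at x_new, using the standard error of regression S_Y."""
    N = len(X)
    SY = math.sqrt(sum((y - m * x - c) ** 2 for x, y in zip(X, Y)) / (N - 2))
    Sxx = sum(x * x for x in X) - sum(X) ** 2 / N
    xbar = sum(X) / N
    return 2.0 * SY * math.sqrt(1.0 / N + (x_new - xbar) ** 2 / Sxx)

m, c = 1.3, -0.2  # least-squares fit of the data below
PY = regression_random_uncertainty([0.0, 1.0, 2.0, 3.0],
                                   [0.0, 1.0, 2.0, 4.0], m, c, 1.5)
```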
Consider the Range of Regression
Uncertainty Cases of Engineering Interest

Uncertainty of regression coefficients: Um, Uc


Uncertainty of Y value from regression: UY(Xnew)
some or all (Xi,Yi) data pairs from different
experiments
all (Xi,Yi) data pairs from same experiment
Xnew from same apparatus
Xnew from different apparatus
Xnew with no uncertainty
Regression variables as functions: (Xi ,Yi ) not
measured directly

A Comprehensive Methodology that Covers All of the Cases:

Brown, Coleman, and Steele, "A Methodology for Determining the Experimental Uncertainty in Regressions," J. of Fluids Engineering, Vol. 120, No. 3, 1998, pp. 445-456.

Presented in detail and with examples in Chapter 7 of Coleman and Steele.
Methodology: Treat regression expression as
a data reduction equation
$$Y = Y(X_1, \ldots, X_N, Y_1, \ldots, Y_N, X_{new})$$

$$Y(X_{new}) = \frac{N \sum_{i=1}^{N} X_i Y_i - \sum_{i=1}^{N} X_i \sum_{i=1}^{N} Y_i}{N \sum_{i=1}^{N} X_i^2 - \left(\sum_{i=1}^{N} X_i\right)^2}\, X_{new} + \frac{\sum_{i=1}^{N} X_i^2 \sum_{i=1}^{N} Y_i - \sum_{i=1}^{N} X_i \sum_{i=1}^{N} X_i Y_i}{N \sum_{i=1}^{N} X_i^2 - \left(\sum_{i=1}^{N} X_i\right)^2}$$

and apply the uncertainty propagation equations
$$B_r^2 = \sum_{i=1}^{J} \left(\frac{\partial r}{\partial X_i}\right)^2 B_i^2 + 2 \sum_{i=1}^{J-1} \sum_{k=i+1}^{J} \frac{\partial r}{\partial X_i}\frac{\partial r}{\partial X_k}\, B_{ik}$$

$$P_r^2 = \sum_{i=1}^{J} \left(\frac{\partial r}{\partial X_i}\right)^2 P_i^2 + 2 \sum_{i=1}^{J-1} \sum_{k=i+1}^{J} \frac{\partial r}{\partial X_i}\frac{\partial r}{\partial X_k}\, P_{ik}$$
Monte Carlo Simulations Performed
1st order regression coefficients
studied effect of sample size
1st order regression Y uncertainty
Polynomial regression Y uncertainty
Functions as Regression Variables
1st Order and Polynomial
Type of dominant uncertainty
Percent of reading type uncertainties
Percent of full scale type uncertainties

All Simulations Indicated This Methodology Provides Appropriate Uncertainty Intervals
[Figure: three data points (X1,Y1), (X2,Y2), (X3,Y3) with the fitted line and the value Y(Xnew) read at Xnew]

$$Y(X_{new}) = m(X_1, X_2, X_3, Y_1, Y_2, Y_3)\, X_{new} + c(X_1, X_2, X_3, Y_1, Y_2, Y_3)$$

$$\begin{aligned}
U_Y^2 = {}& \sum_{i=1}^{3}\left(\frac{\partial Y}{\partial Y_i}\right)^2 P_{Y_i}^2 + \sum_{i=1}^{3}\left(\frac{\partial Y}{\partial X_i}\right)^2 P_{X_i}^2 \\
& + \sum_{i=1}^{3}\left(\frac{\partial Y}{\partial Y_i}\right)^2 B_{Y_i}^2 + 2\sum_{i=1}^{2}\sum_{k=i+1}^{3}\frac{\partial Y}{\partial Y_i}\frac{\partial Y}{\partial Y_k}\, B_{Y_i Y_k} \\
& + \sum_{i=1}^{3}\left(\frac{\partial Y}{\partial X_i}\right)^2 B_{X_i}^2 + 2\sum_{i=1}^{2}\sum_{k=i+1}^{3}\frac{\partial Y}{\partial X_i}\frac{\partial Y}{\partial X_k}\, B_{X_i X_k} + 2\sum_{i=1}^{3}\sum_{k=1}^{3}\frac{\partial Y}{\partial X_i}\frac{\partial Y}{\partial Y_k}\, B_{X_i Y_k} \\
& + \left(\frac{\partial Y}{\partial X_{new}}\right)^2 B_{X_{new}}^2 + \left(\frac{\partial Y}{\partial X_{new}}\right)^2 P_{X_{new}}^2 + 2\sum_{i=1}^{3}\frac{\partial Y}{\partial X_{new}}\frac{\partial Y}{\partial X_i}\, B_{X_{new} X_i} + 2\sum_{i=1}^{3}\frac{\partial Y}{\partial X_{new}}\frac{\partial Y}{\partial Y_i}\, B_{X_{new} Y_i}
\end{aligned}$$

Note:
(1) the first summation on the RHS produces a PY estimate identical to that of the classical method, and
(2) the derivatives with respect to Xi and Yi are functions of Xnew.
Reporting Uncertainty UY of Y Value from the Regression

[Figure: the fitted line Y(X) = mX + c with uncertainty bands Y(X) + UY(X) and Y(X) - UY(X)]

Since UY is a function of Xnew, we would have to carry along the entire data set to calculate a value for UY(Xnew) each time we wanted one!!!
SOLUTION:
Report the uncertainty as an equation

$$U_{Y\text{-}regress} = f(X)$$

developed by curvefitting a set of (X, UY) points generated from the uncertainty propagation expression, then combine that with the uncertainties associated with Xnew to obtain the overall uncertainty in the Y from the regression:

$$(U_Y)^2 = U_{Y_{regress}}^2 + \left(\frac{\partial Y}{\partial X_{new}}\right)^2 \left[B_{X_{new}}^2 + P_{X_{new}}^2\right]$$

Some subtleties associated with this are discussed in detail in Chapter 7.
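Combining the reported curvefit value U_{Y-regress} with the Xnew uncertainties is then a one-line root-sum-square; a sketch with hypothetical values (the function name is ours):

```python
import math

def overall_uy(u_y_regress, dY_dXnew, B_Xnew, P_Xnew):
    """Combine the regression-uncertainty curvefit value with the systematic
    and random uncertainties of the new X measurement."""
    return math.sqrt(u_y_regress ** 2
                     + dY_dXnew ** 2 * (B_Xnew ** 2 + P_Xnew ** 2))

# Hypothetical values: U_Y-regress = 0.3, slope dY/dXnew = 2, B = P = 0.1
UY = overall_uy(0.3, 2.0, 0.1, 0.1)  # sqrt(0.09 + 4*0.02) = sqrt(0.17)
```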
Detailed Example:
Using a venturi in an experiment to determine a flow rate.

$$Q = C_d(Re_{new})\, K d^2 \sqrt{\frac{2\,\Delta P_{new}}{\rho\left[1 - (d/D)^4\right]}}$$

Perform a calibration and curvefit the data:

$$C_d(Re) = a_0(X_1, \ldots, Y_N) + a_1(X_1, \ldots, Y_N)\, Re$$

Substitute into the 1st equation and solve for Q to obtain the equation used in the test. That equation for Q is

$$Q = \frac{a_0 K d^2 \sqrt{\dfrac{2\,\Delta P_{new}}{\rho\left[1 - (d/D)^4\right]}}}{1 - \dfrac{4 a_1 K d}{\pi \nu_{new}} \sqrt{\dfrac{2\,\Delta P_{new}}{\rho\left[1 - (d/D)^4\right]}}}$$
and the expressions for the uncertainty in the value of Q obtained from the equation are

$$U_{Q_{regress}} = 5\times10^{-16}\, Re_{new}^3 - 1\times10^{-10}\, Re_{new}^2 + 2\times10^{-5}\, Re_{new} - 0.1162$$

$$U_Q^2 = U_{Q_{regress}}^2 + \left(\frac{\partial Q}{\partial \Delta P_{new}}\right)^2 P_{\Delta P_{new}}^2$$
[Plot: U_{Q-regress} (gpm), spanning roughly 0.30 to 0.75 gpm, versus Reynolds number from 60,000 to 160,000]
Uncertainty Analysis

and the

Verification and Validation


(V&V) of Simulations
Brief Overview

Coleman, H.W. and Stern, F., 1997, "Uncertainties in CFD Code Validation," ASME J. Fluids Eng., Vol. 119, pp. 795-803. (Also see Authors' Closure, ASME J. Fluids Eng., Vol. 120, September 1998, pp. 635-636.)

Fred Stern Tutorial III, Thurs May 31, 4-6 pm: Code Verification and Validation
Suppose we have a simulation result. How good is it? The V&V process helps answer that question.

Consider the comparison between a simulation result and experimental data.
The uncertainties determine
(a) the scale at which meaningful comparisons can be made
(b) the lowest level at which validation is possible; i.e., the noise level

Thus, these uncertainties must be considered as part of the V&V process.

[Figure: experimental data with uncertainty bands r(X) + Ur(X) and r(X) - Ur(X) compared against the predicted r(X)]
The V&V Process
Preparation: Specification of objectives, validation
variables, validation set points, validation levels required,
etc.

Verification:
Are the equations solved correctly?
grid convergence studies, etc

Validation:
Are the correct equations being solved?
comparison with benchmark experimental data

Documentation
Consider a Validation Comparison:

Experimental result, D

Simulation result, S

Comparison error, E

$$E = D - S = \delta_D - \delta_S$$

δ_D = error in the data
δ_S = error in the simulation
The simulation error δ_S is composed of
errors δ_SN due to the numerical solution of the equations
errors δ_SPD due to the use of previous data (properties, etc.)
errors δ_SMA due to modeling assumptions

$$\delta_S = \delta_{SN} + \delta_{SPD} + \delta_{SMA}$$

Therefore, the comparison error E can be written as

$$E = D - S = \delta_D - \delta_S$$

or

$$E = \delta_D - \delta_{SN} - \delta_{SPD} - \delta_{SMA}$$
Consider the error equation

$$E = \delta_D - \delta_{SN} - \delta_{SPD} - \delta_{SMA}$$

When we don't know the value of an error δ_i, we estimate an uncertainty interval U_i that bounds δ_i.

The uncertainty interval UE which bounds the comparison error E is given by (assuming no correlations among the errors)

$$U_E^2 = \left(\frac{\partial E}{\partial D}\right)^2 U_D^2 + \left(\frac{\partial E}{\partial S}\right)^2 U_S^2 = U_D^2 + U_S^2$$

or

$$U_E^2 = U_D^2 + U_{SN}^2 + U_{SPD}^2 + U_{SMA}^2$$
$$E = \delta_D - \delta_{SN} - \delta_{SPD} - \delta_{SMA}$$

$$U_E^2 = U_D^2 + U_{SN}^2 + U_{SPD}^2 + U_{SMA}^2$$

The interval UE bounds E with 95% confidence; however, in reality we know of no a priori approach for estimating U_SMA, which precludes making an estimate of UE. In fact, one objective of a validation effort is to identify and estimate δ_SMA or U_SMA.

So, we define a validation uncertainty UV given by

$$U_V^2 = U_E^2 - U_{SMA}^2 = U_D^2 + U_{SN}^2 + U_{SPD}^2$$

The interval UV would contain E 95 times out of 100 if δ_SMA were zero. The verification process provides an estimate for U_SN.

The validation uncertainty UV is the key metric in the validation process. It is the noise level imposed by the uncertainties UD, USN, and USPD; thus, it is the lowest level at which validation can be achieved.
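In code, the validation comparison reduces to a few lines. This sketch (with our own naming) works in percent of the data value; applying it to the marine-propulsor thrust-coefficient numbers shown later, with U_SPD taken as negligible (our assumption), reproduces E ≈ -2.1% and UV ≈ 3.2%:

```python
import math

def validation_metrics(D, S, UD_pct, USN_pct, USPD_pct):
    """Comparison error E (as % of D) and validation uncertainty UV (%).
    Validation is achieved at the UV level when |E| < UV."""
    E = (D - S) / D * 100.0
    UV = math.sqrt(UD_pct ** 2 + USN_pct ** 2 + USPD_pct ** 2)
    return E, UV, abs(E) < UV

# D = 0.146, S = 0.149, UD = 2.0%, USN = 2.5%, USPD assumed ~0
E, UV, validated = validation_metrics(0.146, 0.149, 2.0, 2.5, 0.0)
```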

It can be argued that one cannot discriminate when |E| < UV; that is, one cannot evaluate the effectiveness of proposed model improvements since changes in δ_SMA cannot be distinguished. If |E| < UV, then validation has been achieved at the UV level, which is the best that can be done considering the existing uncertainties.

On the other hand, if |E| >> UV, then one could argue that probably δ_SMA ≈ E.
Another important metric is the required level of validation, Ureqd,
which is set by program objectives.

Thus, the three important quantities in evaluating the results of a validation effort are E, UV, and Ureqd. In comparing E, UV, and Ureqd there are six combinations:

1. |E| < UV < Ureqd
2. |E| < Ureqd < UV
3. Ureqd < |E| < UV
4. UV < |E| < Ureqd
5. UV < Ureqd < |E|
6. Ureqd < UV < |E|
1. |E| < UV < Ureqd

2. |E| < Ureqd < UV

3. Ureqd < |E| < UV

In cases 1, 2 and 3, |E| < UV; validation is achieved at the UV level; and E is below the noise level, so attempting to decrease the error due to the modeling assumptions, δ_SMA, is not feasible from an uncertainty standpoint.

In case 1, validation has been achieved at a level below Ureqd, so validation is successful from a programmatic standpoint.
4. UV < |E| < Ureqd

5. UV < Ureqd < |E|

6. Ureqd < UV < |E|

In cases 4, 5 and 6, |E| > UV, so E is above the noise level and using the sign and magnitude of E to estimate δ_SMA is feasible from an uncertainty standpoint. If |E| >> UV, then E corresponds to δ_SMA and the error from the modeling assumptions is determined unambiguously.

In case 4, |E| < Ureqd, so validation is successful at the |E| level from a programmatic standpoint.
Marine Propulsor Thrust Coefficient Validation

        D       S       E%     UV%    UD%    USN%   USN/UD
Kt      0.146   0.149   -2.1   3.2    2.0    2.5    1.3

CFDSHIP-IOWA code (Stern et al. (1996)); marine-propulsor flow data (Chen, 1996; Jessup, 1994)
Ship Wave Profile Validation
Traditional Comparison
Ship Wave Profile Validation
Coleman-Stern Comparison
UNCERTAINTY IN THE DESIGN PROCESS

With limited resources and time, where are these resources and time best spent to produce the optimum design?

Given the design process, available resources, and available time, will the design meet program goals?
Sample design process:
1. 1-D Meanline Code (Step 1)
2. 2-D/3-D Steady Codes (Step 2)
3. Baseline Design (Step 3)
4. 3-D Steady/Unsteady Codes (Step 4)
5. Design II (Step 5)
6. Cold-flow Testing/Code Validation (Step 6)
7. Design III (Step 7)
8. Prototype Manufacture (Step 8)
9. Hot-fire Testing (Step 9)
10. Final Design (Step 10 or n-2)
11. Final Product Manufacture (Step 11 or n-1)
12. Flight Test/Design Validation/Certification (Step 12 or n)
What is the Issue?
How good does the design have to be?
For a given design process, determine the overall
uncertainty in the design.
How do the individual steps in the design process
interact to produce the final design?
Determine how the uncertainty estimates for each step in
the design process propagate through the design
process to produce the overall uncertainty in the design.
What are the critical steps in the design process?
Use uncertainty techniques to identify the critical steps and improve these steps to ensure that the design meets the program goals.
EXAMPLE
Uncertainty in Design Modeling

Pump head required is determined from the model

$$w_p = \frac{\Delta P}{\rho} + g\,\Delta Z + \frac{8 Q^2}{\pi^2} \sum_{i=1}^{N_{pipes}} \frac{f_i \dfrac{L_i}{D_i} + K_i}{D_i^4}$$

where

$$f_i = \frac{0.25}{\left[\log_{10}\left(\dfrac{\varepsilon_i}{3.7 D_i} + \dfrac{5.74}{Re_i^{0.9}}\right)\right]^2}$$
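Both equations translate directly into a short script; a sketch with our own function names and hypothetical pipe parameters (the roughness and flow values below are assumptions for illustration only):

```python
import math

def friction_factor(eps, D, Re):
    """Explicit friction factor of the form used in the head model:
    f = 0.25 / [log10(eps/(3.7 D) + 5.74/Re^0.9)]^2."""
    return 0.25 / math.log10(eps / (3.7 * D) + 5.74 / Re ** 0.9) ** 2

def pump_work(dP, rho, dZ, Q, pipes, g=9.81):
    """Required pump work per unit mass:
    w_p = dP/rho + g*dZ + (8 Q^2 / pi^2) * sum((f*L/D + K)/D^4).
    Each pipe is (L, D, K, eps, Re); Re is supplied directly here,
    though in the full model it would be computed from Q."""
    s = sum((friction_factor(eps, D, Re) * L / D + K) / D ** 4
            for (L, D, K, eps, Re) in pipes)
    return dP / rho + g * dZ + 8.0 * Q ** 2 / math.pi ** 2 * s

# Hypothetical single-pipe system: 50 m of 10 cm pipe, K = 2, Re = 1e5
f = friction_factor(4.6e-5, 0.1, 1.0e5)   # roughly 0.02 at these conditions
wp = pump_work(dP=2.0e5, rho=870.0, dZ=5.0, Q=0.01,
               pipes=[(50.0, 0.1, 2.0, 4.6e-5, 1.0e5)])
```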
Example Oil Transfer System
Uncertainty Percentage Contributions

[Bar chart: uncertainty percentage contributions (0-60%) for the variables in the oil transfer system model, including viscosity, friction model, density, roughness, Delta Z, lengths L1-L3, diameters D1-D3, and loss coefficients K1-K3]