
Alternative Representations of Uncertainty
Jon C. Helton & Cédric J. Sallaberry

1
Alternative Representations of Uncertainty

• Probability Theory
• Evidence Theory (Dempster-Shafer Theory)
• Possibility Theory
• Interval Analysis

2
Probability

• Formal definition of probability involves three components


– A set 𝒮 that contains everything that could occur in the particular “universe” under
consideration
– A set 𝕊 of subsets of 𝒮 with the properties that (i) if ℰ ∈ 𝕊, then ℰC ∈ 𝕊 and (ii) if {ℰi} is a
countable collection of elements of 𝕊, then ∪iℰi and ∩iℰi are elements of 𝕊
– A function p defined for elements of 𝕊 such that (i) p(𝒮) = 1, (ii) if ℰ ∈ 𝕊, then 0 ≤ p(ℰ) ≤ 1,
and (iii) if {ℰi} is a countable collection of disjoint elements of 𝕊, then p(∪iℰi) = Σi p(ℰi)
• Triple (𝒮, 𝕊, p) is called a probability space
• Terminology
– 𝒮 called the sample space or universal set
– Elements of 𝒮 are called elementary events
– Elements of 𝕊 are called events
– p called a probability measure

3
Evidence Theory: Definition of Evidence Space

• Formal definition of an evidence theory representation of uncertainty involves three components
– A set 𝒮 that contains everything that could occur in the particular “universe” under
consideration
– A (countable) set 𝕊 of subsets of 𝒮
– A function m defined for subsets ℰ of 𝒮 such that (i) m(ℰ) > 0 if ℰ ∈ 𝕊, (ii) m(ℰ) = 0 if ℰ ∉ 𝕊,
and (iii) Σ ℰ∈𝕊 m(ℰ) = 1
• Triple (𝒮, 𝕊, m) is called an evidence space
• Terminology
– 𝒮 called the sample space or universal set
– Elements of 𝒮 are called elementary events
– Elements of 𝕊 are called focal elements
– m called a basic probability assignment (BPA)
• Nature of m(ℰ): Amount of “likelihood” that is associated with ℰ but
cannot be further partitioned over subsets of ℰ.
4
Evidence Theory: Representation of Uncertainty

• Representation of uncertainty
– Belief
– Plausibility

• Belief:
– Definition: Bel(ℰ) = Σ 𝒰⊂ℰ m(𝒰)
– Concept: Amount of “likelihood” that must be associated with ℰ.

• Plausibility:
– Definition: Pl(ℰ) = Σ 𝒰∩ℰ≠∅ m(𝒰)
– Concept: Amount of “likelihood” that could potentially be associated with ℰ.

5
Evidence Theory: Simple Example

𝒰1 = [1,4], m(𝒰1) = 1/5
𝒰2 = [3,7], m(𝒰2) = 1/5
𝒰3 = [5,6], m(𝒰3) = 1/5
𝒰4 = [5,10], m(𝒰4) = 1/5
𝒰5 = [9,10], m(𝒰5) = 1/5
ℰ = [2,8]

𝒮 = {x : x ∈ [1,10]}
𝕊 = {𝒰1, 𝒰2, 𝒰3, 𝒰4, 𝒰5}

Bel(ℰ) = Σ 𝒰i⊂ℰ m(𝒰i) = m(𝒰2) + m(𝒰3) = 2/5

Pl(ℰ) = Σ 𝒰i∩ℰ≠∅ m(𝒰i) = m(𝒰1) + m(𝒰2) + m(𝒰3) + m(𝒰4) = 4/5
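The counting behind these two numbers is easy to mechanize. Below is a minimal Python sketch (not part of the original slides) that recomputes Bel(ℰ) and Pl(ℰ) for these interval focal elements; the dictionary of focal elements and the containment/overlap helpers are illustrative assumptions.

```python
# Sketch: Bel and Pl for interval focal elements (assumed helper functions).
focal = {(1, 4): 0.2, (3, 7): 0.2, (5, 6): 0.2, (5, 10): 0.2, (9, 10): 0.2}
E = (2, 8)

def subset(u, e):            # [u0, u1] contained in [e0, e1]
    return e[0] <= u[0] and u[1] <= e[1]

def intersects(u, e):        # [u0, u1] and [e0, e1] overlap
    return u[0] <= e[1] and e[0] <= u[1]

bel = sum(m for u, m in focal.items() if subset(u, E))      # 0.4 = 2/5
pl  = sum(m for u, m in focal.items() if intersects(u, E))  # 0.8 = 4/5
print(bel, pl)
```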
6
Evidence Theory: Properties

Bel(ℰ) + Pl(ℰC) = 1
Bel(ℰ) + Bel(ℰC) ≤ 1
Pl(ℰ) + Pl(ℰC) ≥ 1
Bel(ℰ) ≤ Pl(ℰ)

Contrast with Probability

p(ℰ) + p(ℰC) = 1

7
Evidence Theory: Cumulative Representation
• Cumulative belief function (CBF)
• Cumulative plausibility function (CPF)
Analogous to a CDF: plots of the belief and plausibility of being less than specified values.

• Complementary cumulative belief function (CCBF)
• Complementary cumulative plausibility function (CCPF)
Analogous to a CCDF: plots of the belief and plausibility of being greater than specified values.

Figure: CBF, CCBF, CPF and CCPF for a variable v with values from the interval [1, 10] and each of the
following intervals assigned a BPA of 0.1: [1, 3], [1, 4], [1, 10], [2, 4], [2, 6], [5, 8], [5, 10], [7, 8],
[7, 10], [9, 10].
8
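As a hedged illustration of how such curves can be tabulated, the following Python sketch evaluates the four functions of the figure caption's interval BPAs at a few values of v; the helper functions and evaluation points are assumptions, not part of the slides.

```python
# Sketch: CBF/CPF/CCBF/CCPF values for the interval BPAs given in the figure caption.
intervals = [(1, 3), (1, 4), (1, 10), (2, 4), (2, 6), (5, 8), (5, 10), (7, 8), (7, 10), (9, 10)]
bpa = 0.1                                                           # each interval carries a BPA of 0.1

def cbf(v):   return sum(bpa for lo, hi in intervals if hi <= v)    # Bel(x <= v)
def cpf(v):   return sum(bpa for lo, hi in intervals if lo <= v)    # Pl(x <= v)
def ccbf(v):  return sum(bpa for lo, hi in intervals if lo > v)     # Bel(x > v)
def ccpf(v):  return sum(bpa for lo, hi in intervals if hi > v)     # Pl(x > v)

for v in (2, 5, 8):
    print(v, cbf(v), cpf(v), ccbf(v), ccpf(v))   # note CBF + CCPF = 1 and CPF + CCBF = 1
```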
Evidence Theory: Vector-valued Quantities

• x1, x2, …, xn real-valued with evidence spaces (𝒮i, 𝕊i, mi), i = 1, 2, …, n
• Evidence space (𝒮, 𝕊, m) for x = [x1, x2, …, xn]
• 𝒮 = 𝒮1 × 𝒮2 × … × 𝒮n
• ℰ ∈ 𝕊 iff ℰ = ℰ1 × ℰ2 × … × ℰn for ℰi ∈ 𝕊i
• m(ℰ) = m1(ℰ1) m2(ℰ2) … mn(ℰn) if ℰ = ℰ1 × ℰ2 × … × ℰn ∈ 𝕊, and m(ℰ) = 0 otherwise

• Belief, plausibility defined same as in one variable case

9
Evidence Theory: Function with Uncertain Arguments

• Function f(x) with evidence space (𝒳, 𝕏, mX) for x
• Resultant evidence space (𝒴, 𝕐, mY) for y = f(x)
• 𝒴 = {y : y = f(x), x ∈ 𝒳}
• 𝕐 = {ℰ : ℰ = f(𝒰), 𝒰 ∈ 𝕏}
• mY(ℰ) = mX(𝒰) if ℰ = f(𝒰), 𝒰 ∈ 𝕏, and mY(ℰ) = 0 otherwise
• In concept, belief and plausibility defined from 𝕐 and mY
• In computational practice, belief and plausibility obtained by mapping back to
evidence space (𝒳, 𝕏, mX) for x
– BelY(ℰ) = BelX({x : f(x) ∈ ℰ})
– PlY(ℰ) = PlX({x : f(x) ∈ ℰ})
• Cumulative and complementary cumulative results for y ∈ 𝒴
– CBF: [y, BelX({x : f(x) ≤ y})], CPF: [y, PlX({x : f(x) ≤ y})]
– CCBF: [y, BelX({x : y < f(x)})], CCPF: [y, PlX({x : y < f(x)})]

10
Evidence Theory: Example (1/3)

• Function y = f(x) = f(a, b) = (a + b)^a, x = [a, b]
• Evidence space (𝒜, 𝔸, mA) for a
𝒜 = [0.1, 1.0], 𝔸 = {𝒜1, 𝒜2, 𝒜3}, mA(𝒜i) = 1/3
𝒜1 = [0.5, 1.0], 𝒜2 = [0.2, 0.7], 𝒜3 = [0.1, 0.6]
• Evidence space (ℬ, 𝔹, mB) for b
ℬ = [0.0, 1.0], 𝔹 = {ℬ1, ℬ2, ℬ3, ℬ4}, mB(ℬi) = 1/4
ℬ1 = [0.6, 0.6], ℬ2 = [0.4, 0.8], ℬ3 = [0.1, 0.7], ℬ4 = [0.0, 1.0]
• Evidence space (𝒳, 𝕏, mX) for x = [a, b]
𝒳 = 𝒜 × ℬ, 𝕏 = {𝒜1 × ℬ1, 𝒜1 × ℬ2, …, 𝒜3 × ℬ4}, mX(𝒜i × ℬj) = (1/3)(1/4) = 1/12
• Probability space (𝒳, 𝕏, pX) for x = [a, b]: Uniform distribution on each rectangle 𝒜i × ℬj
weighted by 1/12

11
Evidence Theory: Example (2/3)

12
Evidence Theory: Example (3/3)

Pl(y > 0.8) = 12(1/12) = 1
Bel(y > 0.8) = 6(1/12) = 0.5
Pl(y > 1.5) = 4(1/12) = 0.33
Bel(y > 1.5) = 0(1/12) = 0.0
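A possible way to reproduce these counts is sketched below in Python: each focal rectangle 𝒜i × ℬj is mapped through f(a, b) = (a + b)^a, with the range of f over a rectangle approximated by grid evaluation. The mapping-back idea is from the preceding slide; the grid resolution is an assumption, since the slides do not specify a computational procedure.

```python
# Sketch: map the 12 focal rectangles Ai x Bj through f(a,b) = (a+b)**a and
# count which rectangles must / could produce y above a threshold.
import itertools
import numpy as np

A = [(0.5, 1.0), (0.2, 0.7), (0.1, 0.6)]              # focal elements for a, mA = 1/3 each
B = [(0.6, 0.6), (0.4, 0.8), (0.1, 0.7), (0.0, 1.0)]  # focal elements for b, mB = 1/4 each
f = lambda a, b: (a + b) ** a

def f_range(ai, bj, n=201):
    # crude approximation of [min f, max f] over the rectangle by grid evaluation
    a = np.linspace(*ai, n)[:, None]
    b = np.linspace(*bj, n)[None, :]
    y = f(a, b)
    return y.min(), y.max()

for thresh in (0.8, 1.5):
    bel = pl = 0.0
    for ai, bj in itertools.product(A, B):
        ymin, ymax = f_range(ai, bj)
        m = (1 / 3) * (1 / 4)
        if ymin > thresh:        # rectangle maps entirely into {y > thresh}
            bel += m
        if ymax > thresh:        # rectangle image intersects {y > thresh}
            pl += m
    print(thresh, round(bel, 2), round(pl, 2))   # ~0.5, 1.0 and ~0.0, 0.33
```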

13
Possibility Theory: Definition of Possibility Space

• Formal definition of a possibility theory representation of uncertainty involves two components
– A set 𝒮 that contains everything that could occur in the particular “universe” under
consideration
– A function r such that (i) 0 ≤ r(x) ≤ 1 for x ∈ 𝒮 and (ii) sup{r(x) : x ∈ 𝒮} = 1
• Doublet (𝒮, r) is called a possibility space
• Terminology
– 𝒮 called the sample space or universal set
– r referred to as a possibility distribution function
• Nature of r: Amount of “likelihood” or “credence” that can be assigned to
each element of 𝒮. Analogous to membership value for elements of a
fuzzy set.

14
Possibility Theory: Representation of Uncertainty

• Representation of uncertainty
– Possibility
– Necessity

• Possibility:
– Definition: Pos(ℰ) = sup{r(x) : x ∈ ℰ}
– Concept: Measure of amount of information that does not refute the proposition that ℰ
contains the “correct” value for x.

• Necessity:
– Definition: Nec(ℰ) = 1 − Pos(ℰC) = 1 − sup{r(x) : x ∈ ℰC}
– Concept: Measure of amount of uncontradicted information that supports the
proposition that ℰ contains the “correct” value for x.
15
Possibility Theory: Simple Example

r(x) = −1/5 + x/5 for 1 ≤ x ≤ 6, r(x) = 5/2 − x/4 for 6 ≤ x ≤ 10

𝒮 = [1,10], r(x) defined above

Pos([5,10]) = sup{r(x) : x ∈ [5,10]} = 1
Nec([5,10]) = 1 − sup{r(x) : x ∈ [1,5)} = 1 − 4/5 = 1/5
Pos([7,10]) = sup{r(x) : x ∈ [7,10]} = 3/4
Nec([7,10]) = 1 − sup{r(x) : x ∈ [1,7)} = 1 − 1 = 0
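These values can be checked numerically. The Python sketch below approximates the suprema on a dense grid over the sample space; the grid itself is an assumption, not part of the slides.

```python
# Sketch: Pos and Nec for the triangular possibility distribution on [1, 10],
# with suprema approximated on a dense grid over the sample space.
import numpy as np

def r(x):
    x = np.asarray(x, dtype=float)
    return np.where(x <= 6.0, (x - 1.0) / 5.0, (10.0 - x) / 4.0)

xs = np.linspace(1.0, 10.0, 100001)           # grid over the sample space [1, 10]

def pos(lo, hi):                              # Pos([lo, hi]) = sup of r over the set
    inside = (xs >= lo) & (xs <= hi)
    return r(xs[inside]).max()

def nec(lo, hi):                              # Nec = 1 - Pos of the complement
    outside = ~((xs >= lo) & (xs <= hi))
    return 1.0 - r(xs[outside]).max()

print(pos(5, 10), nec(5, 10))                 # ~1.0 and ~0.2
print(pos(7, 10), nec(7, 10))                 # ~0.75 and ~0.0
```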
16
Possibility Theory: Properties

Nec( ) + Pos ( C ) = 1
Nec( ) + Nec( C ) ≤ 1
Pos ( ) + Pos ( C ) ≥ 1
Nec( ) ≤ Pos ( )

Contrast with Probability

p ( ) + p ( C ) = 1

17
Possibility Theory: Cumulative Representation
• Cumulative necessity function (CNF)
• Cumulative possibility function (CPoF)
Analogous to a CDF: plots of the necessity and possibility of being less than specified values.

• Complementary cumulative necessity function (CCNF)
• Complementary cumulative possibility function (CCPoF)
Analogous to a CCDF: plots of the necessity and possibility of being greater than specified values.

Figure: CNF, CCNF, CPoF and CCPoF for a variable v with values from the interval [1, 10] and a possibility
distribution function rv defined as follows: rv(v) = i/5 for i = 1, 2, 3, 4, 5 and i ≤ v < i+1, and
rv(v) = (10 − i)/4 for i = 6, 7, 8, 9 and i ≤ v < i+1, with v ≤ i+1 used instead of v < i+1 for i = 9.
18
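A sketch, under the assumption that suprema are again approximated on a dense grid, of how the four curves can be evaluated for this step-shaped rv; the evaluation points are illustrative.

```python
# Sketch: CPoF/CNF/CCPoF/CCNF values for the step possibility distribution rv in the
# figure caption, with suprema approximated on a dense grid over [1, 10].
import numpy as np

def rv(v):
    v = np.asarray(v, dtype=float)
    i = np.clip(np.floor(v), 1, 9)                     # index of the step containing v
    return np.where(i <= 5, i / 5.0, (10.0 - i) / 4.0)

vs = np.linspace(1.0, 10.0, 90001)

def cpof(y):  return rv(vs[vs <= y]).max()             # Pos(v <= y)
def cnf(y):   return 1.0 - rv(vs[vs > y]).max()        # Nec(v <= y) = 1 - Pos(v > y)
def ccpof(y): return rv(vs[vs > y]).max()              # Pos(v > y)
def ccnf(y):  return 1.0 - rv(vs[vs <= y]).max()       # Nec(v > y) = 1 - Pos(v <= y)

for y in (3.5, 6.5, 8.5):
    print(y, cnf(y), cpof(y), ccnf(y), ccpof(y))
```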
Possibility Theory: Vector-valued Quantities

• x1, x2, …, xn real-valued with possibility spaces (𝒮i, ri), i = 1, 2, …, n
• Possibility space (𝒮, r) for x = [x1, x2, …, xn]
• 𝒮 = 𝒮1 × 𝒮2 × … × 𝒮n
• r(x) = min{r1(x1), r2(x2), …, rn(xn)} for x = [x1, x2, …, xn]

• Necessity, possibility defined same as in one variable case

19
Possibility Theory: Function with Uncertain Arguments

• Function f(x) with possibility space (𝒳, rX) for x
• Resultant possibility space (𝒴, rY) for y = f(x)
• 𝒴 = {y : y = f(x), x ∈ 𝒳}
• rY(y) = sup{rX(x) : y = f(x), x ∈ 𝒳}
• In concept, necessity and possibility defined from 𝒴 and rY
• In computational practice, necessity and possibility obtained by mapping back to
possibility space (𝒳, rX) for x
– NecY(ℰ) = NecX({x : f(x) ∈ ℰ})
– PosY(ℰ) = PosX({x : f(x) ∈ ℰ})
• Cumulative and complementary cumulative results for y ∈ 𝒴
– CNF: [y, NecX({x : f(x) ≤ y})], CPoF: [y, PosX({x : f(x) ≤ y})]
– CCNF: [y, NecX({x : y < f(x)})], CCPoF: [y, PosX({x : y < f(x)})]

20
Possibility Theory: Example (1/2)

• Function y = f(x) = f(a, b) = (a + b)^a, x = [a, b]

• Possibility space (𝒜, rA) for a
𝒜 = [0.1, 1.0], 𝒜1 = [0.5, 1.0], 𝒜2 = [0.2, 0.7], 𝒜3 = [0.1, 0.6]
rA(a) = [δ1(a) + δ2(a) + δ3(a)]/3, where δi(a) = 1 if a ∈ 𝒜i and δi(a) = 0 otherwise

• Possibility space (ℬ, rB) for b
ℬ = [0.0, 1.0], ℬ1 = [0.6, 0.6], ℬ2 = [0.4, 0.8], ℬ3 = [0.1, 0.7], ℬ4 = [0.0, 1.0]
rB(b) = [δ1(b) + δ2(b) + δ3(b) + δ4(b)]/4, where δi(b) = 1 if b ∈ ℬi and δi(b) = 0 otherwise

• Possibility space (𝒳, rX) for x = [a, b]
𝒳 = 𝒜 × ℬ, rX([a, b]) = min{rA(a), rB(b)}

• Probability space (𝒳, 𝕏, pX) for x = [a, b]: Uniform distribution on each rectangle 𝒜i × ℬj
weighted by 1/12

21
Possibility Theory: Example (2/2)

Figures: Values of the possibility distribution function rX for the possibility space (𝒳, rX)
(e.g., rX([a, b]) = 1/2 for 0.2 ≤ a ≤ 0.7 and 0.1 ≤ b ≤ 0.4); estimated CCNF, CCDF and CCPoF
for y = f(a, b) = (a + b)^a.

Pos(y > 0.8) = 1
Nec(y > 0.8) = 1 − Pos(y ≤ 0.8) = 1.0 − 0.5 = 0.5
Pos(y > 1.5) = 0.33
Nec(y > 1.5) = 1 − Pos(y ≤ 1.5) = 1.0 − 1.0 = 0.0
22
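The following Python sketch reproduces these four values by mapping back to the possibility space for x: rX = min(rA, rB) is evaluated on an (a, b) grid that includes the interval endpoints, and suprema are taken over the grid. The grid-based approximation is an assumption; the slides do not prescribe this procedure.

```python
# Sketch: possibility/necessity for y = f(a,b) = (a+b)**a obtained by mapping back to
# the possibility space for x = [a, b]; suprema approximated on a dense (a, b) grid.
import numpy as np

A = [(0.5, 1.0), (0.2, 0.7), (0.1, 0.6)]
B = [(0.6, 0.6), (0.4, 0.8), (0.1, 0.7), (0.0, 1.0)]

def r_from(intervals, v):
    # possibility distribution built from equally weighted intervals (delta counting)
    v = np.asarray(v)
    hits = sum((lo <= v) & (v <= hi) for lo, hi in intervals)
    return hits / len(intervals)

# grids over the sample spaces, forced to contain the interval endpoints exactly
a = np.union1d(np.linspace(0.1, 1.0, 901), np.ravel(A))[:, None]
b = np.union1d(np.linspace(0.0, 1.0, 1001), np.ravel(B))[None, :]
rX = np.minimum(r_from(A, a), r_from(B, b))      # min rule for the joint distribution
y = (a + b) ** a

for c in (0.8, 1.5):
    pos_gt = rX[y > c].max()                     # Pos(y > c)
    nec_gt = 1.0 - rX[y <= c].max()              # Nec(y > c) = 1 - Pos(y <= c)
    print(c, round(pos_gt, 2), round(nec_gt, 2)) # ~1.0, 0.5 and ~0.33, 0.0
```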
Interval Analysis

• Define range of values for x

• Determine resultant range of values for y = f(x)

• No uncertainty structure imposed on x, only a range of values

• Different in spirit from the probability theory, evidence theory and possibility
theory representations of uncertainty

• Corresponds to “degenerate” evidence theory and possibility theory
representations of uncertainty
– Evidence theory: sample space 𝒮 has a BPA of 1
– Possibility theory: possibility distribution function identically equal to 1
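For the running example y = (a + b)^a with a ∈ [0.1, 1.0] and b ∈ [0.0, 1.0], interval analysis reduces to estimating the range of y. A grid-search sketch (the grid search is an assumption; the slides do not prescribe a propagation method):

```python
# Sketch: interval analysis for y = (a+b)**a with a in [0.1, 1.0] and b in [0.0, 1.0];
# the range of y is estimated by grid search over the input box.
import numpy as np

a = np.linspace(0.1, 1.0, 901)[:, None]
b = np.linspace(0.0, 1.0, 1001)[None, :]
y = (a + b) ** a

print(y.min(), y.max())   # approximate interval of possible y values, roughly [0.69, 2.0]
```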
23
Notional Example: Only Epistemic Uncertainty (1/4)

• EN2: Model f(t|eM) = Q(t|eM) for a closed electrical circuit

L d²Q/dt² + R dQ/dt + Q/C = E0 exp(−λt), Q(0) = 0, dQ/dt(0) = 0

where
Q(t) = electrical charge (coulombs) at time t (s),
L = inductance (henrys),
R = resistance (ohms),
C = capacitance (farads),
E0 exp(−λt) = electromotive force (volts),
dQ/dt = current (amperes).
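A minimal sketch of solving this model numerically for a single epistemic vector eM. The parameter values are illustrative midpoints of the intervals introduced on the next slide, and the use of scipy's solve_ivp is an assumption rather than the slides' method.

```python
# Sketch: numerical solution of the circuit model Q(t|eM) for one epistemic
# parameter vector eM = [L, R, C, E0, lam]; values are illustrative midpoints.
import numpy as np
from scipy.integrate import solve_ivp

def circuit_rhs(t, y, L, R, C, E0, lam):
    Q, dQ = y                                    # state: charge and current
    d2Q = (E0 * np.exp(-lam * t) - R * dQ - Q / C) / L
    return [dQ, d2Q]

eM = (1.0, 75.0, 1.0e-4, 1000.0, 0.6)            # [L, R, C, E0, lambda]
t = np.linspace(0.0, 0.2, 401)
sol = solve_ivp(circuit_rhs, (0.0, 0.2), [0.0, 0.0], t_eval=t, args=eM, max_step=1e-3)
Q = sol.y[0]                                     # Q(t|eM), coulombs
print(Q[-1])
```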

24
Notional Example: Only Epistemic Uncertainty (2/4)

• EN3: Probability space (ℰ, 𝔼, pEM) for epistemic uncertainty

eM = [eM1, eM2, eM3, eM4, eM5] = [L, R, C, E0, λ]
ℰ1 = {L: 0.8 ≤ L ≤ 1.2 henrys}, ℰ2 = {R: 50 ≤ R ≤ 100 ohms},
ℰ3 = {C: 0.9×10⁻⁴ ≤ C ≤ 1.1×10⁻⁴ farads}, ℰ4 = {E0: 900 ≤ E0 ≤ 1100 volts},
ℰ5 = {λ: 0.4 ≤ λ ≤ 0.8 s⁻¹},
ℰ = ℰ1 × ℰ2 × ℰ3 × ℰ4 × ℰ5

• Four subintervals ℰi1, ℰi2, ℰi3, ℰi4 are considered for each of the intervals ℰi = [a, b],
i = 1, 2, …, 5, defined above:
ℰi1 = [a, b − (b − a)/4], ℰi2 = [a + (b − a)/4, b],
ℰi3 = [a + (b − a)/8, b − 3(b − a)/8], ℰi4 = [a + 3(b − a)/8, b − (b − a)/8]

Figure: Illustration of the sets ℰi1, ℰi2, ℰi3 and ℰi4 with the interval [a, b] normalized to the
interval [0, 8] for representational simplicity.

• Under the assumption that the four sources that provided the intervals for an element eMi of eM
are equally credible, the density for eMi is

di(eMi) = Σj=1,…,4 δij(eMi) / [4 (max(ℰij) − min(ℰij))], with δij(eMi) = 1 if eMi ∈ ℰij and δij(eMi) = 0 otherwise

In effect, this defines dEM(e) and pEM(ℰ).
25
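Under the stated equal-credibility assumption, dEM(e) is a product of equal-weight mixtures of uniform densities, so sampling it reduces to picking a subinterval at random and then sampling uniformly within it. A Python sketch (the helper names are illustrative, not from the slides):

```python
# Sketch: sampling an epistemic vector eM = [L, R, C, E0, lam] from the density
# dEM(e) implied above: for each element, pick one of its four subintervals with
# probability 1/4 and then sample uniformly within it.
import numpy as np

rng = np.random.default_rng(0)

def subintervals(a, b):
    d = b - a
    return [(a, b - d/4), (a + d/4, b), (a + d/8, b - 3*d/8), (a + 3*d/8, b - d/8)]

ranges = [(0.8, 1.2), (50.0, 100.0), (0.9e-4, 1.1e-4), (900.0, 1100.0), (0.4, 0.8)]

def sample_eM():
    eM = []
    for a, b in ranges:
        lo, hi = subintervals(a, b)[rng.integers(4)]   # equally credible sources
        eM.append(rng.uniform(lo, hi))
    return eM

print(sample_eM())
```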
Notional Example: Only Epistemic Uncertainty (3/4)

• Evidence theory representation
– For each uncertain variable, the set ℰi of possible values is divided into subsets ℰi1, ℰi2, ℰi3, ℰi4 as
indicated on the preceding slide, with i = 1, 2, 3, 4, 5 corresponding to L, R, C, E0, λ, respectively
– 𝔼i = {ℰi1, ℰi2, ℰi3, ℰi4}
– BPA for a subset ℰ of ℰi: mi(ℰ) = 1/4 if ℰ ∈ 𝔼i, and mi(ℰ) = 0 otherwise
– (ℰi, 𝔼i, mi) evidence spaces for L, R, C, E0, λ
– Evidence space (ℰ, 𝔼, m) for eM = [L, R, C, E0, λ] results as previously described
• Possibility space representation
– Sets ℰi1, ℰi2, ℰi3, ℰi4 same as above for i = 1, 2, 3, 4, 5
– For e ∈ ℰi, ri(e) = Σj=1,…,4 δij(e)/4, with δij(e) = 1 if e ∈ ℰij and δij(e) = 0 otherwise
– (ℰi, ri) possibility spaces for L, R, C, E0, λ
– Possibility space (ℰ, r) for eM = [L, R, C, E0, λ] results as previously described
• Interval analysis
– ℰ is the set of possible values for eM = [L, R, C, E0, λ]
– No uncertainty structure assumed for ℰ
26
Notional Example: Only Epistemic Uncertainty (4/4)

• Uncertainty propagated with a random sample of size 10⁵

Figure: Q(t|eM) versus t: Time (s); 50 of the 10⁵ results shown.

27
Notional Example: Aleatory and Epistemic Uncertainty (1/3)

• Overview: Mechanical system receiving perturbations whose occurrence
follows a stationary Poisson process, with each perturbation decaying
exponentially with time after its occurrence
• EN1: Probability space (𝒜, 𝔸, pA) for aleatory uncertainty for the time interval [0, 10 s]
– a = [n, t1, A1, t2, A2, …, tn, An]
where
n = number of perturbations in the time interval [0, 10 s]
ti = occurrence time for perturbation i, with t1 < t2 < … < tn
Ai = amplitude of perturbation i
– Occurrence times characterized by a Poisson process with rate λ
– Amplitude A has a triangular distribution on [a, b] with mode m
– 𝒜 = {a: a = [n, t1, A1, t2, A2, …, tn, An]}
– λ and the distribution for A in effect define (𝒜, 𝔸, pA) and dA(a)
– λ, a, m, b epistemically uncertain → dA(a|eA), eA = [λ, a, m, b]

28
Notional Example: Aleatory and Epistemic Uncertainty (2/3)

• EN2: Model A(t|a,r) for accumulated perturbations at time t

A(t|a,r) = 0 for t < t1
A(t|a,r) = Σk=1,…,ñ Ak exp[−r(t − tk)] for t1 ≤ t, where ñ = max{k : tk ≤ t}

with r = epistemically uncertain perturbation decay rate (i.e., eM = [r])
• EN3: Probability space (ℰ, 𝔼, pE) for epistemic uncertainty
e = [eA, eM] = [λ, a, m, b, r], eA = [λ, a, m, b], eM = [r]
ℰ1 = {λ: 0.5 ≤ λ ≤ 1.5 s⁻¹}, ℰ2 = {a: 1.0 ≤ a ≤ 2.0 kg m/s²},
ℰ3 = {m: 2.0 ≤ m ≤ 4.0 kg m/s²}, ℰ4 = {b: 4.0 ≤ b ≤ 5.0 kg m/s²},
ℰ5 = {r: 0.2 ≤ r ≤ 1.2 s⁻¹}, ℰ = ℰ1 × ℰ2 × ℰ3 × ℰ4 × ℰ5

• Probability space (ℰ, 𝔼, pE), evidence space (ℰ, 𝔼, mE), possibility space (ℰ, rE) and the
set ℰ for interval analysis defined in the same manner as in the preceding example

29
Notional Example: Aleatory and Epistemic Uncertainty (3/3)
• Aleatory and epistemic uncertainty
• Random sample of size 10⁵ from ℰ = {e: e = [λ, a, m, b, r]}
• Random samples of size 10,000 from 𝒜 = {a: a = [n, t1, A1, t2, A2, …, tn, An]}
conditional on each ei, as sketched below
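A reduced-size Python sketch of this nested (aleatory within epistemic) sampling. For simplicity the epistemic intervals are sampled uniformly here rather than from the four-subinterval mixtures, and the sample sizes are cut down so the sketch runs quickly; neither choice is from the slides.

```python
# Sketch: nested sampling for A(10|a,r) under epistemic uncertainty in [lam, a, m, b, r].
import numpy as np

rng = np.random.default_rng(1)
T = 10.0                                             # time interval [0, 10 s]

def sample_aleatory(lam, a, m, b, n_samples):
    """Perturbation histories (times, amplitudes) for one epistemic vector."""
    out = []
    for _ in range(n_samples):
        n = rng.poisson(lam * T)                     # number of perturbations in [0, 10 s]
        t = np.sort(rng.uniform(0.0, T, n))          # occurrence times
        A = rng.triangular(a, m, b, n)               # amplitudes: triangular on [a, b], mode m
        out.append((t, A))
    return out

def accumulated(t_obs, t, A, r):
    k = t <= t_obs                                   # perturbations that have occurred by t_obs
    return np.sum(A[k] * np.exp(-r * (t_obs - t[k])))

n_epistemic, n_aleatory = 50, 1000                   # reduced from 10^5 and 10,000
ccdfs = []
for _ in range(n_epistemic):
    lam = rng.uniform(0.5, 1.5); a = rng.uniform(1.0, 2.0)
    m = rng.uniform(2.0, 4.0);   b = rng.uniform(4.0, 5.0)
    r = rng.uniform(0.2, 1.2)
    vals = [accumulated(T, t, A, r) for t, A in sample_aleatory(lam, a, m, b, n_aleatory)]
    ccdfs.append(np.sort(vals))                      # sorted outcomes from which one CCDF is plotted
print(len(ccdfs), ccdfs[0][-1])
```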

Figure (Frame 3.12a): pA[A(10|a,r) < A | eA] versus A: A(10|a,r); 50 of the 10⁵ CCDFs shown.

30
References

• Background references
1. Shafer G. A Mathematical Theory of Evidence. Princeton, NJ: Princeton University Press, 1976.
2. Klir GJ, Wierman MJ. Uncertainty-Based Information. New York, NY: Physica-Verlag, 1999.
3. Klir GJ. Uncertainty and Information: Foundations of Generalized Information Theory. New York, NY: Wiley-Interscience, 2006.
4. Dubois D, Prade H. Possibility Theory: An Approach to Computerized Processing of Uncertainty. New York, NY: Plenum, 1988.
5. Ross TJ. Fuzzy Logic with Engineering Applications, 2nd edn. New York, NY: Wiley, 2004.
6. Ross TJ, Booker JM, Parkinson WJ (eds.). Fuzzy Logic and Probability Applications: Bridging the Gap. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2002.

• Attached references
7. Helton JC, Johnson JD, Oberkampf WL. An Exploration of Alternative Approaches to the Representation of Uncertainty in Model Predictions. Reliability Engineering and System Safety 2004; 85(1-3):39-71.
8. Helton JC, Johnson JD, Oberkampf WL, Storlie CB. A Sampling-Based Computational Strategy for the Representation of Epistemic Uncertainty in Model Predictions with Evidence Theory. Computer Methods in Applied Mechanics and Engineering 2007; 196(37-40):3980-3998.
31
