I. INTRODUCTION
II. PROBABILITIES AND BAYES FUSION
In this section, we recall the definition of conditional probability [20], [21] and present the principle and the properties of the Bayes fusion rule.
A. Conditional probabilities
Let us consider two random events X and Z. The conditional probability mass functions (pmfs) P(X|Z) and P(Z|X) are defined (assuming P(X) > 0 and P(Z) > 0) by [20]:
P(X|Z) = P(X ∩ Z)/P(Z) and P(Z|X) = P(X ∩ Z)/P(X)   (1)
where, by the total probability theorem, P(Z) is obtained from:
P(Z) = Σ_{i=1}^N P(Z|X = x_i)P(X = x_i)   (3)
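The conditional-pmf definitions above can be exercised numerically. A minimal Python sketch (illustrative numbers, not from the paper; `bayes_posterior` is a hypothetical helper name):

```python
def bayes_posterior(prior, likelihood):
    """prior[i] = P(X = x_i); likelihood[i] = P(Z | X = x_i).
    Returns the posterior pmf P(X = x_i | Z)."""
    # total probability theorem: P(Z) = sum_i P(Z|x_i) P(x_i)
    p_z = sum(p * l for p, l in zip(prior, likelihood))
    # Bayes' rule applied term by term, normalized by P(Z)
    return [p * l / p_z for p, l in zip(prior, likelihood)]

posterior = bayes_posterior([0.2, 0.8], [0.9, 0.5])
```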
Advances and Applications of DSmT for Information Fusion. Collected Works. Volume 4
Bayes' theorem then gives the posterior pmf of X given the observation Z:
P(X = x_i|Z) = P(Z|X = x_i)P(X = x_i) / [Σ_{j=1}^N P(Z|X = x_j)P(X = x_j)]   (4)
Given two observations Z1 and Z2 assumed conditionally independent given X, one has:
P(X = x_i|Z1 ∩ Z2) = P(Z1|X = x_i)P(Z2|X = x_i)P(X = x_i) / [Σ_{j=1}^N P(Z1|X = x_j)P(Z2|X = x_j)P(X = x_j)]   (5)
Expressed with the two posterior pmfs P(X|Z1) and P(X|Z2) and the prior pmf P(X), this Bayes fusion, denoted F(P(X|Z1), P(X|Z2); P(X)), takes the form:
P(X = x_i|Z1 ∩ Z2) = [P(X = x_i|Z1)P(X = x_i|Z2)/P(X = x_i)] / [Σ_{j=1}^N P(X = x_j|Z1)P(X = x_j|Z2)/P(X = x_j)]   (9)
or in an equivalent (symmetrized) manner:
P(X = x_i|Z1 ∩ Z2) = (1/K)·[P(X = x_i|Z1)/√P(X = x_i)]·[P(X = x_i|Z2)/√P(X = x_i)]   (10)
where K is the normalization constant. K splits into a global agreement term:
a(Z1, Z2) = Σ_{i=1}^N [P(X = x_i|Z1)/√P(X = x_i)]·[P(X = x_i|Z2)/√P(X = x_i)]   (12)
and a global conflict term:
c(Z1, Z2) = Σ_{i1,i2=1; i1≠i2}^N [P(X = x_{i1}|Z1)/√P(X = x_{i1})]·[P(X = x_{i2}|Z2)/√P(X = x_{i2})]   (13)
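The two-source fusion formula can be sketched as follows (a minimal Python illustration, assuming pmfs are plain lists indexed by state; `bayes_fusion2` is a hypothetical helper name):

```python
def bayes_fusion2(p1, p2, prior):
    """Bayes fusion of two posterior pmfs p1 = P(X|Z1) and p2 = P(X|Z2)
    with a prior pmf: term-wise product divided by the prior, normalized."""
    terms = [a * b / w for a, b, w in zip(p1, p2, prior)]
    m = sum(terms)                 # normalization constant
    return [t / m for t in terms]

fused = bayes_fusion2([0.1, 0.9], [0.5, 0.5], [0.2, 0.8])
```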
Generalization to s ≥ 2 sources: the Bayes fusion of the posterior pmfs P(X|Z1), ..., P(X|Zs) with the prior pmf P(X) is given by:
P(X = x_i|Z1 ∩ Z2 ∩ ... ∩ Zs) = [Π_{k=1}^s P(X = x_i|Z_k) / P(X = x_i)] / [Σ_{j=1}^N Π_{k=1}^s P(X = x_j|Z_k) / P(X = x_j)]   (14)
The symmetrized form of Eq. (14) is:
P(X = x_i|Z1 ∩ ... ∩ Zs) = (1/K) Π_{k=1}^s [P(X = x_i|Z_k) / P(X = x_i)^{1/s}]   (16)
where K is the normalization constant.
The normalization constant of Eq. (16) splits into a global agreement term (all s sources supporting the same state) and a global conflict term (at least two sources supporting different states):
a(Z1, ..., Zs) = Σ_{i=1}^N Π_{k=1}^s [P(X = x_i|Z_k)/P(X = x_i)^{1/s}]   (17)
c(Z1, ..., Zs) = Σ_{i1,...,is=1; not all equal}^N Π_{k=1}^s [P(X = x_{ik}|Z_k)/P(X = x_{ik})^{1/s}]   (18)
In the particular case of a uniform prior pmf, the values P(X = x_i) = 1/N, i = 1, ..., N, can be simplified in Bayes fusion formulas Eq. (9) and Eq. (10). Therefore, Bayes fusion formula (9) reduces to:
P(X = x_i|Z1 ∩ Z2) = P(X = x_i|Z1)P(X = x_i|Z2) / [Σ_{j=1}^N P(X = x_j|Z1)P(X = x_j|Z2)]   (19)
When P(X) is uniform and from Eq. (19), one can redefine the global agreement and the global conflict as:
a(Z1, Z2) = Σ_{i=1}^N P(X = x_i|Z1)P(X = x_i|Z2)   (22)
c(Z1, Z2) = Σ_{i1,i2=1; i1≠i2}^N P(X = x_{i1}|Z1)P(X = x_{i2}|Z2)   (23)
so that
a(Z1, Z2) + c(Z1, Z2) = 1   (24)
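The uniform-prior agreement and conflict can be checked numerically. A short Python sketch (illustrative, not from the paper; `agreement_conflict` is a hypothetical helper name):

```python
def agreement_conflict(p1, p2):
    """Global agreement (diagonal sum) and global conflict (off-diagonal sum)
    of two posterior pmfs under a uniform prior."""
    a = sum(x * y for x, y in zip(p1, p2))
    c = sum(p1[i] * p2[j]
            for i in range(len(p1)) for j in range(len(p2)) if i != j)
    return a, c   # a + c = 1 for normalized pmfs

a, c = agreement_conflict([0.1, 0.9], [0.5, 0.5])
```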
By a direct extension, one will have:
P(X = x_i|Z1 ∩ ... ∩ Zs) = Π_{k=1}^s P(X = x_i|Z_k) / [Σ_{j=1}^N Π_{k=1}^s P(X = x_j|Z_k)]   (25)
with the global agreement and global conflict
a(Z1, ..., Zs) = Σ_{i=1}^N Π_{k=1}^s P(X = x_i|Z_k)
c(Z1, ..., Zs) = Σ_{i1,...,is=1; not all equal}^N Π_{k=1}^s P(X = x_{ik}|Z_k)
Because Σ_{i=1}^N P(X = x_i|Z_k) = 1 for every source k, one has
a(Z1, ..., Zs) + c(Z1, ..., Zs) = Π_{k=1}^s [Σ_{i=1}^N P(X = x_i|Z_k)] = 1
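The uniform-prior fusion above extends to any number of sources as a normalized product. A minimal Python sketch (illustrative; `bayes_fusion_uniform` is a hypothetical helper name):

```python
from math import prod

def bayes_fusion_uniform(pmfs):
    """Fusion of s posterior pmfs under a uniform prior: normalized
    state-wise product of the posteriors."""
    terms = [prod(vals) for vals in zip(*pmfs)]
    total = sum(terms)
    return [t / total for t in terms]

fused = bayes_fusion_uniform([[0.1, 0.9], [0.5, 0.5], [0.6, 0.4]])
```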
Consider a frame with N = 2 states and the three posterior pmfs P(X = x1|Z1) = 0.1, P(X = x2|Z1) = 0.9, P(X = x1|Z2) = 0.5, P(X = x2|Z2) = 0.5, P(X = x1|Z3) = 0.6, P(X = x2|Z3) = 0.4, with the prior pmf P(X = x1) = 0.2 and P(X = x2) = 0.8. The direct fusion of the three sources with Eq. (14) gives:
P(X = x1|Z1 ∩ Z2 ∩ Z3) = (1/m123)·(0.1 · 0.5 · 0.6)/0.2 = 0.15/0.375 = 0.4
P(X = x2|Z1 ∩ Z2 ∩ Z3) = (1/m123)·(0.9 · 0.5 · 0.4)/0.8 = 0.225/0.375 = 0.6
where the normalization constant m123 is given by:
m123 = (0.1 · 0.5 · 0.6)/0.2 + (0.9 · 0.5 · 0.4)/0.8 = 0.3750
Let us compute the fusion of P(X|Z1) with P(X|Z2) using F(P(X|Z1), P(X|Z2); P(X)). One has:
P(X = x1|Z1 ∩ Z2) = (1/m12)·(0.1 · 0.5)/0.2 ≈ 0.3077
P(X = x2|Z1 ∩ Z2) = (1/m12)·(0.9 · 0.5)/0.8 ≈ 0.6923
where the normalization constant m12 is given by:
m12 = (0.1 · 0.5)/0.2 + (0.9 · 0.5)/0.8 = 0.8125
Let us compute the fusion of P(X|Z2) with P(X|Z3) using F(P(X|Z2), P(X|Z3); P(X)). One has:
P(X = x1|Z2 ∩ Z3) = (1/m23)·(0.5 · 0.6)/0.2 ≈ 0.8571
P(X = x2|Z2 ∩ Z3) = (1/m23)·(0.5 · 0.4)/0.8 ≈ 0.1429
where the normalization constant m23 is given by:
m23 = (0.5 · 0.6)/0.2 + (0.5 · 0.4)/0.8 = 1.75
Let us compute the fusion of P(X|Z1) with P(X|Z3) using F(P(X|Z1), P(X|Z3); P(X)). One has:
P(X = x1|Z1 ∩ Z3) = (1/m13)·(0.1 · 0.6)/0.2 = 0.4
P(X = x2|Z1 ∩ Z3) = (1/m13)·(0.9 · 0.4)/0.8 = 0.6
where the normalization constant m13 is given by:
m13 = (0.1 · 0.6)/0.2 + (0.9 · 0.4)/0.8 = 0.75
Let us compute the fusion of P(X|Z1 ∩ Z2) with P(X|Z3) using F(P(X|Z1 ∩ Z2), P(X|Z3); P(X)). One has:
P(X = x1|(Z1 ∩ Z2) ∩ Z3) = (1/m(12)3)·(0.3077 · 0.6)/0.2 ≈ 0.7273
P(X = x2|(Z1 ∩ Z2) ∩ Z3) = (1/m(12)3)·(0.6923 · 0.4)/0.8 ≈ 0.2727
where the normalization constant m(12)3 is given by:
m(12)3 = (0.3077 · 0.6)/0.2 + (0.6923 · 0.4)/0.8 ≈ 1.26925
Let us compute the fusion of P(X|Z1) with P(X|Z2 ∩ Z3) using F(P(X|Z1), P(X|Z2 ∩ Z3); P(X)). One has:
P(X = x1|Z1 ∩ (Z2 ∩ Z3)) = (1/m1(23))·(0.1 · 0.8571)/0.2 ≈ 0.7273
P(X = x2|Z1 ∩ (Z2 ∩ Z3)) = (1/m1(23))·(0.9 · 0.1429)/0.8 ≈ 0.2727
where the normalization constant m1(23) is given by:
m1(23) = (0.1 · 0.8571)/0.2 + (0.9 · 0.1429)/0.8 ≈ 0.58931
Let us compute the fusion of P(X|Z1 ∩ Z3) with P(X|Z2) using F(P(X|Z1 ∩ Z3), P(X|Z2); P(X)). One has:
P(X = x1|(Z1 ∩ Z3) ∩ Z2) = (1/m(13)2)·(0.4 · 0.5)/0.2 ≈ 0.7273
P(X = x2|(Z1 ∩ Z3) ∩ Z2) = (1/m(13)2)·(0.6 · 0.5)/0.8 ≈ 0.2727
where the normalization constant m(13)2 is given by:
m(13)2 = (0.4 · 0.5)/0.2 + (0.6 · 0.5)/0.8 = 1.375
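The normalization constants of this worked example can be reproduced with a few lines of Python (an illustrative check, not part of the paper; `fuse` is a hypothetical helper name):

```python
def fuse(p1, p2, prior):
    # Bayes fusion F(P(X|Z1), P(X|Z2); P(X)): term-wise product divided by
    # the prior, then normalized; also returns the normalization constant.
    terms = [a * b / w for a, b, w in zip(p1, p2, prior)]
    m = sum(terms)
    return [t / m for t in terms], m

pZ1, pZ2, pZ3 = [0.1, 0.9], [0.5, 0.5], [0.6, 0.4]
prior = [0.2, 0.8]
post12, m12 = fuse(pZ1, pZ2, prior)   # m12 = 0.8125
post23, m23 = fuse(pZ2, pZ3, prior)   # m23 = 1.75
post13, m13 = fuse(pZ1, pZ3, prior)   # m13 = 0.75
```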
Therefore, one sees that even if in our example one has F(P(X|Z1 ∩ Z2), P(X|Z3); P(X)) = F(P(X|Z1), P(X|Z2 ∩ Z3); P(X)) = F(P(X|Z2), P(X|Z1 ∩ Z3); P(X)) because P(X|(Z1 ∩ Z2) ∩ Z3) = P(X|Z1 ∩ (Z2 ∩ Z3)) = P(X|Z2 ∩ (Z1 ∩ Z3)), the Bayes fusion rule is not associative, since:
P(X|(Z1 ∩ Z2) ∩ Z3) ≠ P(X|Z1 ∩ Z2 ∩ Z3)
P(X|Z1 ∩ (Z2 ∩ Z3)) ≠ P(X|Z1 ∩ Z2 ∩ Z3)
P(X|Z2 ∩ (Z1 ∩ Z3)) ≠ P(X|Z1 ∩ Z2 ∩ Z3)
(P4): The Bayes fusion rule is associative if and only if the prior pmf P(X) is uniform.
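The non-associativity can be demonstrated directly in Python, assuming (as in Eq. (14) above) that the direct s-source rule divides the product of posteriors by the prior once, whereas sequential pairwise fusion divides by it at every step (illustrative sketch; helper names are hypothetical):

```python
from math import prod

def fuse2(p1, p2, prior):
    # pairwise Bayes fusion with the prior
    t = [a * b / w for a, b, w in zip(p1, p2, prior)]
    m = sum(t)
    return [x / m for x in t]

def fuse_direct(pmfs, prior):
    # direct s-source rule: product of posteriors divided once by the prior
    t = [prod(vals) / w for vals, w in zip(zip(*pmfs), prior)]
    m = sum(t)
    return [x / m for x in t]

p1, p2, p3, prior = [0.1, 0.9], [0.5, 0.5], [0.6, 0.4], [0.2, 0.8]
seq = fuse2(fuse2(p1, p2, prior), p3, prior)   # sequential: ~[0.7273, 0.2727]
direct = fuse_direct([p1, p2, p3], prior)      # direct: [0.4, 0.6]
```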
Indeed, when P(X) is uniform one has P(X = x_i) = 1/N for all i, and both the direct fusion (25) and any sequential (pairwise) application of the rule yield results proportional to Π_{k=1}^s P(X = x_i|Z_k), hence equal. With a non-uniform prior, a sequential application divides by the prior once per fusion step, whereas the direct formula (14) divides by it only once, so the results differ, which establishes (P4).
Applying Bayes fusion rule given by Eq. (5), one gets for F(P(X|Z1), P(X|Z2); P(X)):
P(X = x1|Z1 ∩ Z2) = 0.2/(0.2 + 0.4) = 1/3
P(X = x2|Z1 ∩ Z2) = 0.4/(0.2 + 0.4) = 2/3   (28)
Similarly, for a second pair of sources one gets for F(P(X|Z1'), P(X|Z2'); P(X)):
P(X = x1|Z1' ∩ Z2') = 0.1/(0.1 + 0.2) = 1/3
P(X = x2|Z1' ∩ Z2') = 0.2/(0.1 + 0.2) = 2/3   (29)
Therefore, one sees that F(P(X|Z1), P(X|Z2); P(X)) = F(P(X|Z1'), P(X|Z2'); P(X)) even if the levels of global agreement (and global conflict) are different. In this particular example, one has:
a(Z1, Z2) = 0.60 ≠ a(Z1', Z2') = 0.30
c(Z1, Z2) = 1.60 ≠ c(Z1', Z2') = 2.05   (30)
In summary, different sets of sources to combine (with different levels of global agreement and global conflict) can provide exactly the same result once combined with Bayes fusion rule. Hence the different levels of global agreement and global conflict do not really matter in Bayes fusion rule. What really matters in Bayes fusion rule is only the distribution of the relative agreement factors, that is, each state's agreement term P(X = x_i|Z1)P(X = x_i|Z2)/P(X = x_i) divided by the global agreement a(Z1, Z2).
III. BELIEF FUNCTIONS AND DEMPSTER'S RULE
A. Belief functions
Let Θ be a frame of discernment of a problem under consideration. More precisely, the set Θ = {θ1, θ2, ..., θn} consists of a list of exhaustive and exclusive elements θi, i = 1, 2, ..., n. Each θi represents a possible state related to the problem we want to solve. The exhaustivity and exclusivity of the elements of Θ is referred to as Shafer's model of the frame Θ. A basic belief assignment (bba), also called a belief mass function, m(.) : 2^Θ → [0, 1], is a mapping from the power set of Θ (i.e. the set of subsets of Θ), denoted 2^Θ, to [0, 1], that verifies the following conditions [4]:
m(∅) = 0 and Σ_{A ∈ 2^Θ} m(A) = 1   (31)
From any bba m(.), the belief function Bel(.) and the plausibility function Pl(.) are defined for A ∈ 2^Θ as:
Bel(A) = Σ_{B ∈ 2^Θ; B ⊆ A} m(B)
Pl(A) = Σ_{B ∈ 2^Θ; B ∩ A ≠ ∅} m(B)   (32)
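These definitions translate directly into code. A minimal Python sketch (illustrative; `bel_pl` is a hypothetical helper, with focal elements represented as frozensets):

```python
def bel_pl(m, A):
    """Belief and plausibility of A for a bba m given as a dict
    mapping frozenset -> mass."""
    bel = sum(v for B, v in m.items() if B <= A)     # sum over B subset of A
    pl = sum(v for B, v in m.items() if B & A)       # sum over B intersecting A
    return bel, pl

m = {frozenset({"a"}): 0.4, frozenset({"a", "b"}): 0.6}
bel, pl = bel_pl(m, frozenset({"a"}))   # -> (0.4, 1.0)
```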
Dempster's rule of combination of s ≥ 2 bbas m1(.), ..., ms(.) is defined by m^DS_{12...s}(∅) = 0 and, for all A ≠ ∅ in 2^Θ:
m^DS_{12...s}(A) = [1/(1 − K_{12...s})] Σ_{X1, X2, ..., Xs ∈ 2^Θ; X1 ∩ X2 ∩ ... ∩ Xs = A} m1(X1)m2(X2)...ms(Xs)   (33)
and where the global conflict is given by:
K_{12...s} = Σ_{X1, X2, ..., Xs ∈ 2^Θ; X1 ∩ X2 ∩ ... ∩ Xs = ∅} m1(X1)m2(X2)...ms(Xs)   (35)
IV. ANALYSIS OF COMPATIBILITY OF DEMPSTER'S RULE WITH BAYES RULE
The s ≥ 2 posterior pmfs are represented by the Bayesian bbas:
m1(.) ≜ {m1(θi) = P(X = xi|Z1), i = 1, 2, ..., n}
...
ms(.) ≜ {ms(θi) = P(X = xi|Zs), i = 1, 2, ..., n}   (36)
The prior information is characterized by a given bba denoted m0(.) that can be defined either on 2^Θ, or only on Θ if we want to deal, for the needs of our analysis, with a Bayesian prior. In the latter case, if m0(.) ≜ {m0(θi) = P(X = xi), i = 1, 2, ..., n}, then m0(.) plays the same role as the prior pmf P(X) in the probabilistic framework.
When considering a non-vacuous prior m0(.) ≠ mv(.), we denote Dempster's combination of the sources symbolically as:
m^DS(.) = DS(m1(.), ..., ms(.); m0(.))
When the prior bba is vacuous, m0(.) = mv(.), then m0(.) has no impact on Dempster's fusion result, and so we denote Dempster's rule symbolically as:
m^DS(.) = DS(m1(.), ..., ms(.); mv(.)) = DS(m1(.), ..., ms(.))
A. Case 1: Uniform Bayesian prior
It is important to note that Dempster's fusion formula proposed by Shafer in [4] and recalled in Eq. (33) makes no real distinction between the nature of the sources to combine (whether they are posterior or prior information). In fact, formula (33) reduces exactly to Bayes rule given in Eq. (25) if the bbas to combine are Bayesian and if the prior information is either uniform or vacuous. Stated otherwise, the following functional equality holds in that case:
DS(m1(.), ..., ms(.); m0(.)) = F(P(X|Z1), ..., P(X|Zs); P(X))   (37)
Consider Θ = {θ1, θ2, θ3} and the two Bayesian bbas:
m1(θ1) = P(X = x1|Z1) = 0.2 and m2(θ1) = 0.5
m1(θ2) = P(X = x2|Z1) = 0.3 and m2(θ2) = 0.1
m1(θ3) = P(X = x3|Z1) = 0.5 and m2(θ3) = 0.4
With the vacuous prior m0(θ1 ∪ θ2 ∪ θ3) = 1, Dempster's rule gives:
m^DS(θ1) = [1/(1 − K12)]·m1(θ1)m2(θ1)m0(θ1 ∪ θ2 ∪ θ3) = (0.2 · 0.5 · 1)/0.33 ≈ 0.3030
m^DS(θ2) = [1/(1 − K12)]·m1(θ2)m2(θ2)m0(θ1 ∪ θ2 ∪ θ3) = (0.3 · 0.1 · 1)/0.33 ≈ 0.0909
m^DS(θ3) = [1/(1 − K12)]·m1(θ3)m2(θ3)m0(θ1 ∪ θ2 ∪ θ3) = (0.5 · 0.4 · 1)/0.33 = 0.20/0.33 ≈ 0.6061
with
K12 = 1 − m1(θ1)m2(θ1) − m1(θ2)m2(θ2) − m1(θ3)m2(θ3) = 0.67
With the uniform Bayesian prior m0(θi) = 1/3, i = 1, 2, 3, one gets:
m^DS(θ1) = [1/(1 − 0.89)]·0.2 · 0.5 · (1/3) = (0.10/3)/0.11 ≈ 0.3030
m^DS(θ2) = [1/(1 − 0.89)]·0.3 · 0.1 · (1/3) = (0.03/3)/0.11 ≈ 0.0909
m^DS(θ3) = [1/(1 − 0.89)]·0.5 · 0.4 · (1/3) = (0.20/3)/0.11 ≈ 0.6061
that is, the same result as with the vacuous prior, which illustrates the functional equality (37).
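The two computations above can be verified with a small generic implementation of Dempster's rule (an illustrative sketch, not the paper's code; `dempster` is a hypothetical helper, with focal elements as frozensets):

```python
from itertools import product

def dempster(bbas):
    """Dempster's rule for bbas given as dicts mapping frozenset -> mass.
    Returns the combined bba and the global conflict."""
    combined, conflict = {}, 0.0
    for picks in product(*(b.items() for b in bbas)):
        inter = frozenset.intersection(*(B for B, _ in picks))
        w = 1.0
        for _, v in picks:
            w *= v                       # product of the picked masses
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w
        else:
            conflict += w                # mass falling on the empty set
    return {B: v / (1.0 - conflict) for B, v in combined.items()}, conflict

t1, t2, t3 = frozenset({"t1"}), frozenset({"t2"}), frozenset({"t3"})
m1 = {t1: 0.2, t2: 0.3, t3: 0.5}
m2 = {t1: 0.5, t2: 0.1, t3: 0.4}
fused, k12 = dempster([m1, m2])          # k12 = 0.67, fused[t1] ~ 0.3030
m0 = {t1: 1 / 3, t2: 1 / 3, t3: 1 / 3}   # uniform Bayesian prior
fused_u, _ = dempster([m1, m2, m0])      # same posterior masses
```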
B. Case 2: Non-uniform Bayesian prior
Consider now the non-uniform Bayesian prior m0(θ1) = 0.6, m0(θ2) = 0.3 and m0(θ3) = 0.1. Bayes fusion rule (9) gives:
P(X = x3|Z1 ∩ Z2) = (0.5 · 0.4/0.1) / (0.2 · 0.5/0.6 + 0.3 · 0.1/0.3 + 0.5 · 0.4/0.1) = 2.0000/2.2667 ≈ 0.8824
whereas Dempster's rule applied with m0(.) as an extra source gives:
m^DS(θ3) = [1/(1 − 0.911)]·0.5 · 0.4 · 0.1 = 0.020/0.089 ≈ 0.2247
Therefore, one has in general:
DS(m1(.), ..., ms(.); m0(.)) ≠ F(P(X|Z1), ..., P(X|Zs); P(X))   (38)
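The discrepancy of Eq. (38) is easy to reproduce numerically (illustrative sketch; `bayes_fuse` and `ds_bayesian` are hypothetical helper names):

```python
def bayes_fuse(p1, p2, prior):
    # Bayes fusion: term-wise product divided by the prior, normalized
    t = [a * b / w for a, b, w in zip(p1, p2, prior)]
    m = sum(t)
    return [x / m for x in t]

def ds_bayesian(*pmfs):
    # DS rule on Bayesian bbas: normalized term-wise product of all masses
    t = [1.0] * len(pmfs[0])
    for p in pmfs:
        t = [a * b for a, b in zip(t, p)]
    m = sum(t)
    return [x / m for x in t]

m1, m2, prior = [0.2, 0.3, 0.5], [0.5, 0.1, 0.4], [0.6, 0.3, 0.1]
bayes = bayes_fuse(m1, m2, prior)   # P(x3|Z1 n Z2) ~ 0.8824
ds = ds_bayesian(m1, m2, prior)     # m_DS(theta3) ~ 0.2247
```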
CONCLUSIONS
REFERENCES
[1] L.A. Zadeh, On the validity of Dempster's rule of combination, Memo M79/24, Univ. of California, Berkeley, CA, U.S.A., 1979.
[2] A. Dempster, Upper and lower probabilities induced by a multivalued mapping, Ann. Math. Statist., Vol. 38, pp. 325–339, 1967.
[3] A. Dempster, A generalization of Bayesian inference, J. R. Stat. Soc. B 30, pp. 205–247, 1968.
[4] G. Shafer, A mathematical theory of evidence, Princeton University Press, Princeton, NJ, U.S.A., 1976.
[5] L.A. Zadeh, Book review: A mathematical theory of evidence, The AI Magazine, Vol. 5, No. 3, pp. 81–83, 1984.
[6] L.A. Zadeh, A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination, The AI Magazine, Vol. 7, No. 2, pp. 85–90, 1986.
[7] J. Lemmer, Confidence factors, empiricism and the Dempster-Shafer theory of evidence, Proc. of UAI '85, pp. 160–176, Los Angeles, CA, U.S.A., July 10–12, 1985.
[8] J. Pearl, Do we need higher-order probabilities, and, if so, what do they mean?, Proc. of UAI '87, pp. 47–60, Seattle, WA, U.S.A., July 10–12, 1987.
[9] F. Voorbraak, On the justification of Dempster's rule of combination, Dept. of Phil., Utrecht Univ., Logic Group, Preprint Ser., No. 42, Dec. 1988.
[10] G. Shafer, Perspectives on the theory and practice of belief functions, IJAR, Vol. 4, No. 5–6, pp. 323–362, 1990.
[11] J. Pearl, Reasoning with belief functions: an analysis of compatibility, IJAR, Vol. 4, No. 5–6, pp. 363–390, 1990.
[12] J. Pearl, Rejoinder of comments on "Reasoning with belief functions: An analysis of compatibility", IJAR, Vol. 6, No. 3, pp. 425–443, May 1992.
[13] G.M. Provan, The validity of Dempster-Shafer belief functions, IJAR, Vol. 6, No. 3, pp. 389–399, May 1992.
[14] P. Wang, A defect in Dempster-Shafer theory, Proc. of UAI '94, pp. 560–566, Seattle, WA, U.S.A., July 29–31, 1994.
[15] A. Gelman, The boxer, the wrestler, and the coin flip: a paradox of robust Bayesian inference and belief functions, American Statistician, Vol. 60, No. 2, pp. 146–150, 2006.
[16] F. Smarandache, J. Dezert (Editors), Applications and advances of DSmT for information fusion, Vol. 3, ARP, U.S.A., 2009. http://fs.gallup.unm.edu/DSmT.htm
[17] J. Dezert, P. Wang, A. Tchamova, On the validity of Dempster-Shafer theory, Proc. of Fusion 2012 Int. Conf., Singapore, July 9–12, 2012.
[18] A. Tchamova, J. Dezert, On the behavior of Dempster's rule of combination and the foundations of Dempster-Shafer theory (best paper award), Proc. of 6th IEEE Int. Conf. on Intelligent Systems IS '12, Sofia, Bulgaria, Sept. 6–8, 2012.
[19] R.P. Mahler, Statistical multisource-multitarget information fusion, Chapter 4, Artech House, 2007.
[20] X.R. Li, Probability, random signals and statistics, CRC Press, 1999.
[21] P.E. Pfeiffer, Applied probability, Connexions Web site, Aug. 31, 2009. http://cnx.org/content/col10708/1.6/
[22] G. Shafer, Non-additive probabilities in the work of Bernoulli and Lambert, in Archive for History of Exact Sciences, C. Truesdell (Ed.), Springer-Verlag, Berlin, Vol. 19, No. 4, pp. 309–370, 1978.
[23] R.P. Mahler, Using a priori evidence to customize Dempster-Shafer theory, Proc. of 6th Nat. Symp. on Sensor Fusion, Vol. 1, pp. 331–345, Orlando, FL, U.S.A., April 13–15, 1993.
[24] R.P. Mahler, Classification when a priori evidence is ambiguous, Proc. SPIE Conf. on Opt. Eng. in Aerospace Sensing (Automatic Object Recognition IV), pp. 296–304, Orlando, FL, U.S.A., April 4–8, 1994.
[25] D. Fixsen, R.P. Mahler, The modified Dempster-Shafer approach to classification, IEEE Trans. on SMC, Part A, Vol. 27, No. 1, pp. 96–104, 1997.
I. INTRODUCTION
(∩ and c(.) are respectively the set intersection and complement operators.)
II. BACKGROUND ON BELIEF FUNCTIONS
B. Fusion rules
The main information fusion problem in the belief function frameworks (DST or DSmT) is how to combine efficiently several distinct sources of evidence represented by s ≥ 2 bbas m1(.), m2(.), ..., ms(.) defined on G^Θ. Many rules have been proposed for such a task (see [2], Vol. 2, for a detailed list of fusion rules), and we focus here on the following ones: 1) the Averaging Rule, because it is the simplest one and it is used to empirically estimate probabilities in random experiments; 2) the DS rule, because it was historically proposed in DST; and 3) the PCR5 and PCR6 rules, because they were proposed in DSmT and have been shown to provide better results than the DS rule in all applications where they have been tested so far. We just briefly recall how these rules are mathematically defined.
Averaging fusion rule m^Average_{1,2,...,s}(.)
For any X in G^Θ, m^Average_{1,2,...,s}(X) is defined by
m^Average_{1,2,...,s}(X) = (1/s) Σ_{i=1}^s m_i(X)   (2)
Note that the vacuous bba mv(Θ) = 1 is not a neutral element for this rule. The Averaging Rule is commutative but it is not associative, because in general
m^Average_{1,2,3}(X) = (1/3)[m1(X) + m2(X) + m3(X)]
is different from
m^Average_{(1,2),3}(X) = (1/2)[(m1(X) + m2(X))/2 + m3(X)]
which is also different from
m^Average_{1,(2,3)}(X) = (1/2)[m1(X) + (m2(X) + m3(X))/2]
and also from
m^Average_{(1,3),2}(X) = (1/2)[(m1(X) + m3(X))/2 + m2(X)]
In fact, it is easy to prove that the following recursive formula holds:
m^Average_{1,2,...,s}(X) = [(s − 1)/s]·m^Average_{1,2,...,s−1}(X) + (1/s)·m_s(X)   (3)
This simple averaging fusion rule has been used for more than two centuries for estimating empirically the probability measure in random experiments [3], [4].
Dempster-Shafer fusion rule m^DS_{1,2,...,s}(.)
In the DST framework, the fusion space G^Θ equals the power set 2^Θ because Shafer's model of the frame is assumed. The combination of s ≥ 2 distinct sources of evidence characterized by the bbas m_i(.), i = 1, 2, ..., s, is done with the DS rule as follows [1]: m^DS_{1,2,...,s}(∅) = 0 and, for all X ≠ ∅ in 2^Θ,
m^DS_{1,2,...,s}(X) ≜ (1/K_{1,2,...,s}) Σ_{X1, X2, ..., Xs ∈ 2^Θ; X1 ∩ X2 ∩ ... ∩ Xs = X} Π_{i=1}^s m_i(X_i)   (4)
where the numerator of (4) is the mass of belief on the conjunctive consensus on X, and where K_{1,2,...,s} is a normalization constant defined by
K_{1,2,...,s} = Σ_{X1, ..., Xs ∈ 2^Θ; X1 ∩ ... ∩ Xs ≠ ∅} Π_{i=1}^s m_i(X_i) = 1 − m_{1,2,...,s}(∅)
where m_{1,2,...,s}(∅) is the total conflicting mass of the conjunctive consensus.
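The averaging rule and its recursive form can be sketched in a few lines of Python (illustrative; `average_rule` and `average_recursive` are hypothetical helper names):

```python
def average_rule(bbas):
    """Averaging rule over bbas given as dicts X -> mass."""
    keys = set().union(*bbas)
    return {X: sum(b.get(X, 0.0) for b in bbas) / len(bbas) for X in keys}

def average_recursive(prev, s, m_s):
    """Recursive form: prev is the average of the first s-1 bbas."""
    keys = set(prev) | set(m_s)
    return {X: (s - 1) / s * prev.get(X, 0.0) + m_s.get(X, 0.0) / s
            for X in keys}

bbas = [{"A": 1.0}, {"B": 1.0}, {"AB": 1.0}]
avg = average_rule(bbas)                              # each gets 1/3
rec = average_recursive(average_rule(bbas[:2]), 3, bbas[2])
```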
The PCR5 fusion of the s sources is given by m^PCR5_{1,2,...,s}(∅) = 0 and, for X ≠ ∅ in G^Θ:
m^PCR5_{1,2,...,s}(X) = m_{1,2,...,s}(X) + Σ_{2≤t≤s; 1≤r1<r2<...<r_{t−1}<(r_t = s)} Σ_{X_{j2}, ..., X_{jt} ∈ G^Θ\{X}; {j2, ..., jt} ∈ P^{t−1}({1, ..., n}); X ∩ X_{j2} ∩ ... ∩ X_{jt} = ∅} Σ_{{i1, ..., is} ∈ P^s({1, ..., s})} [(Π_{k1=1}^{r1} m_{i_{k1}}(X))² · Π_{l=2}^t (Π_{k_l=r_{l−1}+1}^{r_l} m_{i_{k_l}}(X_{j_l}))] / [(Π_{k1=1}^{r1} m_{i_{k1}}(X)) + Σ_{l=2}^t (Π_{k_l=r_{l−1}+1}^{r_l} m_{i_{k_l}}(X_{j_l}))]   (5)
The PCR6 fusion of the s sources is given by m^PCR6_{1,2,...,s}(∅) = 0 and, for X ≠ ∅ in G^Θ:
m^PCR6_{1,2,...,s}(X) = m_{1,2,...,s}(X) + Σ_{i=1}^s m_i(X)² Σ_{(Y_{σ_i(1)}, ..., Y_{σ_i(s−1)}) ∈ (G^Θ)^{s−1}; Y_{σ_i(1)} ∩ ... ∩ Y_{σ_i(s−1)} ∩ X = ∅} [Π_{j=1}^{s−1} m_{σ_i(j)}(Y_{σ_i(j)})] / [m_i(X) + Σ_{j=1}^{s−1} m_{σ_i(j)}(Y_{σ_i(j)})]   (7)
where σ_i counts from 1 to s avoiding the index of expert/source i, i.e. σ_i(j) = j if j < i and σ_i(j) = j + 1 if j ≥ i.
For two sources (s = 2), PCR5 and PCR6 coincide, and for X ≠ ∅ in G^Θ:
m^PCR5_{1,2}(X) = m_{1,2}(X) + Σ_{Y ∈ G^Θ\{X}; X ∩ Y = ∅} [m1(X)² m2(Y)/(m1(X) + m2(Y)) + m2(X)² m1(Y)/(m2(X) + m1(Y))]   (8)
where m_{1,2}(X) = Σ_{X1, X2 ∈ G^Θ; X1 ∩ X2 = X} m1(X1)m2(X2) is the conjunctive consensus on X.
that is
x^PCR5_A / 0.6 = x^PCR5_B / 0.03 = 0.018/(0.6 + 0.03) ≈ 0.02857
Thus
x^PCR5_A = 0.60 × 0.02857 ≈ 0.01714
x^PCR5_B = 0.03 × 0.02857 ≈ 0.00086
whereas with PCR6, that is
x^PCR6_A / 0.6 = x^PCR6_B / (0.3 + 0.1) = 0.018/(0.6 + (0.3 + 0.1)) = 0.018
III.
[Table of bbas over the focal elements F1, F2, ..., Fn; entries lost in extraction.]
Table II. [flattened in extraction] The s binary bbas m_{u1}(.), m_{u2}(.), ..., grouped so that the first i1 of them commit all of their mass to F1, the next i2 of them to F2, ..., and the last in of them to Fn (all other masses equal zero); the last column lists the conjunctive consensus m_{1,2,...,s}(.).
Summing over the conflicting products with X_{i1}, X_{i2}, ..., X_{ik} ∈ G^Θ\{X}, (i1, i2, ..., ik) ∈ P^s({1, ..., s}) and (∩_{j=1}^k X_{ij}) ∩ X = ∅, the proportional redistribution gives:
x_{F1}/(1 + 1 + ... + 1) = x_{F2}/(1 + 1 + ... + 1) = ... = x_{Fn}/(1 + 1 + ... + 1) = m_{1,2,...,s}(∅)/(i1 + i2 + ... + in) = 1/s   (10)
(with i1, i2, ..., in terms, respectively, in the denominators).
The mass redistributed to X is equal to k/s, since there is only one possible non-null product of the form m1(X1)m2(X2)···ms(Xs) and all other products are equal to zero. Therefore, we finally get:
m^PCR6_{1,2,...,s}(X) = k/s   (11)
x_{F2}/(1 · 1 · ... · 1) = ... = x_{Fn}/(1 · 1 · ... · 1) = m_{1,2,...,s}(∅)/(1 + 1 + ... + 1) = 1/n
(with i2, ..., in factors, respectively, in the products and n terms in the last denominator).
Table III.
      m1   m2   m3   m123
A     1    0    0    0
B     0    1    0    0
A∪B   0    0    1    0
∅     0    0    0    1
(with A ∩ B = ∅, so that A ∩ B ∩ (A∪B) = ∅ and the three bbas are in total conflict)
For the bbas of Table III, PCR6 coincides with the Averaging Rule:
m^PCR6_{1,2,3}(A) = m_{1,2,3}(A) + x_A = 0 + 1/3 = 1/3
m^PCR6_{1,2,3}(B) = m_{1,2,3}(B) + y_B = 0 + 1/3 = 1/3
m^PCR6_{1,2,3}(A∪B) = m_{1,2,3}(A∪B) + z_{A∪B} = 0 + 1/3 = 1/3
Table V.
Consider now three bbas with nested focal elements: m1(A) = 1, m2(A∪B) = 1, m3(A∪B∪C) = 1. The Averaging Rule gives:
m^Average_{1,2,3}(A) = (1/3)(1 + 0 + 0) = 1/3
m^Average_{1,2,3}(A∪B) = (1/3)(0 + 1 + 0) = 1/3
m^Average_{1,2,3}(A∪B∪C) = (1/3)(0 + 0 + 1) = 1/3
Consider also the three non-binary bbas and their conjunctive consensus:
      m1    m2    m3    m123
A     0.2   0.6   0.7   0.084
B     0.8   0.4   0.3   0.096
∅     0     0     0     0.820
The total conflicting mass m_{1,2,3}(∅) = 0.820 is the sum of the six partial conflicting products 0.336, 0.144, 0.224, 0.056, 0.036 and 0.024.
Table IV.
Clearly, in the nested case the focal elements are nested and the condition on the emptiness of the intersection of all focal elements is not satisfied, because one has A ∩ (A∪B) ∩ (A∪B∪C) = A ≠ ∅, so that the theorem cannot be applied in such a case. The total conflicting mass is not 1. One can verify in this example that the PCR6 rule differs from the Averaging Rule, because one gets:
m^PCR6_{1,2,3}(A) = m_{1,2,3}(A) = 1
m^PCR6_{1,2,3}(A∪B) = m_{1,2,3}(A∪B) = 0
m^PCR6_{1,2,3}(A∪B∪C) = m_{1,2,3}(A∪B∪C) = 0
For the three non-binary bbas above, applying PCR6 yields:
m^PCR6_{1,2,3}(A) = 0.084 + Σ_{i=1}^6 x_i(A) = 0.495634
m^PCR6_{1,2,3}(B) = 0.096 + Σ_{i=1}^6 y_i(B) = 0.504366
whereas the Averaging Rule gives:
m^Average_{1,2,3}(A) = (0.2 + 0.6 + 0.7)/3 = 1.5/3 = 0.5
m^Average_{1,2,3}(B) = (0.8 + 0.4 + 0.3)/3 = 1.5/3 = 0.5
so PCR6 is close to, but different from, the averaging result. For probability estimation, one uses the empirical frequency estimate from n observations:
P(A|n(A), n) = n(A)/n   (13)
Consider also the following bbas and their conjunctive consensus:
      m1    m2    m3    m123
A     0.2   0.5   0.8   0.08
B     0.8   0.5   0.2   0.08
∅     0     0     0     0.84
For these bbas one gets
m^PCR6_{1,2,3}(A) = m^Average_{1,2,3}(A) = 0.5
m^PCR6_{1,2,3}(B) = m^Average_{1,2,3}(B) = 0.5
Table VII.
H   1 1 0 1 0 1 1 0
T   0 0 1 0 1 0 0 1
As an example of the detailed PCR6 redistribution for the non-binary bbas above, the partial conflict 0.056 is redistributed according to
x_5(A)/(0.2 + 0.7) = y_5(B)/0.4 = 0.056/(0.4 + 0.2 + 0.7)
IV. APPLICATION TO PROBABILITY ESTIMATION
By eq. (13),
P(T|{o1, o2, ..., o8}) = n(T)/n = 3/8 = (1/8)(0 + 0 + 1 + 0 + 1 + 0 + 0 + 1) = m^Average_{1,2,...,8}(T)   (by eq. (2))
Because all the bbas to combine here are binary and are in total conflict, our theorem 1 of Section III applies, and the PCR6 fusion rule in this case coincides with the averaging fusion rule. Therefore, we can use PCR6 to estimate the probabilities that the coin will land on H or T at the next toss given the series of observations. More precisely:
m^PCR6_{1,2,...,8}(H) = m^Average_{1,2,...,8}(H) = P(H|{o1, o2, ..., o8})
m^PCR6_{1,2,...,8}(T) = m^Average_{1,2,...,8}(T) = P(T|{o1, o2, ..., o8})
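The coin-toss estimate can be sketched in a few lines of Python (illustrative, not the paper's code; one binary bba per toss from Table VII):

```python
observations = ["H", "H", "T", "H", "T", "H", "H", "T"]   # Table VII
bbas = [{o: 1.0} for o in observations]   # one binary bba per toss

# averaging rule = empirical frequency; by the theorem, PCR6 of these
# totally-conflicting binary bbas yields the same values
p_h = sum(b.get("H", 0.0) for b in bbas) / len(bbas)   # 5/8 = 0.625
p_t = sum(b.get("T", 0.0) for b in bbas) / len(bbas)   # 3/8 = 0.375
```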
By contrast, with PCR5 the total conflicting mass m_{1,2,...,8}(∅) = 1 is redistributed proportionally to the products of the committed masses:
x_H/(1 · 1 · 1 · 1 · 1) = y_T/(1 · 1 · 1) = m_{1,2,...,8}(∅)/[(1 · 1 · 1 · 1 · 1) + (1 · 1 · 1)] = 1/(1 + 1) = 0.5
so that PCR5 would estimate P(H) = P(T) = 0.5, which does not match the empirical frequencies.
CONCLUSIONS AND CHALLENGE