Doctor of Philosophy
(Mathematics)
May, 2015
SUBMITTED BY
PARUL AGARWAL
M. Sc., M. Phil. (Mathematics)
ACKNOWLEDGMENTS
The precious gift of learning is a debt that is difficult to repay; only gratitude can be offered in return. Indeed, the words at my command are inadequate in form or in spirit to express my heartfelt, deep sense of unbounded gratitude and indebtedness to Dr. H. S. Nayal, Associate Professor, Department of Mathematics, Govt. Postgraduate College, Ranikhet (Almora), Uttarakhand. I feel it a golden opportunity and a proud privilege to work under the inspiring guidance and incisive suggestions of Dr. H. S. Nayal, who has been not just a guide but an angel to me. He has not only helped me in my research work but also provided benevolent guidance during the whole degree programme. I have real appreciation and regard for him for his magnanimous and cooperative attitude, painstaking efforts, peerless criticism and, especially, for helping me capitalize on my strength to work hard and acquire the confidence to deal with every difficulty.
I have the deepest respect and regard for my father, Shri R. N. Agarwal, and mother, Smt. Kusum Agarwal, for their encouragement, motivation and valuable guidance during this academic venture. I don't have words to express my gratitude and indebtedness to my brother, Mr. Saurabh Agarwal, who has always motivated and inspired me to work. From the deepest core of my heart, I express very affectionate thanks to my sister, Dr. Nitu Agarwal, whose emotional support and care enabled me to bring this work into shape. The love, affection and encouragement that I got from my family are unforgettable. They are my strength, my support and my pride.
Ranikhet
April, 2015
(Parul Agarwal)
Authoress
Contents
CHAPTER 1
Introduction
1.1 Measure
1.2 Fuzzy
1.3 Uncertainty
1.4 Fuzzy measure
1.5 Null-additive fuzzy measure
CHAPTER 2
Possibility Theory versus Probability Theory in Fuzzy Measure Theory
2.1 Concept of associative probabilities to a fuzzy measure
2.2 Shannon Entropy of fuzzy measure
2.3 Basic properties of entropy of fuzzy measure
2.4 Application of entropy of fuzzy measure
CHAPTER 3
Properties of Null-Additive and Absolute Continuity of a Fuzzy Measure
3.1 Absolute continuity of a fuzzy measure
3.2 Some results on null-additive fuzzy measure
3.3 Lebesgue decomposition type null-additive fuzzy measure
3.4 Generalization of the symmetric fuzzy measure
3.5 Properties of fuzzy measures
CHAPTER 4
Properties of Strong Regularity of Fuzzy Measure on Metric Space
4.1 Null additive fuzzy measure on metric space
CHAPTER 5
Continuous, auto-continuous and completeness of fuzzy measure
5.1 Continuous and auto-continuous of fuzzy measure
5.2 Results on completeness of fuzzy measure space
5.3 Convergence in fuzzy measure
5.4 Some other properties of fuzzy measure and convergence
BIBLIOGRAPHY
Introduction
1.1 Measure:
Measure theory was developed in successive stages during the late 19th and early 20th centuries by Borel, Lebesgue, Radon and Fréchet, among others; see [100], [18], [31]. The main applications of measures are in the foundations of the Lebesgue integral [64], in Andrey Kolmogorov's axiomatisation of probability theory, and in ergodic theory [42]. In integration theory, specifying a measure allows one to define integrals on spaces more general than subsets of Euclidean space; moreover, the integral with respect to the Lebesgue measure on Euclidean spaces is more general and has a richer theory than its predecessor, the Riemann integral (see [29]). Probability theory considers measures that assign measure 1 to the whole set and considers measurable subsets to be events whose probability is given by the measure [86]. Ergodic theory considers measures that are invariant under, or arise naturally from, a dynamical system [42].
Technically, a measure is a function that assigns a non-negative real number or +∞ to certain subsets of a set X [42]. It must assign 0 to the empty set and be countably additive. The sets on which the measure is defined must form a σ-algebra [18]; this means that countable unions, countable intersections and complements of measurable subsets are measurable.
1.1.1 Definition: Let X be a set and Σ a σ-algebra over X. A function μ from Σ to the extended real number line is called a measure if it satisfies the following properties:
(a) Non-negativity: μ(E) ≥ 0 for all E ∈ Σ;
(b) Null empty set: μ(∅) = 0;
(c) Countable additivity (or σ-additivity): for every countable collection {Eᵢ}, i ∈ I, of pairwise disjoint sets in Σ,
μ(⋃_{i∈I} Eᵢ) = Σ_{i∈I} μ(Eᵢ). (1a)
One may require that at least one set E has finite measure. Then the empty set automatically has measure zero because of countable additivity, since
μ(E) = μ(E ∪ ∅ ∪ ∅ ∪ …) = μ(E) + μ(∅) + μ(∅) + …, (1b)
which is finite if and only if the empty set has measure zero. If only the second and third conditions of the definition above are met, and μ takes on at most one of the values ±∞, then μ is called a signed measure [18].
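The countable-additivity condition can be checked mechanically on a small discrete example. The following sketch (illustrative names `mass` and `measure`, not from the thesis; exact arithmetic via fractions) builds a measure on a three-element set from point masses:

```python
from fractions import Fraction

# A measure on a finite set X, induced by non-negative point masses.
# The names `mass` and `measure` are illustrative assumptions.
mass = {"a": Fraction(1, 2), "b": Fraction(1, 3), "c": Fraction(1, 6)}

def measure(E):
    """mu(E) = sum of point masses over E; mu(empty set) = 0 by construction."""
    return sum((mass[x] for x in E), Fraction(0))

# mu of the empty set is 0
assert measure(set()) == 0
# additivity on disjoint sets: mu(E1 U E2) = mu(E1) + mu(E2)
E1, E2 = {"a"}, {"b", "c"}
assert measure(E1 | E2) == measure(E1) + measure(E2)
# monotonicity: E1 is a subset of E1 U E2, so mu(E1) <= mu(E1 U E2)
assert measure(E1) <= measure(E1 | E2)
```

On a finite set, countable additivity reduces to finite additivity over disjoint pieces, which is exactly what the assertions exercise.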
1.1.2 Properties:
(a) A measure μ is monotonic: if E₁ and E₂ are measurable sets with E₁ ⊆ E₂, then
μ(E₁) ≤ μ(E₂). (1c)
(b) A measure μ is countably subadditive: for any countable sequence E₁, E₂, E₃, … of measurable sets (not necessarily disjoint),
μ(⋃_{i=1}^∞ Eᵢ) ≤ Σ_{i=1}^∞ μ(Eᵢ). (1d)
(c) A measure μ is continuous from below: if E₁, E₂, E₃, … are measurable sets and Eₙ is a subset of Eₙ₊₁ for all n, then the union of the sets Eₙ is measurable, and
μ(⋃_{i=1}^∞ Eᵢ) = lim_{i→∞} μ(Eᵢ). (1e)
(d) A measure μ is continuous from above: if E₁, E₂, E₃, … are measurable sets and Eₙ₊₁ is a subset of Eₙ for all n, then the intersection of the sets Eₙ is measurable; furthermore, if at least one of the Eₙ has finite measure, then
μ(⋂_{i=1}^∞ Eᵢ) = lim_{i→∞} μ(Eᵢ). (1f)
This property is false without the assumption that at least one of the Eₙ has finite measure. For instance, for each n ∈ N let Eₙ = [n, ∞) ⊂ R; these sets all have infinite Lebesgue measure, but their intersection is empty and so has measure zero.
A measure space is called finite if μ(X) is a finite real number, and σ-finite if X can be decomposed into a countable union of measurable sets of finite measure. For example, the real numbers with the standard Lebesgue measure are σ-finite but not finite. Consider the closed intervals [k, k+1] for all integers k; there are countably many such intervals, each has measure 1, and their union is the entire real line. Alternatively, consider the real numbers with the counting measure, which assigns to each finite set of real numbers the number of points in the set. This measure space is not σ-finite, because every set with finite measure contains only finitely many points, and it would take uncountably many such sets to cover the entire real line. The σ-finite measure spaces have some very convenient properties; σ-finiteness can be compared in this respect to the Lindelöf property of topological spaces. They can also be thought of as a vague generalization of the idea that a measure space may have "uncountable measure".
1.1.3 Examples: Some important measures are listed here. The counting measure is defined by μ(S) = number of elements in S if S is finite, and +∞ otherwise. Note that the second condition of the definition of a measure is equivalent to the statement that the ideal of null sets is a σ-ideal. The Lebesgue measure on R is a complete translation-invariant measure on a σ-algebra containing the intervals in R such that μ([0, 1]) = 1; every other measure with these properties extends the Lebesgue measure.
The Hausdorff measure is a generalization of the Lebesgue measure to sets with
non-integer dimension, in particular, fractal sets. The Haar measure for a locally
compact topological group is also a generalization of the Lebesgue measure
(and also of counting measure and circular angle measure) and has similar
uniqueness properties. Circular angle measure is invariant under rotation, and
hyperbolic angle measure is invariant under squeeze mapping.
Other measures used in various theories include: Borel measure, Jordan
measure, Ergodic measure, Euler measure, Gaussian measure, Baire measure,
Radon measure, Young measure and so on by [31], [79].
1.1.4. Generalizations: For certain purposes, it is useful to have a measure whose values are not restricted to the non-negative reals or infinity. For instance, a countably additive set function with values in the (signed) real numbers is called a signed measure, while such a function with values in the complex numbers is called a complex measure. Measures that take values in Banach spaces have been studied extensively [42]. A measure that takes values in the set of self-adjoint projections on a Hilbert space is called a projection-valued measure; these are used in functional analysis for the spectral theorem. When it is necessary to distinguish the usual measures, which take non-negative values, from generalizations, the term positive measure is used. Positive measures are closed under conical combination but not general linear combination, while signed measures are the linear closure of positive measures [100].
Another generalization is the finitely additive measure, sometimes called a content. This is the same as a measure except that instead of requiring countable additivity we require only finite additivity. Finitely additive measures are connected to notions such as Banach limits, the dual of L∞ and the Stone–Čech compactification [9]. All these are linked in one way or another to the axiom of choice.
1.2. Fuzzy: A fuzzy subset A of X is characterized by a membership function A: X → [0, 1]. The intersection A ∩ B and union A ∪ B of fuzzy subsets are defined as follows:
(A ∩ B)(x) = min (A(x), B(x)), (A ∪ B)(x) = max (A(x), B(x)).
Instead of min and max one can use a t-norm and t-conorm, respectively [24]; for example, min(a, b) can be replaced by the multiplication ab. A straightforward fuzzification is usually based on the min and max operations, because in this case more properties of traditional mathematics can be extended to the fuzzy case.
A very important generalization principle used in the fuzzification of algebraic operations is the closure property. Let * be a binary operation on X. The closure property for a fuzzy subset A of X is that, for all x, y ∈ X,
A(x * y) ≥ min (A(x), A(y)).
1.3.
Uncertainty:
Fuzzy concepts can generate uncertainty because they are imprecise (especially if they refer to a process in motion, or a process of transformation where something is in the process of turning into something else). In that case they do not provide a clear orientation for action or decision-making; reducing fuzziness, perhaps by applying fuzzy logic, would generate more certainty [75]. However, a fuzzy concept may indeed provide more security, because it provides a meaning for something when an exact concept is unavailable.
1.3.1. Nonspecificity: The first type of uncertainty is nonspecificity, which is connected with sets of possible alternatives. The amount of uncertainty associated with a finite set A of possible alternatives can be measured by
U(A) = c log_b |A|, (1g)
where b and c are positive constants with b > 1. If b = 2 and c = 1, the measurement is in bits, and (1g) takes the simple form
U(A) = log₂ |A|. (1h)
One bit of uncertainty is equivalent to the total uncertainty regarding the truth or falsity of one proposition. The set function U defined by equation (1h) is called a Hartley function. Its uniqueness as a measure of uncertainty associated with sets of alternatives can also be proven axiomatically.
A natural generalization of the Hartley function from classical set theory to fuzzy set theory was proposed in the early eighties under the name U-uncertainty. For any nonempty fuzzy set A defined on a finite universal set X, the generalized Hartley function [39] has the form
U(A) = (1 / h(A)) ∫₀^{h(A)} log₂ |ᵅA| dα, (1i)
where |ᵅA| denotes the cardinality of the α-cut of A and h(A) denotes the height of A; the U-uncertainty of a fuzzy set is thus a weighted average of the Hartley uncertainties of its α-cuts. Fuzzy sets that are equal when normalized have the same U-uncertainty.
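For finite fuzzy sets the integral in the generalized Hartley function reduces to a sum over the distinct membership levels. The sketch below, with illustrative helper names, computes both the crisp Hartley measure and this finite-level form of the U-uncertainty:

```python
import math

# Hartley nonspecificity of a crisp set with n alternatives: U = log2 n
def hartley(n):
    return math.log2(n)

# U-uncertainty of a finite fuzzy set, given as {element: membership}.
# Sketch under the assumption that the integral over alpha reduces to a
# finite sum over the distinct membership levels.
def u_uncertainty(A):
    h = max(A.values())                       # height of A
    levels = sorted(set(A.values()))          # distinct alpha levels
    total, prev = 0.0, 0.0
    for alpha in levels:
        card = sum(1 for m in A.values() if m >= alpha)   # |alpha-cut|
        total += (alpha - prev) * math.log2(card)
        prev = alpha
    return total / h

assert hartley(8) == 3.0                      # 8 alternatives -> 3 bits
# A crisp set represented as a fuzzy set gives back the Hartley measure:
crisp = {"x1": 1.0, "x2": 1.0, "x3": 1.0, "x4": 1.0}
assert abs(u_uncertainty(crisp) - 2.0) < 1e-12
```

The last assertion reflects the statement above: on a crisp set (all memberships 1) the U-uncertainty collapses to log₂ |A|.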
1.3.2. Fuzziness (or Vagueness): The second type of uncertainty that involves fuzzy sets (but not crisp sets) is fuzziness (or vagueness). In general, a measure of fuzziness is a function f: 𝒫̃(X) → R, where 𝒫̃(X) denotes the set of all fuzzy subsets of X (the fuzzy power set). For each fuzzy set A, this function assigns a nonnegative real number f(A) that expresses the degree to which the boundary of A is not sharp [37].
One way is to measure the fuzziness of any set A by a metric distance between its membership grade function and the membership grade function (or characteristic function) of the nearest crisp set [60]. Even when committing to this conception of measuring fuzziness, the measurement is not unique; to make it unique, we have to choose a suitable distance function.
Another way of measuring fuzziness, which seems more practical as well as more general, is to view the fuzziness of a set in terms of the lack of distinction between the set and its complement. Indeed, it is precisely the lack of distinction between sets and their complements that distinguishes fuzzy sets from crisp sets [60]. The less a set differs from its complement, the fuzzier it is. Let us restrict our discussion to this view of fuzziness, which is currently predominant in the literature.
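This complement-based view of fuzziness can be made concrete. Assuming the standard complement 1 − A(x) and a Hamming-style distance (one common choice among many; the function name is illustrative), a minimal sketch is:

```python
# Degree of fuzziness via lack of distinction between a set and its
# standard complement, in Hamming-distance form. Each term vanishes for
# membership 0 or 1 and is maximal (= 1) at membership 0.5.
def fuzziness(A):
    return sum(1.0 - abs(2.0 * m - 1.0) for m in A.values())

crisp = {"x1": 0.0, "x2": 1.0, "x3": 1.0}
assert fuzziness(crisp) == 0.0      # crisp sets are not fuzzy at all
half = {"x1": 0.5, "x2": 0.5}
assert fuzziness(half) == 2.0       # A equals its complement: maximal fuzziness
```

The two assertions mirror the text: a set that coincides with its complement is maximally fuzzy, while a crisp set has fuzziness zero.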
1.3.3. Strife (or Discord): This third type of uncertainty, which is connected with conflicts among evidential claims, has been far more controversial. One proposed measure of it is the function D defined by
D(m) = − Σ_{A∈F} m(A) log₂ [ 1 − Σ_{B∈F} m(B) |B − A| / |B| ]. (1j)
The rationale for choosing this function is explained as follows. The term
Con(A) = Σ_{B∈F} m(B) |B − A| / |B| (1k)
expresses the sum of individual conflicts of evidential claims with respect to a particular set A, each weighted by the degree of evidence m(B). The function
− log₂ [1 − Con(A)],
which is employed in equation (1j), is monotonic increasing with Con(A) and, consequently, it represents the same quantity as Con(A), but on a logarithmic scale. The use of the logarithmic scale is motivated in the same way as in the case of the Shannon entropy [39].
Function D is clearly a measure of the average conflict among evidential claims within a given body of evidence. Consider, as an example, incomplete information regarding the age of a person X. Assume that the information is expressed by two evidential claims pertaining to the age of X: "X is between 15 and 17 years old" with degree m(A), where A = [15, 17], and "X is a teenager" with degree m(B), where B = [13, 19]. Clearly, the weaker second claim does not conflict with the stronger first claim. Assume now that the claims are such that B ⊂ A; in this case the situation is inverted: the claim focusing on B is not implied by the claim focusing on A and, consequently, m(B) does conflict with m(A) to a degree proportional to the number of elements in A that are not covered by B. This conflict is not captured by the function Con, since |B − A| = 0 in this case.
It follows from these observations that the total conflict of evidential claims within a body of evidence ⟨F, m⟩ with respect to a particular claim m(A) should be expressed by the function
CON(A) = Σ_{B∈F} m(B) |A − B| / |A| (1l)
rather than the function Con given by equation (1k). Replacing Con(A) with CON(A), we obtain a new function, which is better justified as a measure of conflict in evidence theory than the function D. This new function, called strife and denoted by S, is defined by
S(m) = − Σ_{A∈F} m(A) log₂ [ 1 − Σ_{B∈F} m(B) |A − B| / |A| ]. (1m)
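The discord and strife functionals can be evaluated directly for a small body of evidence. In the sketch below the focal sets follow the age example from the text, while the masses 0.6 and 0.4 are illustrative assumptions:

```python
import math

# Discord D and strife S for a finite body of evidence {F, m}; focal
# sets are frozensets, m maps each focal set to its evidence mass.
def discord(m):
    return -sum(mA * math.log2(1.0 - sum(mB * len(B - A) / len(B)
                                         for B, mB in m.items()))
                for A, mA in m.items())

def strife(m):
    return -sum(mA * math.log2(1.0 - sum(mB * len(A - B) / len(A)
                                         for B, mB in m.items()))
                for A, mA in m.items())

# Age example: A = [15, 17] (integer ages), B = [13, 19] (teenager).
A = frozenset(range(15, 18))
B = frozenset(range(13, 20))
m = {A: 0.6, B: 0.4}          # the masses 0.6 / 0.4 are illustrative

# B strictly contains A, so the broader claim is not implied by the
# narrower one and both functionals register some conflict:
assert discord(m) > 0.0
assert strife(m) > 0.0
```

A body of evidence with a single focal set has no conflicting claims, and both functionals are then zero, as expected.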
For possibility measures expressed in terms of an ordered possibility distribution 1 = r₁ ≥ r₂ ≥ … ≥ rₙ (with rₙ₊₁ = 0 by convention), strife takes the form
S(r) = Σ_{i=2}^{n} (rᵢ − rᵢ₊₁) log₂ [ i / Σ_{j=1}^{i} rⱼ ]. (1n)
1.4. Fuzzy measure: Fuzzy measure theory considers a number of special classes of measures, each characterized by a special property. Some of the measures used in this theory are plausibility and belief measures, fuzzy set membership functions and the classical probability measures. In fuzzy measure theory the conditions are precise, but the information about an element alone is insufficient to determine which special class of measure should be used [44]. The central concept of fuzzy measure theory is the fuzzy measure, which was introduced by Choquet in 1953 [44] and independently defined by Sugeno in 1974 [72] in the context of fuzzy integrals.
Consider, however, the jury members for a criminal trial who are uncertain
about the guilt or innocence of the defendant. The uncertainty in this situation
seems to be of a different type; the set of people who are guilty of the crime and
the set of innocent people are assumed to have very distinct boundaries. The
concern, therefore, is not with the degree to which the defendant is guilty, but
with the degree to which the evidence proves his membership in either the crisp
set of guilty people or the crisp set of innocent people. We assume that perfect
evidence would point to full membership in one and only one of these sets.
However, our evidence is rarely, if ever, perfect, and some uncertainty usually
prevails. In order to represent this type of uncertainty, we could assign a value
to each possible crisp set to which the element in question might belong. This
value would indicate the degree of evidence or certainty of the elements
membership in the set. Such a representation of uncertainty is known as a
fuzzy measure.
In classical, two-valued logic, we would have to distinguish "cold" from "not cold" by fixing a strict changeover point. We might decide that anything below 8 degrees Celsius is cold, and anything else is not cold. This can be rather arbitrary. Fuzzy logic lets us avoid choosing an arbitrary changeover point, essentially by allowing a whole spectrum of degrees of coldness. A set of temperatures like "hot" or "cold" is represented by a function. Given a temperature, the function returns a number representing the degree of membership in that set. This number is called a fuzzy measure.
1.4.1 Example:

Temperature in °C    Description
-273                 Cold
-40                  Cold
0                    Not quite cold
5                    On the cold side
10                   A bit cold
15                   Barely cold
100                  Not cold
1000                 Not cold
The basis of the idea of coldness may be how people use the word "cold". Perhaps 30% of people think that 10°C is cold (function value 0.3) and 90% think that 0°C is cold (function value 0.9). It may also depend on the context: in terms of the weather, "cold" means one thing; in terms of the temperature of the coolant in a nuclear reactor, "cold" may mean something else, so we would need a "cold" function appropriate to our context.
1.4.2 Definition: Given a universal set X and a nonempty family 𝒞 of subsets of X, a fuzzy measure on ⟨X, 𝒞⟩ is a function g: 𝒞 → [0, 1] that satisfies the following requirements:
[1] Boundary requirements: g(∅) = 0 and g(X) = 1;
[2] Monotonicity: if A ⊆ B, then g(A) ≤ g(B), for all A, B ∈ 𝒞;
[3] Continuity from below: for any increasing sequence A₁ ⊆ A₂ ⊆ … of sets in 𝒞, if ⋃_{i=1}^∞ Aᵢ ∈ 𝒞, then lim_{i→∞} g(Aᵢ) = g(⋃_{i=1}^∞ Aᵢ);
[4] Continuity from above: for any decreasing sequence A₁ ⊇ A₂ ⊇ … of sets in 𝒞, if ⋂_{i=1}^∞ Aᵢ ∈ 𝒞, then lim_{i→∞} g(Aᵢ) = g(⋂_{i=1}^∞ Aᵢ).
The boundary requirements [1] state that the element in question definitely does not belong to the empty set and definitely does belong to the universal set. The empty set does not contain any element, hence it cannot contain the element of our interest either; the universal set contains all elements under consideration in each particular context, therefore it must contain our element as well.
Requirement [2] states that the evidence of the membership of an element in a set must be at least as great as the evidence that the element belongs to any subset of that set. Indeed, if we know with some degree of certainty that the element belongs to a set, then our degree of certainty that it belongs to a larger set containing the former set can be greater or equal, but it cannot be smaller. Requirements [3] and [4] are clearly applicable only to an infinite universal set; they can therefore be disregarded when the universal set is finite. Fuzzy measures are usually defined on families 𝒞 with appropriate algebraic structure, such as σ-algebras or semirings of sets.
g(X) = 1. These generalizations are not desirable for our purpose. Third, fuzzy measures are generalizations of probability measures or generalizations of classical measures; the generalization is obtained by replacing the additivity requirement with the weaker requirements of monotonicity and continuity or, at least, semicontinuity [106].
1.4.3 Definition: For all A, B ⊆ X, a fuzzy measure g is called
1. additive: if g(A ∪ B) = g(A) + g(B) for all A, B with A ∩ B = ∅;
2. superadditive: if g(A ∪ B) ≥ g(A) + g(B) for all A, B with A ∩ B = ∅;
3. subadditive: if g(A ∪ B) ≤ g(A) + g(B) for all A, B with A ∩ B = ∅;
4. supermodular: if g(A ∪ B) + g(A ∩ B) ≥ g(A) + g(B);
5. submodular: if g(A ∪ B) + g(A ∩ B) ≤ g(A) + g(B);
6. symmetric: if |A| = |B| implies g(A) = g(B);
7. Boolean: if g(A) = 0 or g(A) = 1.
Since A ∩ B ⊆ A and A ∩ B ⊆ B, the monotonicity of fuzzy measures implies that every fuzzy measure g satisfies, for any three sets A, B, A ∩ B, the inequality
g(A ∩ B) ≤ min [g(A), g(B)]. (1o)
Similarly, since A ⊆ A ∪ B and B ⊆ A ∪ B, the monotonicity of fuzzy measures implies that every fuzzy measure g satisfies, for any three sets A, B, A ∪ B, the inequality
g(A ∪ B) ≥ max [g(A), g(B)]. (1p)
Understanding the properties of fuzzy measures is useful in applications. When a fuzzy measure is used to define a function such as the Sugeno integral or the Choquet integral, these properties are crucial to understanding the function's behavior. For instance, the Choquet integral with respect to an additive fuzzy measure reduces to the Lebesgue integral [47]. In discrete cases, a symmetric fuzzy measure results in the ordered weighted averaging (OWA) operator.
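The reduction of the Choquet integral to a Lebesgue-type weighted sum under an additive fuzzy measure can be verified numerically. The following sketch implements the usual discrete Choquet integral (function and variable names are illustrative):

```python
# Discrete Choquet integral of f with respect to a fuzzy measure g,
# where g maps frozensets of X to [0, 1].
def choquet(f, g):
    xs = sorted(f, key=f.get)                 # sort elements by f ascending
    total, prev = 0.0, 0.0
    for i, x in enumerate(xs):
        level = frozenset(xs[i:])             # the set {y : f(y) >= f(x)}
        total += (f[x] - prev) * g[level]
        prev = f[x]
    return total

X = ["a", "b", "c"]
# An additive g built from illustrative point values 0.2, 0.3, 0.5:
w = {"a": 0.2, "b": 0.3, "c": 0.5}
g = {}
for mask in range(8):
    S = frozenset(x for i, x in enumerate(X) if mask >> i & 1)
    g[S] = sum(w[x] for x in S)

f = {"a": 1.0, "b": 0.5, "c": 0.25}
# For an additive fuzzy measure, the Choquet integral reduces to the
# ordinary weighted (Lebesgue-type) sum:
expected = sum(w[x] * f[x] for x in X)
assert abs(choquet(f, g) - expected) < 1e-12
```

With a genuinely non-additive g the two values would differ, which is exactly what makes the Choquet integral useful for modelling interaction between criteria.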
When the universal set X has n elements, the number of values needed to specify a fuzzy measure can be quite high (2ⁿ). For this reason, in the context of multi-criteria decision analysis and other disciplines, simplification assumptions on the fuzzy measure have been introduced so that it is less computationally expensive to determine and use. For instance, when the fuzzy measure is assumed to be additive, it holds that
g(E) = Σ_{i∈E} g({i}), (1q)
and the values of the fuzzy measure can be evaluated from its values on the singletons of X. Similarly, a symmetric fuzzy measure is defined uniquely by |X| values.
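A symmetric fuzzy measure determined by |X| values yields an OWA operator under the Choquet integral. A minimal sketch, with illustrative q-values q(k) = g(A) for |A| = k:

```python
# A symmetric fuzzy measure depends only on |A|, so it is determined by
# the n values q[1], ..., q[n]; the list below is an illustrative choice.
q = [0.0, 0.5, 0.8, 1.0]          # q[k] = g(A) for |A| = k, with q[0] = 0

def owa(values):
    """Ordered weighted average induced by the symmetric measure q."""
    vs = sorted(values, reverse=True)
    # the weight attached to the k-th largest value is q[k] - q[k-1]
    return sum((q[k] - q[k - 1]) * vs[k - 1] for k in range(1, len(vs) + 1))

# With these q-values the OWA weights are (0.5, 0.3, 0.2):
assert abs(owa([1.0, 1.0, 1.0]) - 1.0) < 1e-12     # idempotent on constants
assert abs(owa([9.0, 5.0, 1.0]) - (0.5*9 + 0.3*5 + 0.2*1)) < 1e-12
```

Specifying n values instead of 2ⁿ is precisely the computational saving the text refers to.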
Two important fuzzy measures that can be used are the Sugeno λ-fuzzy measure and the decomposable measure.

1.4.4 Probability Theory:
prove that the answer is incorrect because in the light of additional information
a different answer is obtained.
4. The process should not introduce information that is not present in the
original statement of the problem.
1.4.4.1 Basic Terminologies Used in Probability Theory:
(I) Axioms of Probability: These are the axioms used to constrain the probabilities assigned to events [95]. Four axioms of probability are as follows:
1. All values of probabilities are between zero and one, i.e. 0 ≤ P(A) ≤ 1 for all A.
2. Probabilities of events that are necessarily true have a value of one, and those that are necessarily false have a value of zero, i.e. P(True) = 1 and P(False) = 0.
3. The probability of a disjunction is given by
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) for all A, B.
4. P(A ∪ B) = P(A) + P(B) for all A, B s.t. A ∩ B = ∅.
(II) Joint Probability Distribution: This is a function that specifies the probability of each state of the domain given the states of each variable in the domain. Suppose that our domain consists of three Boolean variables a₁, a₂, a₃; one state of the domain would be a₁ = 1, a₂ = 0, a₃ = 1, and the joint distribution specifies the probabilities of all such combinations of a₁, a₂, a₃.
(III) Bayes' Rule: When an agent begins to perceive data from its world, it stores this data as evidence. This evidence is used to calculate a posterior or conditional probability, which will be more accurate than the probability of an atomic event without this evidence, known as a prior or unconditional probability [95].
We take an example to illustrate Bayes' rule. The reliability of a particular skin test for tuberculosis (TB) is as follows:
If the subject has TB, then the sensitivity of the test is 0.98.
If the subject does not have TB, then the specificity of the test is 0.99.
From a large population, in which 2 in every 10,000 people have TB, a person is selected at random and given the test, which comes back positive. What is the probability that the person actually has TB?
Let us define event A as "the person has TB" and event B as "the person tests positive for TB". It is clear that the prior probability is
P(A) = 2/10,000 = 0.0002, so P(Ā) = 1 − P(A) = 0.9998.
The conditional probability P(B|A) is the probability that the person will test positive for TB given that the person has TB; this was given as 0.98. The other value we need is P(B|Ā), the probability that the person will test positive for TB given that the person does not have TB. Since a person who does not have TB will test negative 99% (given) of the time, he/she will test positive 1% of the time, so P(B|Ā) = 0.01. By Bayes' rule:
P(A|B) = P(A) P(B|A) / [P(A) P(B|A) + P(Ā) P(B|Ā)] = (0.0002 × 0.98) / (0.0002 × 0.98 + 0.9998 × 0.01) ≈ 0.0192 (i.e. 1.92%).
We might find it hard to believe that fewer than 2% of the people who test positive for TB using this test actually have the disease. Even though the sensitivity and specificity of this test are both high, the extremely low incidence of TB in the population has a tremendous effect on the test's positive predictive value, the proportion of people who test positive that actually have the disease. To see this, we might try answering the same question assuming that the incidence of TB in the population is 2 in 100 instead of 2 in 10,000.
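The arithmetic of this example is easy to verify; the sketch below recomputes both posteriors from the four inputs stated in the text:

```python
# Verifying the TB example numerically; the four inputs come from the text.
p_tb = 0.0002            # prior: 2 in 10,000
p_not_tb = 1 - p_tb      # 0.9998
sens = 0.98              # P(B | A): test positive given TB
false_pos = 0.01         # P(B | not A): 1 - specificity of 0.99

# Bayes' rule: P(A|B) = P(A) P(B|A) / [P(A) P(B|A) + P(not A) P(B|not A)]
posterior = p_tb * sens / (p_tb * sens + p_not_tb * false_pos)
assert abs(posterior - 0.0192) < 0.0001      # about 1.92%

# The same question with incidence 2 in 100 instead of 2 in 10,000:
posterior_2 = 0.02 * sens / (0.02 * sens + 0.98 * false_pos)
assert posterior_2 > 0.6                      # now roughly two thirds
```

Raising the prior from 0.0002 to 0.02 lifts the positive predictive value from under 2% to about 67%, which is the point of the exercise.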
(IV) Independence: When the result of one atomic event, A, does not affect the result of another atomic event, B, the two atomic events are known to be independent of each other, and this helps to resolve the uncertainty [13]. This relationship has the following mathematical property:
P(A, B) = P(A) · P(B).
Another mathematical implication is that
P(A|B) = P(A),
i.e. evidence about B does not alter the conditional relationship.
1.4.4.2 Disadvantages of the Probabilistic Method: Probabilities must be assigned even if no information is available, and an equal amount of probability is then assigned to all such items. Probabilities require the consideration of all available evidence, not only that from the rules currently under consideration [50]. Probabilistic methods always require prior probabilities, which are very hard to find out a priori [36]. Probability may be inappropriate, as the future is not always similar to the past. In the probabilistic method the assumption of independence of evidence is often not valid, and complex statements with conditional dependencies cannot be decomposed into independent parts. In this method the relationship between hypothesis and evidence is reduced to a single number [13]. Probability theory is an ideal tool for formalizing uncertainty in situations where class frequencies are known or where evidence is based on the outcomes of a sufficiently long series of independent random experiments [95].
1.4.5 Evidence Theory: Evidence theory is based on two dual non-additive measures, belief measures and plausibility measures, both obtained from a basic probability assignment m: 𝒫(X) → [0, 1] such that
(a) m(∅) = 0, and
(b) Σ_{A∈𝒫(X)} m(A) = 1.
The value of m(A) pertains only to the set A and makes no additional claim about any subset of A; any further evidence on the subsets of A would be expressed by another value of m.
A belief measure Bel satisfies
Bel(X₁ ∪ X₂ ∪ … ∪ Xₙ) ≥ Σᵢ Bel(Xᵢ) − Σ_{i<k} Bel(Xᵢ ∩ Xₖ) + … + (−1)ⁿ⁺¹ Bel(X₁ ∩ X₂ ∩ … ∩ Xₙ). (1r)
Due to the inequality (1r), belief measures are called superadditive. When X is infinite, the function Bel is also required to be continuous from above. For each Y ∈ 𝒫(X), Bel(Y) is interpreted as the degree of belief, based on available evidence [4], that a given element of X belongs to the set Y. The inequality (1r) implies the monotonicity requirement [2] of fuzzy measures. Let X₁ = A, X₂ = B with A ⊆ B, and let X₃ = B − A. Then X₁ ∪ X₃ = X₂ and X₁ ∩ X₃ = ∅. Applying now X₁, X₃ for n = 2 to (1r), we get
Bel(X₂) = Bel(X₁ ∪ X₃) ≥ Bel(X₁) + Bel(X₃) ≥ Bel(X₁),
since Bel(X₃) ≥ 0; hence Bel(A) ≤ Bel(B). Now let X₁ = A and X₂ = Ā. Applying (1r) for n = 2 again,
Bel(A) + Bel(Ā) ≤ Bel(A ∪ Ā) = Bel(X) = 1, i.e.
Bel(A) + Bel(Ā) ≤ 1. (1s)
A plausibility measure is a function Pl: 𝒫(X) → [0, 1] such that
(a) Pl(∅) = 0,
(b) Pl(X) = 1, and
(c) Pl(X₁ ∩ X₂ ∩ … ∩ Xₙ) ≤ Σᵢ Pl(Xᵢ) − Σ_{i<k} Pl(Xᵢ ∪ Xₖ) + … + (−1)ⁿ⁺¹ Pl(X₁ ∪ X₂ ∪ … ∪ Xₙ). (1t)
Due to the inequality (1t), plausibility measures are called subadditive. When X is infinite, the function Pl is also required to be continuous from below [43].
Let X₁ = A and X₂ = Ā. Then, applying (1t) for n = 2,
Pl(A ∩ Ā) ≤ Pl(A) + Pl(Ā) − Pl(A ∪ Ā),
i.e. Pl(A) + Pl(Ā) ≥ Pl(A ∪ Ā) + Pl(A ∩ Ā) = Pl(X) + Pl(∅) = 1, so
Pl(A) + Pl(Ā) ≥ 1. (1u)
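Inequalities (1s) and (1u) can be checked numerically for any basic assignment. The sketch below uses an illustrative assignment on X = {1, 2, 3} and the standard constructions of Bel and Pl from m:

```python
# Belief and plausibility induced by a basic assignment m on X = {1, 2, 3};
# the masses are illustrative. Bel(Y) sums m over focal sets contained in Y,
# Pl(Y) over focal sets intersecting Y.
m = {frozenset({1}): 0.3, frozenset({1, 2}): 0.5, frozenset({1, 2, 3}): 0.2}

def bel(Y):
    return sum(v for F, v in m.items() if F <= Y)

def pl(Y):
    return sum(v for F, v in m.items() if F & Y)

X = frozenset({1, 2, 3})
A = frozenset({1, 2})
comp = X - A
assert bel(A) + bel(comp) <= 1.0                 # inequality (1s)
assert pl(A) + pl(comp) >= 1.0                   # inequality (1u)
assert abs(pl(A) - (1.0 - bel(comp))) < 1e-12    # duality of Bel and Pl
```

The last assertion previews the duality Pl(A) = 1 − Bel(Ā) that links the two measures.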
According to the inequalities (1s) and (1u), belief measures and plausibility measures come in dual pairs; the relation between a belief measure and the corresponding plausibility measure is given by the equations [59]
Pl(A) = 1 − Bel(Ā), (1v)
Bel(A) = 1 − Pl(Ā). (1w)

1.4.6 Possibility Theory:
with top 1 and bottom 0, such as the unit interval. For example, a membership function over heights h (in cm) may take values less than 1 for h < 170 and the value 1 if h ≥ 170.
When the body of evidence is nested (consonant), belief and plausibility measures are called necessity and possibility measures, respectively. Then
Bel(A ∩ B) = min [Bel(A), Bel(B)]
becomes
η(A ∩ B) = min [η(A), η(B)],
and
Pl(A ∪ B) = max [Pl(A), Pl(B)]
becomes
Π(A ∪ B) = max [Π(A), Π(B)].
Also
Π(A) = 1 − η(Ā); η(A) = 1 − Π(Ā).
Some other important measures on fuzzy sets and relations are defined here.
1.4.7 Additive Measure: Let ⟨X, Σ⟩ be a measurable space. A function μ: Σ → [0, +∞] is a σ-additive measure when:
1. μ(∅) = 0;
2. if Aₙ ∈ Σ, n = 1, 2, …, are pairwise disjoint, then
μ(⋃_{n=1}^∞ Aₙ) = Σ_{n=1}^∞ μ(Aₙ).
A well-known example of a σ-additive measure is the probability measure p on ⟨X, Σ⟩, an additive measure such that p(X) = 1 and p(A) = 1 − p(A′). Other examples of σ-additive measures are the Lebesgue measures, an important base of twentieth-century mathematics [63]. The Lebesgue measure generalises the concept of the length of a segment.
Other measures given by Lebesgue are the exterior Lebesgue measure and the interior Lebesgue measure. A set A is Lebesgue measurable when its interior and exterior Lebesgue measures coincide [67]. Some examples of Lebesgue measurable sets are the compact sets, the empty set and the set R of real numbers.
1.4.8 Complete Measure: A σ-additive measure M on ⟨X, Σ⟩ is complete when every subset of a null set is measurable and null itself, i.e. M(E) = 0 whenever E ⊆ F for some F ∈ Σ with M(F) = 0 [66].
1.4.9 Normal Measure: Let ⟨X, Σ⟩ be a measurable space. A function m: Σ → [0, 1] is a normal measure when there exist A₀ and A_m in Σ such that:
1. m(A₀) = 0;
2. m(A_m) = 1.
Probability measures, with A₀ = ∅ and A_m = X, are normal measures.
1.4.10 Sugeno λ-Measure: Let λ ∈ (−1, +∞) and let ⟨X, Σ⟩ be a measurable space. A function g: Σ → [0, 1] is a Sugeno λ-measure (a fuzzy λ-measure) if g(X) = 1 and, for all A, B ∈ Σ with A ∩ B = ∅,
g(A ∪ B) = g(A) + g(B) + λ g(A) g(B).
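The λ-rule can be exercised on a three-element set. In the sketch below the densities are illustrative, with the third chosen (approximately) so that the normalization g(X) = 1 holds for λ = 1:

```python
# Sugeno lambda-measure on X = {a, b, c}: for disjoint A, B,
# g(A U B) = g(A) + g(B) + lam * g(A) * g(B).
# The densities and lam are illustrative assumptions; lam must be
# compatible with the normalization g(X) = 1, which we check below.
lam = 1.0
dens = {"a": 0.2, "b": 0.3, "c": 0.28205}   # c solves normalization for lam = 1

def join(ga, gb):
    return ga + gb + lam * ga * gb

g_ab = join(dens["a"], dens["b"])           # 0.2 + 0.3 + 0.06 = 0.56
g_abc = join(g_ab, dens["c"])
assert abs(g_ab - 0.56) < 1e-9
assert abs(g_abc - 1.0) < 1e-3              # g(X) is approximately 1
assert g_ab > dens["a"] + dens["b"]         # lam > 0 gives a superadditive g
```

For λ > 0 the measure is superadditive and for −1 < λ < 0 subadditive, so a single parameter tunes the interaction between disjoint sets.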
If λ = 0, then the λ-measure reduces to an additive (probability) measure.

1.5. Null-additive fuzzy measure:
The concept of a fuzzy measure does not require additivity, but it requires monotonicity with respect to the inclusion of sets [45]. Additivity can be very effective and convenient in some applications, but it can also be somewhat inadequate in many reasoning environments of the real world, as in approximate reasoning, fuzzy logic, artificial intelligence, game theory, decision making, psychology, economy, data mining, etc., which require the definition of non-additive measures and pose a large number of open problems [26]. For example, if the efficiency of a set of workers is being measured, the efficiency of some people doing teamwork is not the addition of the efficiencies of each individual working on their own.
The theory of non-additive set functions has influenced many parts of mathematics and different areas of science and technology. Recently, many authors have investigated different types of non-additive set functions, such as subadditive set functions and others. The range of the null-additive set functions was introduced by Wang [109]. Suzuki [45] introduced and investigated atoms of fuzzy measures, and Pap [26] introduced and discussed atoms of null-additive set functions.
1.5.1 Definition: A set function m: Σ → [0, +∞] with m(∅) = 0 is null-additive if m(A ∪ B) = m(A) whenever A, B ∈ Σ, A ∩ B = ∅ and m(B) = 0 [109].

1.5.2 Examples:
1. Let m: Σ → [0, 1] be a ⊥-decomposable measure with respect to a t-conorm ⊥, i.e. m(∅) = 0 and m(A ∪ B) = m(A) ⊥ m(B) whenever A, B ∈ Σ and A ∩ B = ∅. Then m is null-additive.
2. Let m: Σ → [a, b] be a ⊕-decomposable measure, i.e. m(∅) = 0 and m(A ∪ B) = m(A) ⊕ m(B) holds whenever A, B ∈ Σ and A ∩ B = ∅. Then m is null-additive.
3. Let m(A) ≠ 0 whenever A ≠ ∅, A ∈ Σ. Then m is null-additive.
1.5.3 Saks decomposition:
1.
m ( B )=0 ,or
then [25]
2.
1.5.4 Remarks:
1.
m ( B )=m( A) ,
For
1.5.5 Darboux property: For every A ∈ Σ and ε > 0, there exists B ∈ Σ such that
m(A) ≤ m(A ∪ B) ≤ m(A) + ε (resp. m(A) − ε ≤ m(A∖B) ≤ m(A))
holds.
1.5.6 Remark: A set function m for which m(Bₙ) → 0 whenever A, Bₙ ∈ Σ and A ∩ Bₙ = ∅ (resp. Bₙ ⊆ A) is called
CHAPTER 2
Possibility Theory versus Probability Theory in Fuzzy Measure Theory
Let X = {x₁, x₂, x₃, …, xₙ}. A fuzzy measure on X is defined as in Chapter 1, Section 1.4. Two fuzzy measures g and g* are called dual fuzzy measures if and only if the following relation is satisfied:
g*(A) = 1 − g(Ā), for all A ⊆ X,
where Ā is the complement of A.
depends only on g and the ordering of the values of h, was the starting point
[21] to associate a set of n! probabilities to each fuzzy measure, in the following
way.
2.1.1 Definition: For each permutation A = (a₁, a₂, a₃, …, aₙ) of the indices 1, …, n, the associated probability P_A is defined by:
P_A(x_{a₁}) = g({x_{a₁}}),
P_A(x_{a₂}) = g({x_{a₁}, x_{a₂}}) − g({x_{a₁}}),
…
P_A(x_{aᵢ}) = g({x_{a₁}, x_{a₂}, …, x_{aᵢ}}) − g({x_{a₁}, x_{a₂}, …, x_{a_{i−1}}}),
…
P_A(x_{aₙ}) = 1 − g({x_{a₁}, x_{a₂}, …, x_{a_{n−1}}}). (2a)
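Equation (2a) can be implemented directly: each ordering of X induces one associated probability distribution. The fuzzy measure g below is an illustrative monotone example on a three-element set:

```python
from itertools import permutations

# Associated probabilities: for each ordering of X, the successive
# differences of g along the nested chain form a probability distribution.
# The fuzzy measure g is an illustrative monotone example.
X = ("x1", "x2", "x3")
g = {frozenset(): 0.0,
     frozenset({"x1"}): 0.1, frozenset({"x2"}): 0.3, frozenset({"x3"}): 0.2,
     frozenset({"x1", "x2"}): 0.5, frozenset({"x1", "x3"}): 0.4,
     frozenset({"x2", "x3"}): 0.6,
     frozenset(X): 1.0}

def associated(order):
    P, chain = {}, frozenset()
    for x in order:
        P[x] = g[chain | {x}] - g[chain]   # difference along the chain
        chain = chain | {x}
    return P

# There are n! = 6 associated probabilities, each a genuine distribution:
for order in permutations(X):
    P = associated(order)
    assert all(p >= 0 for p in P.values())   # needs monotonicity of g
    assert abs(sum(P.values()) - 1.0) < 1e-12
```

Non-negativity of each distribution relies on the monotonicity of g, and the totals equal 1 because each chain ends at g(X) = 1.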
2.1.2 Theorem: For every R ∈ 𝒫(X),
P_A(R) = Σ_{S⊆R} (−1)^{|R−S|} Bel(S). (2b)
Bel is a probability measure if and only if P_A(Y) = 0 for all non-singleton Y ∈ 𝒫(X).
Proof: Assume that Bel is a probability measure. For the empty set ∅, the function P_A is given by P_A(∅) = 0, and for singletons
P_A({x}) = Bel({x}) for all x ∈ X,
by the definition of P_A. Let Y = {x₁, x₂, …, xₙ} and assume the additivity axiom:
P_A(R ∪ S) = P_A(R) + P_A(S) for all R, S ∈ 𝒫(X) such that R ∩ S = ∅.
We obtain
Bel(Y) = Σ_{i=1}^n P_A({xᵢ}), (2c)
and, in particular,
Σ_{x∈X} P_A({x}) = 1.
Hence, for all R, S ∈ 𝒫(X) such that R ∩ S = ∅, we have
Bel(R) + Bel(S) = Σ_{x∈R} P_A({x}) + Σ_{x∈S} P_A({x}) = Σ_{x∈R∪S} P_A({x}) = Bel(R ∪ S).
According to the above theorem, probability measures on finite sets are thus fully represented by a function p: X → [0, 1] such that p(x) = P_A({x}). This function is usually called a probability distribution function, and p = {p(x) : x ∈ X} is the corresponding probability distribution.
2.1.3 Basic Mathematical Properties of Possibility Theory and Probability Theory:
1. Probability theory is based on measures of one type: probability measures (P). Possibility theory is based on measures of two types: (a) possibility measures (Π), and (b) necessity measures (η).
In probability theory, the measure is fully determined by a probability distribution p via
P(A) = Σ_{x∈A} p(x), with Σ_{x∈X} p(x) = 1.
Total ignorance is expressed in possibility theory by the possibility distribution r(x) = 1 for all x ∈ X, and in probability theory by p(x) = 1/|X| for all x ∈ X.
7. Probability theory: P(A) + P(Ā) = 1.
Possibility theory:
Π(A) + Π(Ā) ≥ 1,
η(A) + η(Ā) ≤ 1,
max [Π(A), Π(Ā)] = 1,
min [η(A), η(Ā)] = 0.
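These relations can be confirmed for a possibility measure induced by a normalized possibility distribution r via Π(A) = max of r over A and η(A) = 1 − Π(Ā); the distribution below is illustrative:

```python
# Possibility and necessity measures from a normalized possibility
# distribution r (illustrative values; max r must be 1).
r = {"x1": 1.0, "x2": 0.7, "x3": 0.2}

def poss(A):
    return max((r[x] for x in A), default=0.0)

def nec(A):
    comp = set(r) - set(A)
    return 1.0 - poss(comp)          # necessity is the dual of possibility

A = {"x1", "x2"}
comp = set(r) - A
assert poss(A) + poss(comp) >= 1.0   # possibility is superadditive this way
assert nec(A) + nec(comp) <= 1.0
assert max(poss(A), poss(comp)) == 1.0
assert min(nec(A), nec(comp)) == 0.0
```

The last two assertions are the characteristic max/min identities listed above: at least one of A, Ā is fully possible, and at least one is not at all necessary.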
[37]. The two distribution functions that represent probabilities and possibilities become equal for this measure: one element of the universal set is assigned the value of 1, with all other elements being assigned a value of 0. This is clearly the only measure that represents perfect evidence.
2.1.4
1. The theory of possibility is analogous to, yet conceptually different from, the theory of probability. Probability is fundamentally a measure of the frequency of occurrence of an event, while possibility is used to quantify the meaning of an event.
2. The values of each probability distribution are required to add to 1, while for possibility distributions the largest values are required to be 1.
3. Probability theory is an ideal tool for formalizing uncertainty in situations where class frequencies are known or where evidence is based on the outcomes of a sufficiently long series of independent random experiments. On the other hand, possibility theory is ideal for formalizing incomplete information expressed in terms of fuzzy propositions.
4. Possibility measures replace the additivity axiom of probability with the weaker subadditivity condition.
5. Probabilistic bodies of evidence consist of singletons, while possibilistic bodies of evidence are families of nested sets.
p(x) = 1/|X| for all x ∈ X.
2. Possibility theory and probability theory are suitable for modelling certain
type of uncertainty and suitable for modelling other types.
3. The notion of non-interactiveness in possibility theory is analogous to the notion
of independence in probability theory. If two random variables x and y are
independent, their joint probability distribution is the product of their individual
distributions. Similarly, if two linguistic variables are non-interactive, their joint
possibility distribution is formed by combining their individual possibility distributions
through a fuzzy conjunction operator.
4. Possibility theory may be interpreted in terms of interval-valued
probabilities, provided that the normalization requirement is applied. Due to the
nested structure of evidence, the intervals of estimated probabilities are not
totally arbitrary; the estimated probability of an event A lies in an interval of
the form $[\mathrm{Bel}(A), 1]$.
2.2
We discuss some variants of fuzzy-subset entropies and the relations linking
these quantities, which indicate a similarity of many important characteristics of
fuzzy-subset entropies and Shannon's entropy.
A function which forms the basis of classical information theory measures the
average uncertainty associated with the prediction of outcomes in a random
experiment; its values range over $[0, \log_2 |X|]$.
1. $H(\mu) = 0$ when the measure is concentrated on a single element of X;
2. $H(\mu) = \log_2 |X|$ when $\mu(\{x\}) = \frac{1}{|X|}$ for all $x \in X$.
When $\mu(\{x\})$ is given for every $x \in X$, (2.6) can be rewritten as
$$H(\mu) = \sum_{x \in X} \mu(\{x\}) \log_2 \frac{1}{\sum_{y \neq x} \mu(\{y\})} \quad (2f)$$
Let $\Gamma = \{\gamma_i\} = \{\gamma_1, \gamma_2, \ldots, \gamma_n\}$ be a probability distribution; then $H(\Gamma)$
turns into $H(M, M_D)$, where
$$M = \{\mu_i / x_i\} = \{\mu_1 / x_1, \mu_2 / x_2, \ldots, \mu_n / x_n\}$$
is a fuzzy set $\widetilde{X}$ and
$$M_D = \{\bar{\mu}_i / x_i\} = \{\bar{\mu}_1 / x_1, \bar{\mu}_2 / x_2, \ldots, \bar{\mu}_n / x_n\}$$
is its dual, i.e. $\mu_i + \bar{\mu}_i = 1$, where
$$Z(\widetilde{X}/\Gamma) = -\sum_{i=1}^{n} \mu_i \gamma_i \log_2 \gamma_i \quad (2h)$$
$$Z(\widetilde{X}_D/\Gamma) = -\sum_{i=1}^{n} (1 - \mu_i) \gamma_i \log_2 \gamma_i \quad (2i)$$
and
$$Z(\widetilde{X}/\Gamma) + Z(\widetilde{X}_D/\Gamma) = H(\Gamma) \quad (2j)$$
$$L(\widetilde{X}/\Gamma) = -\sum_{i=1}^{n} \mu_i \gamma_i \log_2 \mu_i \quad (2k)$$
$$L(\widetilde{X}_D/\Gamma) = -\sum_{i=1}^{n} (1 - \mu_i) \gamma_i \log_2 (1 - \mu_i) \quad (2l)$$
The function $L(\widetilde{X}/\Gamma)$ is actually a Kullback directed divergence, and
$$L(\widetilde{X}/\Gamma) + L(\widetilde{X}_D/\Gamma) = -\sum_{i=1}^{n} \gamma_i \left[ \mu_i \log_2 \mu_i + (1 - \mu_i) \log_2 (1 - \mu_i) \right] = I(M : \Gamma) \quad (2m)$$
[94].
Equality (2g) is Hiroto's measure of uncertainty [3]. Notice that if we avail the
branching property in some other way, then we can rewrite (2g) in the following form:
$$H(M, M_D) = H\left( P(\widetilde{X}), P(\widetilde{X}_D) \right) + P(\widetilde{X})\, H\!\left( \frac{M}{P(\widetilde{X})} \right) + P(\widetilde{X}_D)\, H\!\left( \frac{M_D}{P(\widetilde{X}_D)} \right) \quad (2n)$$
where
$$P(\widetilde{X}) = \sum_{i=1}^{n} \mu_i \gamma_i, \qquad P(\widetilde{X}_D) = \sum_{i=1}^{n} \bar{\mu}_i \gamma_i \quad (2o)$$
In addition, the functions $Z(\widetilde{X}/\Gamma)$ and $L(\widetilde{X}/\Gamma)$ are connected with the
entropy [41]:
$$E_J(\mu_i, \gamma_i) = Z(\widetilde{X}/\Gamma) + Z(\widetilde{X}_D/\Gamma) + L(\widetilde{X}/\Gamma) \quad (2p)$$
Formulas (2h) and (2j) represent particular cases of Zadeh's and of De Luca and
Termini's functionals $Z(\widetilde{X}/\Gamma)$ and $L(\widetilde{X}/\Gamma)$,
whose solutions are entropies; they are connected with the usual
entropy of a probability distribution, as (2j) and (2m) are with Shannon's entropy.
is defined as
$$H(M) = -\sum_{i=1}^{n} (\mu_i \gamma_i) \log_2 (\mu_i \gamma_i) = Z(\widetilde{X}/M) + L(\widetilde{X}/M) \quad (2q)$$
where
$$Z(\widetilde{X}/M) = -\sum_{i=1}^{n} (\mu_i \gamma_i) \log_2 (\gamma_i) \quad (2r)$$
is a weighted entropy of Zadeh and
$$L(\widetilde{X}/M) = -\sum_{i=1}^{n} (\mu_i \gamma_i) \log_2 (\mu_i) \quad (2s)$$
is the weighted entropy.
$$Z(\widetilde{X}/\Gamma^0) = Z(\widetilde{X}/M), \qquad L(\widetilde{X}/\Gamma^0) = L(\widetilde{X}/M) \quad (2t)$$
$$\sum_{i=1}^{n} \gamma_i \log a_i \leq \log \left( \sum_{i=1}^{n} \gamma_i a_i \right), \quad i = 1, 2, \ldots, n \quad (2u)$$
which is also a basic inequality when one considers Shannon's entropy.
2.2.1 Theorem: Let $m = \{\mu_i\}$ be the membership function of $\widetilde{X}$ and $\Gamma = \{\gamma_i\}$ a
probability distribution on it, $i = 1, 2, \ldots, n$; then:
$$Z(\widetilde{X}/\Gamma) \leq P(\widetilde{X})\, H\!\left( \frac{M}{P(\widetilde{X})} \right) \quad (2v)$$
where $P(\widetilde{X}) = \sum_{i=1}^{n} \mu_i \gamma_i$. Equality holds when all $\gamma_i$ are equal.
Proof: Put
$$\beta_i = \frac{\mu_i \gamma_i}{P(\widetilde{X})}$$
so that $\sum_{i=1}^{n} \beta_i = 1$ and
$$\sum_{i=1}^{n} \frac{\mu_i \gamma_i}{P(\widetilde{X})} \log \frac{\mu_i \gamma_i}{P(\widetilde{X})} \quad (2w)$$
with equality when all
$$\beta_i = \frac{\mu_i \gamma_i}{P(\widetilde{X})}$$
are equal.
and
(N )
N= {1, 2, 3, n }
be a discrete Choquet
normalized if
( N )=1
hence
FN
( S) can be interpreted
H ( )= r ( n) h [ ( S i )( S) ] (2 x)
i=1 S A
74
Where r=
( nr1 ) !
, ( r =0, 1,2, . , n1 ) ,h ( x )= x log x if x> 0
n!
0
if x=0
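The entropy (2x) can be computed by direct enumeration of subsets. The sketch below (illustrative; the measure values are assumptions) also checks a known property of this entropy: when the fuzzy measure is additive, (2x) reduces to the Shannon entropy of the underlying probabilities:

```python
import math
from itertools import combinations

# Sketch (illustrative): the entropy (2x) of a fuzzy measure mu on
# N = {0, ..., n-1}, with gamma_r(n) = (n-r-1)! r! / n! and
# h(x) = -x log x for x > 0, h(0) = 0.

def h(x):
    return -x * math.log(x) if x > 0 else 0.0

def gamma(r, n):
    return math.factorial(n - r - 1) * math.factorial(r) / math.factorial(n)

def entropy(mu, n):
    """mu maps frozensets of {0..n-1} to [0,1], with mu(empty)=0, mu(N)=1."""
    total = 0.0
    for i in range(n):
        rest = [j for j in range(n) if j != i]
        for r in range(n):
            for S in combinations(rest, r):
                S = frozenset(S)
                total += gamma(r, n) * h(mu[S | {i}] - mu[S])
    return total

# For an additive mu built from probabilities p, (2x) reduces to the
# Shannon entropy of p (a known property of this entropy).
n, p = 3, [0.2, 0.3, 0.5]
mu = {frozenset(S): sum(p[i] for i in S)
      for r in range(n + 1) for S in combinations(range(n), r)}
shannon = -sum(pi * math.log(pi) for pi in p)
assert abs(entropy(mu, n) - shannon) < 1e-9
```

The reduction works because, for each i, the increments $\mu(S \cup \{i\}) - \mu(S)$ all equal $p_i$ and the coefficients $\gamma_{|S|}(n)$ over subsets of $N \setminus \{i\}$ sum to 1.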
Let $N = \{1, 2, \ldots, n\}$ and suppose that $x_1, x_2, \ldots, x_n \in \mathbb{R}$ are the values to be
aggregated on N with respect to a fuzzy measure $\mu$.
The Choquet integral of $x \in \mathbb{R}^n$ with respect to $\mu$ on N is defined by
$$C_\mu(x) = \sum_{i=1}^{n} x_i \left[ \mu(A_i) - \mu(A_{i+1}) \right] \quad (2y)$$
such that $x_1 \leq x_2 \leq \cdots \leq x_n$, where $A_i = \{i, \ldots, n\}$ and $A_{n+1} = \emptyset$.
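Definition (2y) translates directly into code. The sketch below (illustrative; the measure and inputs are assumptions) sorts the inputs increasingly and telescopes the measure differences; for an additive measure the result collapses to the ordinary weighted mean:

```python
from itertools import combinations

# Sketch (illustrative): the Choquet integral (2y) of x with respect to a
# fuzzy measure mu on N = {0, ..., n-1}, using the tail sets A_i after
# sorting the inputs increasingly.

def choquet(x, mu):
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])     # indices with x increasing
    total = 0.0
    for k, i in enumerate(order):
        A_k  = frozenset(order[k:])                  # criteria still "active"
        A_k1 = frozenset(order[k + 1:])              # next tail set
        total += x[i] * (mu[A_k] - mu[A_k1])
    return total

# With an additive mu the Choquet integral is the weighted mean.
n, p = 3, [0.2, 0.3, 0.5]
mu = {frozenset(S): sum(p[i] for i in S)
      for r in range(n + 1) for S in combinations(range(n), r)}
x = [4.0, 1.0, 2.0]
assert abs(choquet(x, mu) - sum(pi * xi for pi, xi in zip(p, x))) < 1e-12
```

For a constant input the sum telescopes to $\mu(N) - \mu(\emptyset) = 1$ times that constant, as expected of an idempotent aggregation operator.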
The lower entropy of a fuzzy measure $\mu$ on a set N of criteria can be defined by
$$H_l(\mu) = \sum_{i=1}^{n} \sum_{T \subseteq N \setminus \{i\}} \gamma_{|T|}(n)\, h\!\left( \mu(T \cup \{i\}) - \mu(T) \right)$$
Consider a variable V whose exact value lies in the space $N = \{1, 2, \ldots, n\}$; for
$S \subseteq N$, $\mu(S)$ represents a measure
associated with our belief (or the confidence we have) that the value of V is in S.
The upper entropy is defined by:
$$H_u(\mu) = \sum_{i=1}^{n} h\!\left[ \sum_{T \subseteq N \setminus \{i\}} \gamma_{|T|}(n) \left( \mu(T \cup \{i\}) - \mu(T) \right) \right]$$

2.3
Here we analyze some basic properties of the entropy of a fuzzy measure. The main
properties are the following:
2.3.1 Continuity: changing the values of the probabilities by a very small amount
should only change the value of H by a small amount.
2.3.2 Maximality:
$$H\left( \frac{1}{n}, \frac{1}{n}, \ldots, \frac{1}{n} \right) = \log n.$$
More precisely, for a fuzzy measure $\mu$ on N we have
(i) $H_l(\mu) \leq H_u(\mu)$;
(ii) $H_u(\mu) = \log n$.
The second inequality and property (ii) follow from the above property of the
Shannon entropy.
2.3.3 Additivity: if $S = \bigcup_{i=1}^{n} S_i$ then $H(S) = \sum_{i=1}^{n} H(S_i)$.
2.3.4 Symmetry: $H_l$ and $H_u$ are symmetric in
the sense that permuting the elements of N has no effect on the entropy. For any
permutation $\pi$ on $\{1, 2, \ldots, n\}$, we denote $\pi(S) = \{\pi(i) : i \in S\}$ and define $\mu^\pi$
on N by $\mu^\pi(\pi(S)) = \mu(S)$ for all $S \subseteq N$.
Thus, permuting the arguments of the Choquet integral has no effect on the
degree to which one uses these arguments. We also have that the uncertainty
associated with the variable V is independent of any permutation of the elements of
N.
2.3.5 Expansibility: the expansibility property of the
Shannon entropy says that suppressing an outcome with zero probability does
not change the uncertainty of the outcome of an experiment [54]:
$$H(\gamma_1, \gamma_2, \ldots, \gamma_{n-1}, 0) = H(\gamma_1, \gamma_2, \ldots, \gamma_{n-1}).$$
This property can be extended to the framework of fuzzy measures. Let
$k \in N$ be a null element for $\mu$, i.e.
$$\mu(T \cup \{k\}) = \mu(T) \quad \text{for all } T \subseteq N \setminus \{k\}.$$
Denote also by $\mu'$ the restriction of $\mu$ to $N \setminus \{k\}$; then
$$H_l(\mu) = H_l(\mu'), \qquad H_u(\mu) = H_u(\mu')$$
whenever $k$ is a null element.
2.3.6 Decisivity: for a fuzzy measure $\mu$ on N, we clearly have
$H(\mu) \geq 0$. Now, the decisivity property for the Shannon entropy says that
there is no uncertainty in an experiment in which one outcome has probability
one [54]:
$$H(1, 0, \ldots, 0) = H(0, 1, \ldots, 0) = \cdots = H(0, 0, \ldots, 1) = 0.$$
More precisely, for $H_l$ and $H_u$ we observe that
$$H_l(\mu) \geq 0, \qquad H_u(\mu) \geq 0.$$
Moreover, $H_l(\mu) = 0$ when $\mu$
is a Dirac measure, in which case
$$C_\mu(x) \in \{x_1, x_2, \ldots, x_n\}, \quad (x \in \mathbb{R}^n);$$
in other terms, $H_l(\mu)$ vanishes for Dirac measures.
2.3.7 Increasing Monotonicity: for $\mu, \nu \in F_A$, let $\mu_\lambda$ be defined
by
$$\mu_\lambda = \mu + \lambda(\nu - \mu), \quad \lambda \in [0, 1].$$
Then for $0 \leq \lambda_1 < \lambda_2 \leq 1$, we have
$$H_M(\mu_{\lambda_1}) < H_M(\mu_{\lambda_2}),$$
where $H_M$ satisfies the equation
$$H_M(\mu) = \frac{1}{n!}\, H(\ldots).$$
2.3.8 Strict Concavity: for any
$\mu_1, \mu_2 \in F_A$ and $\lambda \in (0, 1)$, we have
$$H_M(\lambda \mu_1 + (1 - \lambda) \mu_2) > \lambda H_M(\mu_1) + (1 - \lambda) H_M(\mu_2),$$
i.e. $H_M$ is strictly concave on $F_A$.
For probability distributions, the strict concavity of the Shannon entropy and its
naturalness as a measure of uncertainty gave rise to the maximum entropy
principle, which was stated by [52] as follows:
When one has only partial information about the possible outcomes of a
random variable, one should choose its probability distribution so as to
maximize the uncertainty about the missing information.
In other words, all the available information should be used, but one should be
as uncommitted as possible about the missing information. In more mathematical
terms, this principle states that among all the probability distributions that are in
accordance with the available prior knowledge (that is, a set of constraints), one
should choose the one that has maximum uncertainty.
2.4
The study of different concepts of entropy is very interesting, not only
in physics but also in information theory and in other mathematical sciences such as
fuzzy measure theory, considered in its most general vision. It may also be a very
useful tool for bio-computing, and in many other fields, such as the
environmental sciences. This is because, among other interpretations with
important practical consequences, the law of entropy means that energy cannot
be fully recycled. Many quotations have been made until now referring to the
content and significance of this fuzzy measure, for example:
"Gain in entropy always means loss of information, and nothing more." [40].
"Information is just known entropy. Entropy is just unknown information." [71].
3.1
Following [79], if
$$N(T) = \int_T f \, dM, \quad T \in I, \quad (3a)$$
defines a measure N on $(X, I)$, then whenever $T \in I$ with $M(T) < \infty$, for every
$\varepsilon > 0$ there exists $\delta > 0$ such that $N(T) < \varepsilon$.
In general, let
$$\mu(A) = \int_A f \, d\lambda, \quad A \in I. \quad (3d)$$
Moreover, if $\mu$ is finite, then it is a measure on $(X, I)$. The absolute
continuity $N \ll_S M$ of type S, with $S \in \{1, 2, 3, 4, \ldots, 9\}$, of N with respect to M
on $(X, I)$ is given by the following definition.
Definition:
(1) $N \ll_1 M$ iff $N(T) = 0$ whenever $T \in I$, $M(T) = 0$;
(2) $N \ll_2 M$ iff $N(T) = N(U)$ whenever $T, U \in I$, $T \subseteq U$, $M(T) = M(U) < \infty$;
(3) $N \ll_3 M$ iff $N(T_n) \to 0$ whenever $\{T_n\} \subseteq I$, $\{T_n\}$ is non-increasing, $M(T_n) \to 0$;
(4) $N \ll_4 M$ iff $N(T_n) \to N(T)$ whenever $T \in I$, $\{T_n\} \subseteq I$, $T_n \subseteq T_{n+1}$, $\bigcup_n T_n = T$, $M(T_n) \to M(T) < \infty$;
(5) $N \ll_5 M$ iff $N(T_n) \to N(T)$ whenever $T \in I$, $\{T_n\} \subseteq I$, $T_n \supseteq T_{n+1}$, $\bigcap_n T_n = T$, $M(T_n) \to M(T) < \infty$;
(6) $N \ll_6 M$ iff $N(T_n) \to 0$ whenever $\{T_n\} \subseteq I$, $M(T_n) \to 0$;
(7) $N \ll_7 M$ iff $N(T) < \infty$ whenever $T \in I$, $M(T) < \infty$, and $N(T_n) \to N(T)$ whenever $T \in I$, $\{T_n\} \subseteq I$, $T \subseteq T_n$, $M(T_n) \to M(T) < \infty$;
(8) $N \ll_8 M$ iff $N(T_n) \to N(T)$ whenever $T \in I$, $\{T_n\} \subseteq I$, $T_n \subseteq T$, $M(T_n) \to M(T) < \infty$;
(9) $N \ll_9 M$ iff $N(U_n) - N(T_n) \to 0$ whenever $\{T_n\}, \{U_n\} \subseteq I$, $T_n \subseteq U_n$, $M(U_n) - M(T_n) \to 0$; or iff for
any $\varepsilon > 0$ there exists $\delta > 0$ such that
$$N(U) - N(T) < \varepsilon \quad \text{whenever} \quad T, U \in I, \; T \subseteq U, \; M(U) - M(T) < \delta,$$
where $n = \{1, 2, 3, \ldots\}$.
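The simplest of these conditions, type 1, can be checked exhaustively on a finite ring of sets. The sketch below is illustrative only (the two set functions are assumptions, not from the thesis):

```python
from itertools import combinations

# Sketch (illustrative): checking the simplest type, N <<_1 M, on a finite
# ring of sets: N(T) must vanish on every set where M vanishes.

universe = (0, 1, 2)
ring = [frozenset(S) for r in range(4) for S in combinations(universe, r)]

# Two illustrative monotone set functions (fuzzy measures) on the ring.
M = {T: max((t + 1 for t in T), default=0) / 3 for T in ring}  # vanishes only on the empty set
N = {T: (len(T) / 3) ** 2 for T in ring}                       # also vanishes only there

def abs_cont_type1(N, M):
    return all(N[T] == 0 for T in ring if M[T] == 0)

assert abs_cont_type1(N, M)
```

The stronger types (3)-(9) involve limits of sequences and so cannot be verified by finite enumeration; for finite spaces they collapse to conditions of this elementary kind.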
Definition:
(1) $N \ll_{1a} M$ iff $N(U \cup T) = N(U)$ for all $T, U \in I$ with $M(T) = 0$;
or iff $N(U \setminus T) = N(U)$ for all $T, U \in I$ with $M(T) = 0$;
or iff $N(U \triangle T) = N(U)$ for all $T, U \in I$ with $M(T) = 0$.
(2) $N \ll_{2a} M$ iff $N(T) = 0$ for all $T, U \in I$ with $T \cap U = \emptyset$, $M(U \cup T) = M(U) < \infty$;
or iff $N(T) = 0$ for all $T, U \in I$ with $T \subseteq U$, $M(U \setminus T) = M(U) < \infty$.
(3) $N \ll_{4a} M$ iff $N(T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \uparrow T$, $M(T \setminus T_n) \to 0$;
or iff $N(T \setminus T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $\{T_n\}$ non-increasing, $M(T_n) \to 0$.
(4) $N \ll_{4b} M$ iff $N(T_n \setminus T) \to 0$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \downarrow T$, $M(T_n) \to M(T) < \infty$;
or iff $N(T_n) \to 0$ for all $T \in I$, $\{T_n\} \subseteq I$ with $\{T_n\}$ non-increasing, $T_n \cap T = \emptyset$, $M(T \cup T_n) \to M(T) < \infty$.
(5) $N \ll_{5a} M$ iff $N(T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \downarrow T$, $M(T_n \setminus T) \to 0$;
or iff $N(T \cup T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $\{T_n\}$ non-increasing, $M(T_n) \to 0$.
(6) $N \ll_{5b} M$ iff $N(T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \downarrow T$,
$M(T_n) \to M(T) < \infty$.
(7) $N \ll_{7a} M$ iff $N(T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T \subseteq T_n$, $M(T_n \setminus T) \to 0$;
or iff $N(T \cup T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $M(T_n) \to 0$.
(8) $N \ll_{7b} M$ iff $N(T_n \setminus T) \to 0$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T \subseteq T_n$, $M(T_n) \to M(T) < \infty$;
or iff $N(T_n) \to 0$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \cap T = \emptyset$, $M(T \cup T_n) \to M(T) < \infty$.
(9) $N \ll_{8a} M$ iff $N(T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \subseteq T$, $M(T \setminus T_n) \to 0$;
or iff $N(T \setminus T_n) \to N(T)$ for all $T \in I$, $\{T_n\} \subseteq I$ with $M(T_n) \to 0$.
(10) $N \ll_{8b} M$ iff $N(T \setminus T_n) \to 0$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \subseteq T$, $M(T_n) \to M(T) < \infty$;
or iff $N(T_n) \to 0$ for all $T \in I$, $\{T_n\} \subseteq I$ with $T_n \subseteq T$, $M(T \setminus T_n) \to M(T) < \infty$.
(11) $N \ll_{9a} M$ iff for any $\varepsilon > 0$ there exists $\delta > 0$ such that
$$N(U) - N(T) < \varepsilon \quad \text{for all } T, U \in I \text{ with } T \subseteq U, \; M(U \setminus T) < \delta.$$
(12) $N \ll_{9b} M$ iff for any $\varepsilon > 0$ there exists $\delta > 0$ such that
$$|N(U) - N(T)| < \varepsilon \quad \text{for all } T, U \in I \text{ with } M(U \triangle T) < \delta;$$
or iff for any $\varepsilon > 0$ there exists $\delta > 0$ such that
$$N(U \setminus T) < \varepsilon \quad \text{for all } T, U \in I \text{ with } T \subseteq U, \; M(U \setminus T) < \delta.$$
Theorem: Let M and N be two finite fuzzy measures such that they are
continuous from above and continuous from below. Suppose, to the contrary, that there exist
$\varepsilon > 0$ and a sequence $\{T_n\}$ from I such
that
$$N(T_n) < \frac{1}{n} \quad \text{and} \quad M(T_n) > \varepsilon,$$
where $n = \{1, 2, 3, \ldots\}$.
Choose a subsequence of $\{T_n\}$ such that
$$N\left( \bigcup_{i=p}^{k} T_i \right) < \frac{1}{p} \quad \text{for } p = 1, 2, 3, \ldots, k \quad (3f)$$
$$\lim_{p \to \infty} N\left( \bigcup_{i=p}^{\infty} T_i \right) = N\left( \bigcap_{p=1}^{\infty} \bigcup_{i=p}^{\infty} T_i \right) \quad (3g)$$
By (3e), for the subsequence $\{T_n\}$ we have
$$N\left( \bigcup_{i=p}^{\infty} T_i \right) = \lim_{k \to \infty} N\left( \bigcup_{i=p}^{k} T_i \right) \leq \frac{1}{p},$$
so that
$$N\left( \bigcap_{p=1}^{\infty} \bigcup_{i=p}^{\infty} T_i \right) = 0 \quad \text{and hence} \quad M\left( \bigcap_{p=1}^{\infty} \bigcup_{i=p}^{\infty} T_i \right) = 0.$$
On the other hand, we obtain by the continuity from above and continuity from
below of the fuzzy measure M and equation (3e)
$$M\left( \bigcap_{p=1}^{\infty} \bigcup_{i=p}^{\infty} T_i \right) = \lim_{p \to \infty} M\left( \bigcup_{i=p}^{\infty} T_i \right) = \lim_{p \to \infty} \lim_{k \to \infty} M\left( \bigcup_{i=p}^{k} T_i \right) \geq M(T_p) > \varepsilon,$$
a contradiction.
3.2
A set function $\mu : I \to [0, \infty]$ is null-additive if
$$\mu(A \cup B) = \mu(A)$$
whenever $A, B \in I$, $A \cap B = \emptyset$, $\mu(B) = 0$
[28]. Some related notions: a set function $\mu : I \to [0, \infty]$ is said to be
(1) weakly null additive, if $\mu(A \cup B) = 0$ whenever $A, B \in I$, $\mu(A) = \mu(B) = 0$;
(2) converse null additive, if $\mu(A \setminus B) = 0$ whenever $A \in I$, $B \subseteq A \in I$, $\mu(A) = \mu(B)$;
(3) pseudo null additive, if $\mu(B \cup C) = \mu(C)$ whenever $A \in I$, $B, C \subseteq A \in I$, $\mu(A \setminus B) = \mu(A)$;
(4) $\sigma$-null additive, if for every sequence $\{A_i\}$ of pairwise disjoint sets from I with $\mu(A_i) = 0$, $i = 1, 2, 3, \ldots$, we have
$$\mu\left( \bigcup_{i=1}^{\infty} A_i \cup A \right) = \mu(A) \quad \text{for all } A \in I;$$
(6) weakly $\sigma$-null additive, if $\mu(A_i) = 0$, $i = 1, 2, 3, \ldots$, implies
$$\mu\left( \bigcup_{i=1}^{\infty} A_i \right) = 0.$$
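On a finite power set, null-additivity can be tested by brute force over all pairs of disjoint sets. The sketch below is illustrative only (the set function is an assumption chosen so that one element is null):

```python
from itertools import combinations

# Sketch (illustrative): testing null-additivity of a set function on a
# finite power set: adding a mu-null set B, disjoint from A, must not
# change mu(A). The measure below is illustrative only.

universe = (0, 1, 2)
subsets = [frozenset(S) for r in range(4) for S in combinations(universe, r)]

# mu ignores element 2 entirely, so {2} is a null set.
mu = {S: len(S - {2}) / 2 for S in subsets}

def is_null_additive(mu):
    for A in subsets:
        for B in subsets:
            if A & B == frozenset() and mu[B] == 0:
                if mu[A | B] != mu[A]:
                    return False
    return True

assert is_null_additive(mu)
```

The same enumeration pattern extends to weak null-additivity (both arguments null) and converse null-additivity (equal-measure nested sets), simply by changing the condition inside the double loop.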
For the case of non-additive measure theory, the situation is not so simple.
There are many discussions of Lebesgue decomposition type theorems, such as
a version on submeasures [82], and versions in other settings. A set function
$\mu : I \to [0, \infty]$ is said to be
(1) exhaustive, if $\lim_{n \to \infty} \mu(A_n) = 0$ for every disjoint sequence $\{A_n\}$ from I;
(2) order continuous, if $A_n \downarrow \emptyset$ implies $\mu(A_n) \to 0$ [107].
We say that $\mu$ is absolutely continuous with respect to $\nu$ if for every $\varepsilon > 0$
there is a $\delta > 0$ such that $\mu(A) < \varepsilon$ whenever $\nu(A) < \delta$. Now we
state a theorem.
3.2.1 Theorem: Let $\mu$ be null-additive,
continuous from above and continuous from below. Then there exists a set A
from I such
that
$$\mu(A) = \sup\{\mu(P) : P \in I\},$$
$$\mu(P \setminus A) = 0 \quad \text{and} \quad \mu(P) = \mu(P \cap A), \quad P \in I.$$
Proof: Let $A_0 = \emptyset$; we take $A_1$ from I such that
$$\mu(A_1) = \sup\{\mu(P) : P \in I\}.$$
We choose $A_2$ from I
such that
$$\mu(A_2) = \sup\{\mu(P) : P \subseteq X \setminus A_1\},$$
and, in general, $\{A_n\}$ such that
$$\mu(A_n) = \sup\left\{ \mu(P) : P \subseteq X \setminus \bigcup_{i=0}^{n-1} A_i, \; P \in I \right\} \quad (3h)$$
holds. We take
$$A = \bigcup_{i=0}^{\infty} A_i.$$
Then by the construction equation (3h) holds. The continuity from above of $\mu$
implies
$$\lim_{n \to \infty} \mu\left( P \setminus \bigcup_{i=0}^{n} A_i \right) = \mu(P \setminus A) \quad (3i)$$
and $\mu(P \setminus A) = 0$. Hence
$$\mu(P) = \mu\left( (P \cap A) \cup (P \setminus A) \right) = \mu(P \cap A).$$
Hence proved.
3.3
Let $\mu = \mu_c + \mu_s$, where $\mu_c$ and $\mu_s$ are the continuous part and the singular
part of $\mu$, respectively, defined by a set A for which
$$\mu(P \setminus A) = \mu(P) = 0, \quad P \in I.$$
Now we have the following theorems.
3.3.1 Theorem: Let $\mu$ be null-additive. Then there is a set $A \in I$ such that
$$\mu(P) = \mu(P \cap A) \quad \text{for any } P \in I.$$
Proof: Let $\alpha = \sup\{\mu(P) : P \in I\}$; by the definition of the supremum, we can choose a
sequence $\{P_n\}$ from I such that
$$\alpha - \frac{1}{n} < \mu(P_n),$$
and hence
$$\alpha - \frac{1}{n} < \mu(P_n) \leq \mu(A_1).$$
Let $A_1 = \bigcup_{n=1}^{\infty} P_n$ from I; we have $\mu(A_1) = \alpha$. Choose sets
$$P_n \subseteq X \setminus A_1, \quad n \geq 1,$$
and
$$\mu(A_2) = \sup\{\mu(P) : P \in I, \; P \subseteq X \setminus A_1\}.$$
Let $A = A_1 \cup A_2$; then $A \in I$
and
$$\alpha = \mu(A_1) \leq \mu(A) \leq \sup\{\mu(P) : P \in I\} = \alpha.$$
Therefore, noting that $\mu(A) = \mu(A_1) = \alpha$, $A_1 \cap A_2 = \emptyset$, and
$\mu(A_2) = \mu(A \setminus A_1) = 0$, it follows from
$$P \setminus A \subseteq P \setminus A_1 \subseteq X \setminus A_1$$
that
$$\mu(P \setminus A) \leq \mu(A_2) = 0,$$
so for any $P \in I$ we have $\mu(P \setminus A) = 0$. When $\mu$ is null additive,
then
$$\mu(P) = \mu\left( (P \cap A) \cup (P \setminus A) \right) = \mu(P \cap A) \quad \text{for any } P \in I,$$
and the proof is now complete. Similarly, if $\nu$ is null-additive,
there is a set $A \in I$
such that $\nu(P) = \nu(P \setminus A)$ for any $P \in I$.
3.3.2 Theorem: Let $\mu$ be null-additive on I. Then there exist set functions $\mu_c$ and $\mu_s$
and a set $A \in I$ such that
$$\mu_c(P) = \mu(P \cap A) \quad \text{and} \quad \mu_s(P) = \mu(P \setminus A) \quad \text{for any } P \in I.$$
Proof: $I_1 = \{P \in I : \mu(P) = 0\}$ is a sub-algebra of the
ring I; by the preceding theorem, for $\mu$ on $I_1$ there is a set $A \in I_1$ such that
$\mu(P \setminus A) = 0$ and $\mu(P) = \mu(P \cap A)$ for $P \in I_1$. We
take
$$\mu_c(P) = \mu(P \cap A), \qquad \mu_s(P) = \mu(P \setminus A)$$
for each P; then $\mu_c$ and $\mu_s$ are as required.
3.3.3 Theorem: Let $\mu_c$ and $\mu_s$ be the
measures on I defined, for a set $A \in I$, by
$$\mu_c(P) = \mu(P \cap A) \quad \text{and} \quad \mu_s(P) = \mu(P \setminus A);$$
then, by Theorem 3.3.1, $\mu_c$ is absolutely continuous.
3.3.4 Theorem: Let $\mu$ be a null-additive measure and let $\mu_c$ and $\mu_s$ be
defined by
$$\mu_c(P) = \mu(P \cap A) \quad \text{and} \quad \mu_s(P) = \mu(P \setminus A), \quad P \in I,$$
where $I_1 = \{P \in I : \mu(P) = 0\}$ is a
sub-ring of I and $A \in I$, obtained
by using the theorem above, is such that
$$\mu(A) = \sup\{\mu(P) : P \in I\}, \qquad \mu(P \setminus A) = 0, \quad P \in I.$$
Then $\mu_c$ and $\mu_s$ satisfy:
(1) Let $\mu$ be super-additive on I. Then $\mu_c$ and $\mu_s$ defined as above satisfy
$\mu_c \ll_1 \mu_s$ and $\mu \leq \mu_c + \mu_s$.
(2) Let $\mu$ be ... on I; then $\mu_c \ll_1 \mu_s$.
(3) Let $\mu$ be ... on I; then the measures $\mu_c$ and $\mu_s$ are either ... additive.
(4) Let $\mu$ be strongly ... on I. Then $\mu_c$ and $\mu_s$ defined by
$$\mu_c(P) = \mu(P \cap A) \quad \text{and} \quad \mu_s(P) = \mu(P \setminus A), \quad P \in I,$$
for a set $A \in I$, satisfy $\mu_c \ll_6 \mu_s$.
(5) Let $\mu$ be pseudo-null additive on I. Then $\mu_c$ and $\mu_s$ defined by
$$\mu_c(P) = \mu(P \cap A) \quad \text{and} \quad \mu_s(P) = \mu(P \setminus A), \quad P \in I,$$
for a set $A \in I$, satisfy $\mu_c \ll_1 \mu_s$, respectively.
Similarly, if $\mu$ is ..., then $\mu_s$ is exhaustive.
3.4
An OWA operator is a mapping $f : I^n \to I$ with an associated weighting vector
$w = \{w_1, w_2, \ldots, w_n\}$, with
$$f(a_1, a_2, \ldots, a_n) = \sum_{j=1}^{n} w_j b_j \quad (3j)$$
where $b_j$ is the j-th largest of the $a_i$. Aggregation by OWA operators corresponds to symmetric
additive measures, and m-symmetric measures bridge the gap between symmetric fuzzy measures and general fuzzy
measures. The Choquet integral with respect to an m-symmetric fuzzy measure
generalizes the concept of OWA. Another generalization of OWA operators can
be found in [98], in which double aggregation operators are defined as an
aggregation of two other aggregation operators. If we deal with a space of n
elements, a general fuzzy measure needs $2^n$ coefficients; this can be reduced using
$\lambda$-measures [72]. A fuzzy measure $\mu$ on $A = \{a_1, a_2, \ldots, a_n\}$ is symmetric if
$$|X| = |Y| \Rightarrow \mu(X) = \mu(Y), \quad \forall X, Y \subseteq A.$$
Two elements $a_i, a_j$ with $a_i \sim a_j$ are said to be indifferent.
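The OWA aggregation (3j) can be sketched in a few lines; the weights below are illustrative, not taken from the thesis:

```python
# Sketch (illustrative): the OWA operator (3j). b_j is the j-th largest
# input; the weights must sum to 1.

def owa(a, w):
    b = sorted(a, reverse=True)            # b_1 >= b_2 >= ... >= b_n
    return sum(wj * bj for wj, bj in zip(w, b))

a = [0.4, 0.9, 0.1, 0.6]

# Special cases: w = (1,0,...,0) gives max, w = (0,...,0,1) gives min,
# and uniform weights give the arithmetic mean.
assert owa(a, [1, 0, 0, 0]) == max(a)
assert owa(a, [0, 0, 0, 1]) == min(a)
assert abs(owa(a, [0.25] * 4) - sum(a) / 4) < 1e-12
```

Because the inputs are sorted before weighting, the operator is symmetric in its arguments, which is exactly the property connecting OWA to symmetric fuzzy measures.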
This concept can be generalized to subsets of more than two elements: let
X be a subset of A; then X is a set of indifference iff
$$|X_1| = |X_2|, \; X_1, X_2 \subseteq X \Rightarrow \mu(X_1 \cup Y) = \mu(X_2 \cup Y), \quad \forall Y \subseteq A \setminus X. \quad (3k)$$
3.4.1 Proposition:
$$|X_1| = |X_2|, \; X_1, X_2 \subseteq X \Rightarrow \mu(X_1 \cup Y) = \mu(X_2 \cup Y), \quad \forall Y \subseteq A \setminus (X_1 \cup X_2).$$
Proof: For $Y \subseteq A \setminus X$ we have $\mu(X_1 \cup Y) = \mu(X_2 \cup Y)$ by (3k).
Let us consider $Z \subseteq X \setminus (X_1 \cup X_2)$ and
$Y \subseteq A \setminus (X_1 \cup X_2)$
such that
$Y = Z \cup Y_1$,
with
$Y_1 \subseteq A \setminus X$. Thus, from (3k) the result follows.
indifference. A subset
$X \subseteq A$ is a null set
if
$$\mu(X \cup Y) = \mu(Y), \quad \forall Y \subseteq A \setminus X.$$
For $X_1, X_2 \subseteq X$
such that
$|X_1| = |X_2|$, and for $Y \subseteq A \setminus X$, monotonicity gives
$$\mu(Y) \leq \mu(X_1 \cup Y) \leq \mu(X \cup Y) = \mu(Y),$$
$$\mu(Y) \leq \mu(X_2 \cup Y) \leq \mu(X \cup Y) = \mu(Y).$$
Then
$$\mu(X_1 \cup Y) = \mu(Y) = \mu(X_2 \cup Y),$$
so $\{X_1, X_2\}$ is a pair of indifference.
Let
$X = \{X_1, X_2, \ldots, X_m\}$
and
$Y = \{Y_1, Y_2, \ldots, Y_r\}$
be two partitions of A into sets of indifference; for each $X_i$
there exists a
$Y_j$
such that $X_i \subseteq Y_j$.
Let
$\{X_1, X_2, \ldots, X_m\}$, $X_i \neq \emptyset$, $i = 1, 2, \ldots, m$, with
$$X_1 = \{x_1, x_2, \ldots, x_k\}, \qquad X_2 = \{x_{k+1}, x_{k+2}, \ldots, x_n\},$$
$X_1$ and $X_2$ being sets of indifference; the values on pairs and
singletons are
$$\mu(x_1, x_2), \; \mu(x_{k+1}, x_{k+2}), \; \mu(x_1, x_{k+1}) \quad \text{and} \quad \mu(x_1), \; \mu(x_{k+1}).$$
3.4.3 Theorem: If
$X = \{X_1, X_2, \ldots, X_m\}$ is a
partition of A into sets of indifference,
then for
$Y = (Y_1, Y_2, \ldots, Y_m) \subseteq A$, we
have
$$m(Y_1, Y_2, \ldots, Y_m) = \sum_{i_1 \leq Y_1, \ldots, i_m \leq Y_m} (-1)^{(Y_1 + \cdots + Y_m) - (i_1 + \cdots + i_m)} \binom{Y_1}{i_1} \binom{Y_2}{i_2} \cdots \binom{Y_m}{i_m} \mu(i_1, \ldots, i_m),$$
where $i_1$ elements are taken from $X_1$, $i_2$ elements from $X_2$, ....., $i_m$ elements from $X_m$.
Proof: Let $\mu$ be a fuzzy measure on A. Its Moebius transform is
$$m(Y) = \sum_{X \subseteq Y} (-1)^{|Y| - |X|} \mu(X),$$
i.e. it is defined by
$$m(X) = \sum_{Y \subseteq X} (-1)^{|X \setminus Y|} \mu(Y), \quad X \subseteq A.$$
We obtain
$$m(Y_1, Y_2, \ldots, Y_m) = \sum_{i_1 \leq Y_1, \ldots, i_m \leq Y_m} (-1)^{(Y_1 + \cdots + Y_m) - (i_1 + \cdots + i_m)} \binom{Y_1}{i_1} \binom{Y_2}{i_2} \cdots \binom{Y_m}{i_m} \mu(i_1, \ldots, i_m).$$
3.5
We have considered the learning of two-dimensional fuzzy measures for data
modelling using the Choquet integral. Due to the fact that there are no public
repositories with files for learning aggregation operators or information fusion
models, we have used two examples taken from the machine learning repository
[77]: the "iris" and the "abalone" data files. These data files were already used in
[102]. The "iris" data file consists of 150 examples, each one with four input
variables and one output variable. The variable "class" was removed because it is
a categorical one. The "abalone" data file consists of 4177 examples, each one
with 8 input variables and 1 output one. The variable "Sex" was removed because
it is a categorical one.
For the "iris" data file the results obtained with our approach are not very
relevant. Only a small improvement has been obtained (2.64903740 was the
best distance achieved using a one-dimensional distorted probability and
2.64903728 is our best result with a two-dimensional one). From our point of
view, this small improvement is mainly due to the fact that two of the four
variables (x1, x4) result in probabilities equal to zero in the model based on a
one-dimensional distorted probability, and the same occurs for the two-dimensional
models.
For the "abalone" data file, the best results with a one-dimensional distorted
probability had a distance of 37.4931. This is achieved when four of the
variables (x2, x5, x6 and x7) have a probability equal to zero. The probabilities of
the variables are given in Table 3A. We have considered all partitions for
variables with non-null probability, and the best distance is given in Table 3B. In
such partitions, null variables have also been included in one of the sets. The
results show that in all cases, the best two-dimensional DP also assigns
probabilities equal to zero to such null variables.
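A distorted probability builds a fuzzy measure as a monotone function of an ordinary probability. The sketch below (illustrative; the distortion F is a hypothetical choice, not the fitted model of this section) uses the Table 3A probabilities and shows why zero-probability variables become null for the measure:

```python
# Sketch (illustrative, not the fitted model of this section): a
# one-dimensional distorted probability mu(S) = F(sum of p_i over S),
# built from the abalone probabilities of Table 3A with an assumed F.

p = {"x1": 0.018, "x2": 0.0, "x3": 0.369, "x4": 0.524,
     "x5": 0.0, "x6": 0.0, "x7": 0.0, "x8": 0.088}

def F(t):
    # Assumed (hypothetical) non-decreasing distortion with F(0)=0, F(1)=1.
    return t ** 2

def mu(S):
    return F(sum(p[x] for x in S))

# mu is a fuzzy measure: mu(empty set) = 0 and mu is monotone.
assert mu(()) == 0.0
assert mu(("x3",)) <= mu(("x3", "x4"))
# Variables with zero probability are null for mu, as observed in the text.
assert mu(("x3", "x2")) == mu(("x3",))
```

A two-dimensional distorted probability replaces F(p(S)) with F(p1(S ∩ A1), p2(S ∩ A2)) for a partition {A1, A2}, which is the model family fitted in Tables 3B-3D.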
Table 3A: Probability distribution for the variables in the abalone data file.

  x1: 0.018   x2: 0.0   x3: 0.369   x4: 0.524   x5: 0.0   x6: 0.0   x7: 0.0   x8: 0.088

Table 3B: Partitions considered with the abalone data file and minimum
distance achieved.

  Partition                               Minimum distance
  {x3, x4, x8}, {x1, x2, x5, x6, x7}      37.3569
  {x4, x8}, {x1, x2, x3, x5, x6, x7}      37.2745
  {x3, x8}, {x1, x2, x4, x5, x6, x7}      37.1682
  {x3, x4}, {x1, x2, x5, x6, x7, x8}      37.2751
  {x8}, {x1, x2, x3, x4, x5, x6, x7}      37.2866
  {x4}, {x1, x2, x3, x5, x6, x7, x8}      37.2634
  {x3}, {x1, x2, x4, x5, x6, x7, x8}      37.2810
Tables 3C and 3D give the probability distributions p1, p2 and the function F for
the best solution obtained. This corresponds to the partition defined by the sets
{x3, x8} and {x1, x2, x4, x5, x6, x7}. Its best distance was 37.1682. Table 3C
corresponds to the measure before the iterative process described in this section
and Table 3D corresponds to the measure after the iterative process. The results
show that the probability distributions p1 and p2 have changed, distributing part
of the probability. In particular, in the dimension {x1, x4} the former has lost
importance, which has been acquired by the latter. Similarly, the function F has
also been modified.
Table 3C: Initial probability distributions p1 and p2 for the partition defined by
{x3, x8} and {x1, x2, x4, x5, x6, x7}.

  p1
  p2: p2(x3) = 0.806, p2(x8) = 0.193

Table 3D: Final probability distributions p1 and p2 for the partition defined by
{x3, x8} and {x1, x2, x4, x5, x6, x7}.

  p1
  p2: p2(x3) = 0.981, p2(x8) = 0.018
Fuzzy measures and integrals [67] are important tools for comparing classical
or fuzzy sets with respect to their size. Usually fuzzy measures are set functions
defined on some algebra of sets that are monotone with respect to inclusion and
that assign zero to the empty set.
If I denotes the $\sigma$-algebra of the measurable subsets of A, generated as a
$\sigma$-ring I, and $f : A \to X$ is a function into a metric space X, then f is measurable if
$$f^{-1}(C) \in I \quad (\text{or } f^{-1}(O) \in I)$$
for closed (open) sets of X. For an integrable set $T \in I$ and a set $H \subseteq X$, there exists a negligible
set $N \subseteq T$ such that $f(T \setminus N) \subseteq X \setminus H$. Let
$\{f_n\} \subseteq M[A]$, $f : A \to X$, be measurable functions defined on $T \in I$ such
that $\{f_n\}$ converges to f almost everywhere on T; then there exists a sequence $\{T_m\}$
of sets of $T \in I$ such that
$\{f_n\}$ converges to f uniformly on each $T_m$
$(m = 1, 2, \ldots)$.
4.1
In mathematics a Borel set is any set in a metric space that can be formed from
open sets or from closed sets through the operations of countable union,
countable intersection and relative complement. Borel sets are named after
Emile Borel [7]. For a metric space X, the collection of all Borel sets on X
forms a $\sigma$-algebra; equivalently, it is generated by all open sets
or all closed sets [82]. Borel sets are important in measure theory, since any
measure defined on the open sets of a space, or on the closed sets of a space,
must also be defined on all Borel sets of that space.
We assume that X is a metric space, and that O is the class of all open
sets in X. The Borel $\sigma$-algebra B is the smallest $\sigma$-ring containing
O, and unless stated otherwise all the subsets are supposed to belong to
B. We shall denote by C the class of all closed sets.
A set function
$\mu : B \to [0, \infty]$ is said to be
(a) exhaustive, if $\mu(A_n) \to 0$ for every disjoint sequence $\{A_n\}$ of B;
(b) continuous from below, if
$$\lim_{n \to \infty} \mu(A \cap A_n) = \mu(A) \quad \forall A;$$
(c) such that $\mu(A) < \infty \Rightarrow \nu(A) < \infty$,
where $\nu$ is
defined on B.
For every $\varepsilon > 0$ there exists $\delta = \delta(\varepsilon) > 0$ such that
$$\mu(A \cup B) \leq \mu(A) + \varepsilon \quad \text{and} \quad \mu(A \setminus B) \geq \mu(A) - \varepsilon \quad \text{whenever } \mu(B) \leq \delta;$$
and
(i) null-additive, if $\mu(A \cup B) = \mu(A)$ for
every $A, B$ with
$\mu(B) = 0$. For
$A, B, A_n \in B$, $n = 1, 2, \ldots$, consider the fuzzy measure space
$(X, B, \mu)$. A set A with $\mu(A) > 0$
is
called
an
atom of $\mu$
if
$B \subseteq A$
implies
(1) $\mu(B) = 0$ or $\mu(A \setminus B) = 0$, which means
that all the mass of the atom is concentrated on a single point in it [110]. This
fact makes calculation of the fuzzy integral over an atom and a finite union of
disjoint atoms easy. Now we discuss some theorems concerning some
properties of fuzzy measures on metric spaces. If a fuzzy measure $\mu$ is finite,
then it is exhaustive; if $\mu$ is exhaustive, then it is ... .
4.1.1 Theorem: Let $\mu$ be a null-additive fuzzy measure. Then there is a unique closed set A
such
that
(1) $\mu(X \setminus A) = 0$;
moreover, if B is closed with
$\mu(B') = 0$, then
$A \subseteq B$, where $B' = X \setminus B$.
Proof: Let
$G_1, G_2, \ldots$ be open sets
such that
$$\bigcup_{n=1}^{\infty} G_n = \bigcup \{G : G \in H\}$$
and
$A = X \setminus \bigcup_{n=1}^{\infty} G_n$; we have
$$\mu\left( \bigcup_{n=1}^{\infty} G_n \right) = \lim \mu(G_n) = 0, \quad \text{i.e.} \quad \mu(A') = 0,$$
by the null-additivity of $\mu$. If B is closed with $\mu(B') = 0$, then
$X \setminus B \subseteq \bigcup G_n$, namely
$A \subseteq B$, since
$X \setminus B \in H$.
The uniqueness of A
is obvious; hence
this proves both assertions. Now, for any $x \in A'$ there exists an open set G containing x
such that
$\mu(G) = 0$,
and
$\mu(G)$ must be positive if
$x \in A$;
the set A is called
the
support (spectrum) of $\mu$,
with
$A \subseteq B$ closed in
$B \subseteq X$. We shall investigate a
smaller class of fuzzy measures on metric spaces, i.e. tight fuzzy measures.
Tight fuzzy measures have the property that they are determined by their values
taken on compact sets [81]. A fuzzy measure $\mu$ is tight if for
each set
$A \in B$ and each $\varepsilon > 0$ there exists a compact set
$K \subseteq A$
such
that
$$\mu(A \setminus K) < \varepsilon.$$
4.1.2 Theorem: A uniformly autocontinuous exhaustive fuzzy measure $\mu$
has separable support, and for any Borel set A and any $\varepsilon > 0$ there is a compact set
$K \subseteq A$ with $\mu(A \setminus K) < \varepsilon$.
Proof: Let $K_n$ be compact sets
on X such that
$$\mu(X \setminus K_n) < \frac{1}{n}, \quad n = 1, 2, \ldots,$$
and let K be a compact set
such that
$\mu(X \setminus K) < \varepsilon$. Cover X by balls $C_a(x_n)$ of radius a
and centre $x_n$:
$$\bigcup_{n=1}^{\infty} C_a(x_n) \supseteq X, \quad \text{i.e.} \quad \mu\left[ X \setminus \bigcup_{n=1}^{b} C_a(x_n) \right] \downarrow 0 \quad (b \to \infty),$$
so there is b with
$$\mu\left( X \setminus \bigcup_{n=1}^{b} C_a(x_n) \right) < \frac{\varepsilon}{2^{a+1}}. \quad (4a)$$
Taking $C_a = \bigcup_{n=1}^{b} C_a(x_n)$, let $\{C_{a_i}\}$ be the resulting family; then
$$\mu\left( X \setminus \bigcap_{i=1}^{\infty} C_{a_i} \right) = \mu\left( \bigcup_{i=1}^{\infty} (X \setminus C_{a_i}) \right) \leq \varepsilon. \quad (4b)$$
Let $K = \bigcap_{i=1}^{\infty} \overline{C}_{a_i}$.
Since each $C_{a_i}$ is totally bounded, K is compact. By the uniform
autocontinuity of $\mu$,
$$\mu(E \cup F) \leq \mu(E) + \varepsilon \quad \text{whenever} \quad \mu(F) \leq \delta,$$
and, taking $K' = K \cap C$ for a closed C with
$\mu(A \setminus C) < \varepsilon$, we obtain a compact
$K' \subseteq A$. Thus we have
$$\mu(A \setminus K') \leq \mu\left( (X \setminus K) \cup (A \setminus K') \right) \leq \mu(X \setminus K) + \frac{\varepsilon}{2} + \frac{\varepsilon}{2}.$$
4.1.3 Theorem: Let X be a separable metric space with the property that there
exists a complete separable metric space Y such that X is contained in Y as a
topological subset and X is a Borel subset of Y. Then every uniformly
autocontinuous fuzzy measure $\mu$
on $B_X$ is tight.
Proof: We define $\nu$ on the class $B_Y$
by setting
$$\nu(A) = \mu(A \cap X), \quad A \in B_Y.$$
Since
$X \in B_Y$ and $\nu$
is a tight measure by Theorem 4.1.2, for each $\varepsilon > 0$ there is a set
$K \subseteq X$, compact, with $\nu(X \setminus K) < \varepsilon$, and
$$\mu(X \setminus K) = \nu(X \setminus K) < \varepsilon;$$
so we may
assume that X is itself a complete separable metric space. For the rest part we
prove as in the above theorem.
Such a fuzzy measure $\mu$
is tight. Moreover, if
$$\mu(A) = \sup\{\mu(K) : K \subseteq A, K \in \mathcal{K}\} \quad \text{for each } A \in B,$$
then $\mu$ is regular. As an example, let
$X = (0, 1)$ and B its Borel sets, with
$$\mu(A) = \tan\left( m(2A) \right), \quad A \in B,$$
where m is the Lebesgue measure; $\mu$
is a fuzzy measure with
$\mu(X) = \infty$.
A fuzzy measure space is perfect if, for every real-valued measurable function f and every
$f(A) \in B$, there are Borel sets A1 and A2 on the real line such
that
$A_1 \subseteq A \subseteq A_2$
and
$$\mu\left( f^{-1}(A_2 \setminus A_1) \right) = 0 \quad [106].$$
Theorem: $(X, B, \mu)$ with $\mu$
a uniformly autocontinuous tight fuzzy measure
is a perfect fuzzy
measure space.
Proof: Let f be any real valued measurable function. It is sufficient to prove
that for $f^{-1}(A) \in B$ there are Borel sets
$A_1 \subseteq A$
and
$A \subseteq A_2$
such that
$\mu(f^{-1}(A \setminus A_1)) = 0$, with $A_2$ defined so that
$$\mu\left( f^{-1}(A_2 \setminus A) \right) = 0.$$
Take
$K_1 \subseteq K_2 \subseteq \ldots$, each $K_n$ compact, with
$\mu(X \setminus K_n) \to 0$, and
$f|_{K_n}$
(f restricted to $K_n$) continuous;
(2) also take
$C_1 \subseteq C_2 \subseteq \ldots \subseteq B$.
If we write
$Q_n = K_n \cap C_n$, then
$f|_{Q_n}$
is continuous and
$\mu(B \setminus C_n) \to 0$;
$Q_1 \subseteq Q_2 \subseteq \ldots \subseteq B$, each $Q_n$ is compact, and
$\mu(B \setminus Q_n) \to 0$
as
$n \to \infty$. If
$f|_{Q_n}$
is continuous, $B_n = f(Q_n)$ is a Borel set;
hence, with
$A_1 = \bigcup_{n=1}^{\infty} B_n$,
it follows that
$$f^{-1}(A_1) \supseteq \bigcup_{n=1}^{\infty} Q_n.$$
Clearly
$A_1 \subseteq A$,
and we
obtain
$$\mu\left( B \setminus \bigcup_{n=1}^{\infty} K_n \right) = \lim_{n \to \infty} \mu(B \setminus K_n) = 0,$$
so the claim follows.
For any $\varepsilon > 0$, there exists a closed set S and an open set T such that
$P(T \setminus S) < \varepsilon$. In this chapter we consider null-additive fuzzy
measures and prove Egoroff's theorem and Lusin's theorem for
fuzzy measures on a metric space. Egoroff's theorem and Lusin's theorem
in classical measure theory are important and useful for the discussion of
convergence and continuity of measurable functions. The Egoroff theorem for
a fuzzy measure space was proposed by [110], [106], but there the finiteness of
fuzzy measures was assumed. The Egoroff and Lusin theorems
hold for those fuzzy measures that are defined on metric spaces and supposed to
be exhaustive and autocontinuous from above. It will be proved that
exhaustivity and autocontinuity are sufficient for a fuzzy measure to have
the regularity and tightness which are enjoyed by classical measures. We define the
strong regularity of fuzzy measures and show our main result: null-additive
fuzzy measures possess strong regularity on complete separable metric
spaces. By using strong regularity we shall show a version of Egoroff's theorem
and Lusin's theorem for null additive fuzzy measures on complete separable
metric spaces.
A fuzzy measure $\mu$ defined on B is regular if for each $A \in B$,
$$\mu(A) = \sup\{\mu(S) : S \subseteq A, S \in C\} = \inf\{\mu(T) : T \supseteq A, T \in O\};$$
furthermore, if $\mu$ is continuous
from above on B, then $\mu$ is also strongly regular if for each $A \in B$ and each $\varepsilon > 0$ there exist
$K \in \mathcal{K}$ and $T \in O$
such that
$K \subseteq A \subseteq T$
and
$$\mu(T \setminus K) < \varepsilon.$$
If, for
any
$A \in B$,
$$\mu(A) = \sup\{\mu(K) : K \subseteq A, K \in \mathcal{K}\},$$
then $\mu$
is strongly regular.
Proof: To prove
$$\mu(A) = \sup\{\mu(S) : S \subseteq A, S \in C\} = \inf\{\mu(T) : T \supseteq A, T \in O\}$$
for
$A \in B$,
there are
$S \in C$
and
$T \in O$
satisfying
$S \subseteq A \subseteq T$. For a closed set G, the open sets
$$T_n = \left\{ x : d(x, G) < \frac{1}{n} \right\}$$
decrease to G, i.e.
$T_n \downarrow G$. Since
$\mu$ is exhaustive,
we have
$$\lim_{n \to \infty} \mu(T_n \setminus G) = 0.$$
Thus the claim holds for sets of C. Now let
$\{A_n\}$ be given
and
$\varepsilon > 0$; note that
$$\mu(A) - \mu(B) \leq \mu(A \setminus B).$$
Hence there exist two sequences of closed sets $\{S_n\}$ and of open sets $\{T_n\}$ such
that
$$S_n \subseteq A_n \subseteq T_n, \quad n = 1, 2, \ldots, \qquad \mu\left( \bigcup_{n=1}^{\infty} (T_n \setminus S_n) \right) \leq \varepsilon,$$
and, since
$$\mu\left( \bigcup_{n=1}^{\infty} S_n \setminus \bigcup_{n=1}^{m} S_n \right) \to 0 \quad (\text{as } m \to \infty),$$
there is $n_0 \geq 1$
such that
$$\mu\left( \bigcup_{n=1}^{\infty} S_n \setminus \bigcup_{n=1}^{n_0} S_n \right) \leq \varepsilon.$$
With $T = \bigcup_{n=1}^{\infty} T_n$ and $S = \bigcup_{n=1}^{n_0} S_n$,
$$\mu(T \setminus S) \leq \mu\left( \bigcup_{n=1}^{\infty} (T_n \setminus S_n) \cup \left( \bigcup_{n=1}^{\infty} S_n \setminus S \right) \right).$$
Thus the class of such sets is all of B. Now let
$A \in B$
and
we prove
$$\mu(A) = \inf\{\mu(T) : T \supseteq A, T \in O\}:$$
for each
$n \geq 1$, there exists
$T_n \in O$
such that
$A \subseteq T_n$
and
$$\mu(T_n \setminus A) < \frac{1}{n}.$$
Thus we have
$$\mu\left( \bigcap_{n=1}^{\infty} T_n \setminus A \right) = 0, \quad \text{hence} \quad \mu(A) = \mu\left( \bigcap_{n=1}^{\infty} T_n \right) = \lim_{n \to \infty} \mu\left( \bigcap_{i=1}^{n} T_i \right);$$
since $\bigcap_{i=1}^{n} T_i$
is also open and contains A, the equation above to be proved is obtained. The
other equation is proved similarly.
Let $\varepsilon > 0$ and let
$B_{n,m} \downarrow \emptyset$ $(m \to \infty)$, $n = 1, 2, \ldots$, be a double sequence
$\{B_{n,m}\}$
in B. Then there exist
$$m_1 < m_2 < \cdots \quad \text{such that} \quad \mu\left( \bigcup_{n=1}^{\infty} B_{n, m_n} \right) < \varepsilon.$$
Indeed, choose $m_1$ with
$$\mu(B_{1, m_1}) < \frac{\varepsilon}{2};$$
then, for $\varepsilon > 0$, using the continuity of $\mu$, choose
$m_2 > m_1$, and generally there
exist
$m_1, m_2, \ldots, m_i$
such that, for the double sequence
$\{B_{n,m}\}$,
$$\mu(B_{n, m_n}) < \frac{\varepsilon}{2^n};$$
we then have
$$\mu\left( \bigcup_{n=1}^{\infty} B_{n, m_n} \right) < \varepsilon.$$
Proof: Let
$A \in B$
and given $\varepsilon > 0$; since $\mu$ is null-additive, it
is strongly regular if it
is regular. Therefore, there exists a sequence $\{S_n\}$, $n = 1, 2, \ldots$, of closed sets and
a sequence $\{T_n\}$ of open sets with
$$n = 1, 2, \ldots, \quad S_n \subseteq A \subseteq T_n, \qquad \mu(T_n \setminus S_n) < \frac{1}{n},$$
where $\{S_n\}$ may be taken increasing and $\{T_n\}$ decreasing in n, so that
$\{T_n \setminus S_n\}$
is decreasing in n. Thus
$$\mu\left( \bigcap_{n=1}^{\infty} (T_n \setminus S_n) \right) \leq \mu(T_n \setminus S_n).$$
Let $F_1 = \bigcap_{n=1}^{\infty} (T_n \setminus S_n)$; then
$$\mu(F_1) \leq \mu(T_n \setminus S_n) < \frac{1}{n}, \quad n = 1, 2, \ldots, \quad \text{so} \quad \mu(F_1) = 0.$$
Take compact sets $\{K_n\}$ with
$$\mu(X \setminus K_n) < \frac{1}{n}, \quad n = 1, 2, \ldots,$$
where $\{X \setminus K_n\}$ is decreasing in n. Therefore, as $n \to \infty$,
$$\mu(X \setminus K_n) \to \mu\left( \bigcap_{n=1}^{\infty} (X \setminus K_n) \right).$$
Let $F_2 = \bigcap_{n=1}^{\infty} (X \setminus K_n)$; then $\mu(F_2) = 0$. The sequence
of sets
$$\left[ (X \setminus K_n) \cup (T_n \setminus S_n) \right] \downarrow (F_1 \cup F_2) \quad \text{as } n \to \infty,$$
and thus we have,
noting that
$\mu(F_1 \cup F_2) = 0$ by the null-additivity and continuity of $\mu$,
$$\lim_{n \to \infty} \mu\left( (X \setminus K_n) \cup (T_n \setminus S_n) \right) = 0,$$
so there is $n_0$
such that
$$\mu\left( (X \setminus K_{n_0}) \cup (T_{n_0} \setminus S_{n_0}) \right) < \varepsilon.$$
Let
$K = K_{n_0} \cap S_{n_0}$ and $T = T_{n_0}$; then K is compact,
T is an open set, and
$K \subseteq A \subseteq T$. Since
$$\mu(T \setminus K) \leq \mu\left( (X \setminus K_{n_0}) \cup (T_{n_0} \setminus S_{n_0}) \right) < \varepsilon,$$
$\mu$ is strongly regular.
We show that if
$f_n \to f$ almost everywhere on
$(X, B, \mu)$, then for every
$\varepsilon > 0$ there exists a closed set S
such that
$\mu(X \setminus S) < \varepsilon$ and $\{f_n\}$ converges to f
uniformly on S.
Proof: Note that
$$\mu(A) - \mu(B) \leq \mu(A \setminus B). \quad \text{Put}$$
$$E_{n,k} = \bigcap_{m \geq n} \left\{ x : |f_m(x) - f(x)| < \frac{1}{k} \right\}, \quad k = 1, 2, \ldots;$$
then $E_{n,k}$ is increasing in n for each fixed k. The set of all those x's, for which
$f_n \to f$,
is
$$\bigcap_{k=1}^{\infty} \bigcup_{n=1}^{\infty} E_{n,k}.$$
Since
$f_n \to f$ almost everywhere,
$$\mu\left( X \setminus \bigcup_{n=1}^{\infty} E_{n,k} \right) = 0, \quad \text{or equivalently} \quad E_{n,k} \uparrow X \text{ as } n \to \infty,$$
for any
$k \geq 1$. For the double sequence
$\{E_{n,k}\}$ and any
$\varepsilon > 0$, there exist
$\{E_{n_k, k} : n \geq 1, k \geq 1\}$
such that
$$\mu\left( \bigcup_{k=1}^{\infty} (X \setminus E_{n_k, k}) \right) \leq \varepsilon. \quad \text{Put } S' = \bigcap_{k=1}^{\infty} E_{n_k, k}; \text{ then } \mu(X \setminus S') \leq \varepsilon$$
and $\{f_n\}$ converges to f
uniformly on $S'$. Then take a closed subset S of $S'$
such that
$\mu(S' \setminus S) \leq \varepsilon$; we get
$\mu(X \setminus S) \leq 2\varepsilon$,
and
$\{f_n\}$ converges to f uniformly on S.
Similarly we can prove the following theorem: let the metric space X be
complete and separable, and $\mu$ a fuzzy measure on a finite fuzzy
measure space; if
$f_n \to f$ almost everywhere, then for every
$\varepsilon > 0$ there exists a compact set K
such that
$\mu(X \setminus K) < \varepsilon$
and
$\{f_n\}$
converges to f uniformly on K.
4.2.7 Theorem: Let a fuzzy measure $\mu$ be strongly regular and f measurable. Then for every
$\varepsilon > 0$ there exists a closed set
$S \in C$
with
$\mu(X \setminus S) \leq \varepsilon$
such that f is
continuous on S.
Proof: Note that
$$\mu(A) - \mu(B) \leq \mu(A \setminus B). \quad \text{To begin, let f be a simple function,}$$
$$f(x) = \sum_{i=1}^{n} a_i \chi_{A_i}(x),$$
where $a_i \neq a_j$, $A_i \cap A_j = \emptyset$ $(i \neq j)$, $A_i \in B$, $X = \bigcup_{i=1}^{n} A_i$.
By the regularity of $\mu$ there are closed sets $S_i \subseteq A_i$ with
$$\mu\left( \bigcup_{i=1}^{n} (A_i \setminus S_i) \right) \leq \varepsilon,$$
and, with $S = \bigcup_{i=1}^{n} S_i$,
$$\mu(X \setminus S) \leq \mu\left( \bigcup_{i=1}^{n} (A_i \setminus S_i) \right);$$
since the distance of two disjoint closed sets is greater than zero, it is obvious
that f is continuous on S.
For a general measurable f, take
$\varepsilon_1 > 0$
and
$\varepsilon_2 > 0$
with
$\varepsilon = \varepsilon_1 + \varepsilon_2$,
and closed sets $S_n$ such that
$$\mu\left( \bigcup_{n=1}^{\infty} (X \setminus S_n) \right) \leq \varepsilon_2;$$
then, for $S_0 = \bigcap_{n=1}^{\infty} S_n$,
$$\mu(X \setminus S_0) \leq \varepsilon_2.$$
Take $X_m \uparrow X$ with
$$\mu(X \setminus X_m) \leq \varepsilon_2, \quad m = 1, 2, \ldots;$$
then there
exists
$m_0$
such
that
$$\mu\left( X \setminus (S_0 \cap X_{m_0}) \right) \leq \varepsilon_1.$$
We have a subset F of
$S_0 \cap X_{m_0}$
with
$$\mu\left( (S_0 \cap X_{m_0}) \setminus F \right) \leq \varepsilon_2,$$
and there is a closed
subset S of F such that
$$\mu(F \setminus S) \leq \varepsilon_2.$$
Thus f is continuous on S, and then
$$\mu\left( (S_0 \cap X_{m_0}) \setminus S \right) \leq \varepsilon_1,$$
and we have
$$\mu(X \setminus S) \leq \mu\left( \left( X \setminus (S_0 \cap X_{m_0}) \right) \cup \left( (S_0 \cap X_{m_0}) \setminus S \right) \right).$$
0
|f |f
,
2
|f |+ f
+=
f
2
f
=
then
+
f
and
+f
.
f =f
f
and f , we have two closed subsets
147
+
S
on
and
of X such that
S and S , and
+
f
and
+
X S
X S .
+ S
Thus f is continuous on the closed set
and
S =S
+
X S
( XS ) =
Let B be the $\sigma$-algebra of Borel sets, O the class of open sets and C the class of
closed sets. A fuzzy measure $\mu$ is outer regular if for each
$A \in B$ and each $\varepsilon > 0$ there exists $T \in O$
such that
$A \subseteq T$ and
$$\mu(T \setminus A) < \varepsilon.$$
A fuzzy measure $\mu$ is inner regular if for each
$A \in B$
and each $\varepsilon > 0$ there exists
$S \in C$
such that
$S \subseteq A$ and
$$\mu(A \setminus S) < \varepsilon.$$
A fuzzy measure $\mu$ is regular if for each
$A \in B$
and each $\varepsilon > 0$ there exist
$S \in C$
and
$T \in O$
such that
$S \subseteq A \subseteq T$
and
$$\mu(T \setminus S) < \varepsilon.$$
In the following we present some properties of the inner regularity and outer
regularity of fuzzy measures.
4.3.2 Theorem: If $\mu$ is autocontinuous:
(1) For $A = \bigcup_{i=1}^{n} A_i$ and any $\varepsilon > 0$ there are closed sets
$S_i$
in C and
$\delta > 0$
such that
$S_i \subseteq A_i$
and
$\mu(A_i \setminus S_i) < \delta$. With $S = \bigcup_{i=1}^{n} S_i$,
$$A \setminus S = \bigcup_{i=1}^{n} A_i \setminus \bigcup_{j=1}^{n} S_j \subseteq \bigcup_{i=1}^{n} (A_i \setminus S_i),$$
and then, since $\mu$ is autocontinuous, we have
$$\mu(A \setminus S) \leq \mu\left( \bigcup_{i=1}^{n} (A_i \setminus S_i) \right) < \varepsilon.$$
(2) Let $T = \bigcup_{i=1}^{n} T_i$ with
$\mu(T_i \setminus B_i) < \delta$ for
$T_i$
in O and $B_i \subseteq T_i$; then
$$T \setminus B = \bigcup_{i=1}^{n} T_i \setminus \bigcup_{j=1}^{n} B_j \subseteq \bigcup_{i=1}^{n} (T_i \setminus B_i),$$
and, since $\mu$ is autocontinuous, we have
$$\mu(T \setminus B) \leq \mu\left( \bigcup_{i=1}^{n} (T_i \setminus B_i) \right) < \varepsilon.$$
(3) Let, for
any
$T_i$ in O and any
$\varepsilon > 0$, there be
$\delta > 0$ and
$B_i \subseteq T_i$
such that
$\mu(T_i \setminus B_i) < \delta$; then
$$T \setminus B = \bigcup_{i=1}^{n} T_i \setminus \bigcup_{j=1}^{n} B_j \subseteq \bigcup_{i=1}^{n} (T_i \setminus B_i),$$
and, since $\mu$ is autocontinuous, we have
$$\mu(T \setminus B) \leq \mu\left( \bigcup_{i=1}^{n} (T_i \setminus B_i) \right) < \varepsilon.$$
4.3.3 Theorem: If $\mu$ is autocontinuous:
(1) For any open
set
$T_i$ in O and any
$\varepsilon > 0$, take
$B_i \subseteq T_i$
with
$\mu(T_i \setminus B_i) < \delta$; then, with $T = \bigcup_{i=1}^{\infty} T_i$ and $B = \bigcup_{i=1}^{\infty} B_i$,
$$T \setminus B = \bigcup_{i=1}^{\infty} T_i \setminus \bigcup_{j=1}^{\infty} B_j \subseteq \bigcup_{i=1}^{\infty} (T_i \setminus B_i),$$
and since $\mu$ is autocontinuous,
$$\mu(T \setminus B) \leq \mu\left( \bigcup_{i=1}^{\infty} (T_i \setminus B_i) \right) < \varepsilon.$$
(2) For any
$\varepsilon > 0$ there are closed sets
$S_i$
in C and
$\delta > 0$
such that
$S_i \subseteq A_i$
and
$\mu(A_i \setminus S_i) < \delta$; then, with $A = \bigcup_{i=1}^{\infty} A_i$ and $S = \bigcup_{i=1}^{\infty} S_i$,
$$A \setminus S = \bigcup_{i=1}^{\infty} A_i \setminus \bigcup_{j=1}^{\infty} S_j \subseteq \bigcup_{i=1}^{\infty} (A_i \setminus S_i),$$
and since $\mu$ is autocontinuous,
$$\mu(A \setminus S) \leq \mu\left( \bigcup_{i=1}^{\infty} (A_i \setminus S_i) \right) < \varepsilon.$$
4.3.4 Theorem: If S
is closed and
$S \setminus T \in B$, then
$S \cap T \in C$; thus, if $\mu$
is outer regular, there is an open H with
$(S \setminus T) \subseteq H$
and
$$\mu\left( H \setminus (S \setminus T) \right) < \varepsilon.$$
Since
$$T = \left[ S \setminus (S \setminus T) \right] \cup (S \cap H) \in C,$$
then
$$\mu\left( T \setminus (S \cap H) \right) = \mu(T \setminus H) \leq \mu\left( H \setminus (S \setminus T) \right) < \varepsilon.$$
Similarly, for C, let G
be a set in
C, such
that
$G \subseteq (T \setminus S)$
and
$$\mu\left( (T \setminus S) \setminus G \right) < \varepsilon.$$
Since
$$\mu\left( (T \setminus G) \setminus S \right) = \mu\left( (T \setminus S) \setminus G \right) < \varepsilon,$$
the claim follows.
The other properties of the inner/outer regularity of fuzzy measures, which can
be proved easily with the help of the above theorems, are:
(i) If $\mu$ is continuous from below and strongly order continuous and, for any
$A \in B$,
$$\mu(A) = \sup\{\mu(S) : S \subseteq A, S \in C\},$$
then $\mu$
is inner regular.
(ii) If $\mu$ is continuous from above and, for any
$A \in B$,
$$\mu(A) = \inf\{\mu(T) : A \subseteq T, T \in O\},$$
then $\mu$
is outer regular.
156
atomic fuzzy measures. In this case the uniform auto-continuity can be replaced
with the null-additive.
Let X be a non empty set,
functions are assumed to be monotonic and equal to zero on the empty set.
5.1.1 Definition: A set function μ : B → [0, ∞] is said to be:
(a) exhaustive if lim_{n→∞} μ(A_n) = 0 for every disjoint sequence {A_n}, n = 1, 2, …;
(b) order continuous if lim μ(A_n) = 0 whenever A_n ↓ ∅;
(c) strongly order continuous if lim μ(A_n) = 0 whenever A_n ↓ A and μ(A) = 0;
(d) null-additive if μ(A ∪ B) = μ(A) whenever μ(B) = 0;
(e) uniformly autocontinuous if for any ε > 0 there exists δ = δ(ε) > 0 such that
μ(A ∪ B) ≤ μ(A) + ε and μ(A − B) ≥ μ(A) − ε whenever μ(B) ≤ δ;
(f) autocontinuous from above if lim μ(A ∪ B_n) = μ(A) whenever lim μ(B_n) = 0;
(g) autocontinuous from below if lim μ(A − B_n) = μ(A) whenever lim μ(B_n) = 0;
(h) autocontinuous if it is autocontinuous both from above and from below;
(j) absolutely continuous with respect to a set function ν if for any ε > 0 there exists δ > 0 such that
|μ(A) − μ(B)| < ε whenever ν(A △ B) < δ;
the absolute continuity of μ with respect to ν is usually denoted by μ ≪ ν;
(k) weakly absolutely continuous with respect to ν if μ(A) = 0 whenever ν(A) = 0;
(m) sub-additive if μ(A ∪ B) ≤ μ(A) + μ(B) for any A, B; and
(n) σ-sub-additive if
μ(⋃_{n=1}^∞ A_n) ≤ Σ_{n=1}^∞ μ(A_n)
for every sequence {A_n}.
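As a concrete example of a set function with several of the properties above, consider a possibility measure μ(A) = max_{x∈A} π(x) on a finite X (a hypothetical illustration, not from the thesis). The sketch below verifies that it is monotone, vanishes on the empty set, and satisfies null-additivity (d) and sub-additivity (m) for every pair of subsets.

```python
# A possibility measure mu(A) = max of the density pi over A, checked against
# items of Definition 5.1.1 (hypothetical finite example).
from itertools import combinations

X = [0, 1, 2, 3]
pi = {0: 0.2, 1: 0.0, 2: 1.0, 3: 0.5}   # possibility density; note pi[1] = 0

def mu(A):
    return max((pi[x] for x in A), default=0.0)

subsets = [frozenset(c) for r in range(len(X) + 1) for c in combinations(X, r)]

assert mu(frozenset()) == 0.0                      # vanishes on the empty set
for A in subsets:
    for B in subsets:
        if A <= B:
            assert mu(A) <= mu(B)                  # monotone
        if mu(B) == 0.0:
            assert mu(A | B) == mu(A)              # (d) null-additive
        assert mu(A | B) <= mu(A) + mu(B) + 1e-12  # (m) sub-additive
print("(d) and (m) hold for all", len(subsets), "subsets")
```

Null-additivity is immediate here: μ(B) = 0 forces π to vanish on B, so the maximum over A ∪ B equals the maximum over A.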
5.1.2 Definition: Throughout, A, B and A_n (n = 1, 2, …) denote sets in B. A set function μ is said to have property (S) if, whenever lim μ(A_n) = 0, there exists a subsequence {A_{n_i}} of {A_n}, n ≥ 1, i ≥ 1, such that
μ(⋂_{n=1}^∞ ⋃_{i=n}^∞ A_{n_i}) = 0.
5.1.3 Definition: A set function μ is said to have the pseudometric generating property (p.g.p.) if, for sequences {A_n} and {B_n},
lim [μ(A_n) ∨ μ(B_n)] = 0
implies
lim μ(A_n ∪ B_n) = 0.
If μ is uniformly autocontinuous, then it has the pseudometric generating property.

A fuzzy measure μ is said to be finite if μ(A) < ∞ for any A, and σ-finite if there exists a sequence {A_n}, n ≥ 1, such that μ(A_n) < ∞ for all n ≥ 1 and X = ⋃_{n=1}^∞ A_n.
The following remarks are immediate:
(1) μ is autocontinuous if and only if it is both autocontinuous from above and autocontinuous from below.
(2) Let μ be null-additive. Then, for any A and B, μ(B) = 0 implies μ(A ∪ B) = μ(A).
(3) Let μ be null-additive and exhaustive. Then, for any A, there exists H such that μ(A − H) = 0.
Now we investigate the autocontinuity of fuzzy measures by using the pseudometric generating property and the absolute continuity of set functions, as shown in [109].
5.1.4 Theorem: If μ is autocontinuous and has property (S), then μ has the pseudometric generating property.

Proof: Let {A_n} and {B_n} be sequences in B with lim [μ(A_n) ∨ μ(B_n)] = 0. By property (S) there exists a subsequence {A_{n_i}} of {A_n} such that
μ(⋂_{n=1}^∞ ⋃_{i=n+1}^∞ A_{n_i}) = 0.
Since ⋃_{i=n+k}^∞ A_{n_i} decreases as n, k ≥ 1 grow, for any σ > 0 we can find n_0 such that
μ(⋃_{i=n_0+1}^∞ A_{n_i}) ≤ σ.
Hence
lim_i μ(A_{n_i} ∪ B_{n_i}) ≤ lim_i μ((⋃_{j=n_0+1}^∞ A_{n_j}) ∪ B_{n_i}) ≤ σ
by the autocontinuity of μ, and, σ > 0 being arbitrary, lim_i μ(A_{n_i} ∪ B_{n_i}) = 0. Thus every pair of subsequences of {A_n} and {B_n} contains further subsequences {A_{n_i}} and {B_{n_i}} such that
lim_i μ(A_{n_i} ∪ B_{n_i}) = 0,
so that lim μ(A_n ∪ B_n) = 0, and μ has the pseudometric generating property.
5.1.5 Theorem: Let μ be null-additive. Then μ is autocontinuous if and only if it is monotonically continuous and has the pseudometric generating property.

Proof: Let μ be autocontinuous and let μ(A) < ∞. Put
ν(B) = μ(A ∪ B) − μ(A) for every B.
Then ν(B) = 0 whenever μ(B) = 0, since the null-additivity of μ gives μ(A ∪ B) = μ(A) whenever μ(B) = 0. Since μ is monotonically continuous, for any ε > 0 there exist δ₁ > 0 and δ₂ > 0 such that
|ν(A) − ν(B)| < δ₂ implies μ(A △ B) < δ₁,
and, since μ is autocontinuous, there exists δ > 0 such that
|μ(A) − μ(B)| < δ implies μ(A △ B) < ε.
Hence μ has the pseudometric generating property, and the 'only if' part is valid.

Conversely, let μ be monotonically continuous and have the pseudometric generating property, and let
H = ⋃_{n=1}^∞ H_n, μ(H_n) < ∞, n ≥ 1,
so that
μ(A) = μ(H ∩ A) = μ(⋃_{n=1}^∞ (H_n ∩ A)) for every A.
Define
ν(A) = Σ_{n=1}^∞ 2^{−n} μ(H_n ∩ A) / (1 + μ(H_n)).
Then, by the definition of ν, we know that ν(A) = 0 if and only if μ(A) = 0, so that μ is weakly absolutely continuous with respect to ν; combined with the pseudometric generating property, this yields the autocontinuity of μ. The proof is now complete.
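For maxitive set functions the relations above become transparent: if μ(A ∪ B) = μ(A) ∨ μ(B), then uniform autocontinuity holds with δ = ε, and the pseudometric generating property follows at once. The sketch below (a hypothetical example, not from the thesis) checks both on all pairs of subsets of a small X.

```python
# For a maxitive mu (mu(A ∪ B) = mu(A) ∨ mu(B)):
#   mu(B) <= delta         ⟹  mu(A ∪ B) <= mu(A) + epsilon  (delta = epsilon)
#   mu(A) ∨ mu(B) < delta  ⟹  mu(A ∪ B) < epsilon           (p.g.p.)
from itertools import combinations

X = list(range(6))
pi = {x: x / 5.0 for x in X}

def mu(A):
    return max((pi[x] for x in A), default=0.0)

subsets = [frozenset(c) for r in range(len(X) + 1) for c in combinations(X, r)]
eps = 0.3
delta = eps   # for maxitive set functions delta = epsilon suffices

for A in subsets:
    for B in subsets:
        if mu(B) <= delta:
            assert mu(A | B) <= mu(A) + eps + 1e-12   # uniform autocontinuity
        if max(mu(A), mu(B)) < delta:
            assert mu(A | B) < eps                    # p.g.p.
print("checked", len(subsets) ** 2, "pairs")
```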
Theorem 5.4: If μ is uniformly autocontinuous and σ-finite, put
ν(B) = sup{μ(A ∪ B) − μ(A) : A ∈ B}.
Then: (a) ν(B) = 0 if and only if μ(B) = 0; (b) ν is finite; and (c) ν is sub-additive.

Proof: (a) is immediate from the null-additivity implied by the uniform autocontinuity of μ.

(b): Taking ε = 1, by the uniform autocontinuity of μ there exists δ > 0 such that, for any A,
μ(A ∪ B) ≤ μ(A) + 1 whenever μ(B) ≤ δ.
By the σ-finiteness of μ there exists H = ⋃_{n=1}^∞ A_n with μ(A_n) < ∞, and for each A there exists n₀ > 1 such that μ(A − A_{n₀}) < δ. Therefore we have ν(B) < ∞, and ν is finite.

(c):
ν(B₁ ∪ B₂) = sup{μ(A ∪ (B₁ ∪ B₂)) − μ(A) : A ∈ B}
= sup{μ((A ∪ B₁) ∪ B₂) − μ(A ∪ B₁) + μ(A ∪ B₁) − μ(A) : A ∈ B}
≤ sup{μ((A ∪ B₁) ∪ B₂) − μ(A ∪ B₁) : A ∈ B} + sup{μ(A ∪ B₁) − μ(A) : A ∈ B}
≤ ν(B₁) + ν(B₂),
and the theorem is proved.
5.2 Completion of fuzzy measure spaces

Let (X, B, μ) be a fuzzy measure space. We say that a set A is μ-negligible if there exist B ∈ B with A ⊂ B and μ(B) = 0.

5.2.1 Definition: A fuzzy measure space (X, B, μ) is said to be complete if every μ-negligible set belongs to B.
Let (X, B, μ) be a fuzzy measure space. The completion of B under μ is the collection B̄ of all sets B ⊂ X for which there exist sets S and T in B such that S ⊂ B ⊂ T and μ(T − S) = 0, i.e.
B̄ = {B ⊂ X : ∃ S, T ∈ B s.t. S ⊂ B ⊂ T, μ(T − S) = 0}.
Define μ̄ : B̄ → [0, ∞] as follows:
μ̄(B) = μ(S) (= μ(T)), where S, T ∈ B s.t. S ⊂ B ⊂ T, μ(T − S) = 0;
μ̄ is often called the completion of μ. Since B ⊂ B̄, the sets in B are μ̄-measurable. Let μ be null-additive; then
μ(T) = μ(S ∪ (T − S)) = μ(S),
so the value μ̄(B) does not depend on the choice of S and T. Furthermore, if C ∈ B and C ⊂ B, then μ(C) ≤ μ(T) = μ(S). Hence
μ̄(B) = sup{μ(C) : C ∈ B, C ⊂ B},
and it follows immediately that the definition of μ̄ by means of μ(S) and μ(T) is reasonable. Moreover, if μ is null-additive, then μ̄ is null-additive, and if μ is uniformly autocontinuous, then μ̄ is uniformly autocontinuous; this is made precise in the following theorem.
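The completion construction can be carried out by brute force on a finite example (hypothetical, not from the thesis): on X = {0, 1, 2, 3} take B generated by {0, 1} and {2, 3}, with μ({2, 3}) = 0; every set squeezed between some S, T ∈ B with μ(T − S) = 0 enters B̄, so all subsets of the null set {2, 3}, and their unions with {0, 1}, are added.

```python
from itertools import combinations

X = frozenset({0, 1, 2, 3})
B = [frozenset(), frozenset({0, 1}), frozenset({2, 3}), X]
mu = {frozenset(): 0.0, frozenset({0, 1}): 1.0, frozenset({2, 3}): 0.0, X: 1.0}

subsets = [frozenset(c) for r in range(5) for c in combinations(sorted(X), r)]

# B_bar = {A ⊆ X : there are S, T in B with S ⊆ A ⊆ T and mu(T−S) = 0},
# and mu_bar(A) = mu(S) (= mu(T)) for any such pair.
B_bar, mu_bar = [], {}
for A in subsets:
    for S in B:
        for T in B:
            if S <= A <= T and mu[T - S] == 0.0:
                B_bar.append(A)
                mu_bar[A] = mu[S]
                break
        else:
            continue
        break

assert len(B_bar) == 8            # B had 4 sets; the completion has 8
assert mu_bar[frozenset({2})] == 0.0
assert mu_bar[frozenset({0, 1, 2})] == 1.0
print(sorted(tuple(sorted(A)) for A in B_bar))
```

Note that B̄ is again closed under complementation; for instance the complement of {2} is {0, 1, 3}, which is squeezed by the pair ({0, 1}, X).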
5.2.2 Theorem: Let (X, B, μ) be a fuzzy measure space with μ null-additive. Then B̄ is a σ-algebra and μ̄ is a fuzzy measure on B̄.

Proof: First, B̄ is a σ-algebra. Indeed, B ⊂ B̄ (take S = T = B), so X ∈ B̄. If A ∈ B̄, where A ⊂ X, choose L, M ∈ B with L ⊂ A ⊂ M and μ(M − L) = 0; then B̄ is closed under complementation, since M^c ⊂ A^c ⊂ L^c and
μ(L^c − M^c) = μ(L^c ∩ M) = μ(M − L) = 0.
If {B_n} is a sequence of sets in B̄, choose L_n, M_n in B such that
L_n ⊂ B_n ⊂ M_n and μ(M_n − L_n) = 0.
Then ⋃_{n=1}^∞ L_n ⊂ ⋃_{n=1}^∞ B_n ⊂ ⋃_{n=1}^∞ M_n and, by the null-additivity of μ,
μ(⋃_{n=1}^∞ M_n − ⋃_{n=1}^∞ L_n) ≤ μ(⋃_{n=1}^∞ (M_n − L_n)) = 0;
thus ⋃_{n=1}^∞ B_n ∈ B̄, and B̄ is a σ-algebra.

Next, μ̄ is an extension of μ: μ̄(B) = μ(B) for every B ∈ B. It is clear that μ̄ satisfies μ̄(∅) = 0, and that B₁ ⊂ B₂ implies μ̄(B₁) ≤ μ̄(B₂) for B₁, B₂ in B̄.

Now let {B_n} be an increasing sequence of sets in B̄, and let L_n, M_n in B satisfy
L_n ⊂ B_n ⊂ M_n and μ(M_n − L_n) = 0.
Put G_n = ⋃_{j=1}^n L_j and H_n = ⋃_{j=1}^n M_j; then G_n ⊂ B_n ⊂ H_n and
μ(H_n − G_n) = μ(G_n^c − H_n^c) ≤ μ(⋃_{j=1}^n (L_j^c − M_j^c)) = μ(⋃_{j=1}^n (M_j − L_j)) = 0
by the null-additivity of μ. Since
μ(⋃_{n=1}^∞ H_n − ⋃_{n=1}^∞ G_n) ≤ μ(⋃_{n=1}^∞ (H_n − G_n)) = 0,
we have
μ̄(⋃_{n=1}^∞ B_n) = μ(⋃_{n=1}^∞ G_n) = lim_n μ(G_n) = lim_n μ̄(B_n).
Hence μ̄ is continuous from below. If {A_n} is a decreasing sequence of sets in B̄ with μ̄(A₁) < ∞, it is similar to prove that
μ̄(⋂_{n=1}^∞ A_n) = lim_n μ̄(A_n).
Consequently, μ̄ is a fuzzy measure on B̄. The completeness of (X, B̄, μ̄) is shown in the next theorem.
5.2.3 Theorem: If μ is null-additive, then (X, B̄, μ̄) is a complete null-additive fuzzy measure space.

Proof: Let A be a set with A ⊂ B, B ∈ B̄ and μ̄(B) = 0. Choosing S, T ∈ B with S ⊂ B ⊂ T and μ(T − S) = 0, we have μ(T) = μ̄(B) = 0, so that ∅ ⊂ A ⊂ T with μ(T − ∅) = 0; hence A ∈ B̄, and (X, B̄, μ̄) is complete.

To prove that μ̄ is null-additive, let A, B ∈ B̄ with μ̄(A) = 0, and let the sets L_n, M_n (n = 1, 2) in B satisfy L₁ ⊂ B ⊂ M₁, L₂ ⊂ A ⊂ M₂ and μ(M_n − L_n) = 0, so that μ(L₂) = μ̄(A) = 0. Then
(L₁ ∪ L₂) ⊂ (B ∪ A) ⊂ (M₁ ∪ M₂)
and
μ((M₁ ∪ M₂) − (L₁ ∪ L₂)) ≤ μ(⋃_{n=1}^2 (M_n − L_n)) = 0.
We have, by the null-additivity of μ,
μ̄(B ∪ A) = μ(L₁ ∪ L₂) = μ(L₁) = μ̄(B);
hence μ̄ is null-additive. Hence (X, B̄, μ̄) is a complete null-additive fuzzy measure space.
5.2.4 Theorem: If μ is converse-null-additive on B, then μ̄ is also converse-null-additive on B̄.

Proof: Let L and M be sets in B̄ with L ⊂ M and μ̄(L) = μ̄(M), and let the sets L_n, M_n (n = 1, 2) in B satisfy
L₁ ⊂ L ⊂ L₂, μ(L₂ − L₁) = 0, and M₁ ⊂ M ⊂ M₂, μ(M₂ − M₁) = 0.
Since
μ(L₁) = μ̄(L) = μ̄(M) = μ(M₂) and (M − L) ⊂ (M₂ ∩ L₁^c),
the converse-null-additivity of μ gives
μ̄(M − L) ≤ μ(M₂ ∩ L₁^c) = 0.
Hence μ̄ is converse-null-additive.
5.2.5 Theorem: If μ is auto-continuous on B, then μ̄ is auto-continuous on B̄.

Proof: Let B and A_n be sets in B̄ with μ̄(A_n) → 0 (as n → ∞). Let L, M and L_n, M_n be sets in B which satisfy
L ⊂ B ⊂ M, μ(M − L) = 0, and L_n ⊂ A_n ⊂ M_n, μ(M_n − L_n) = 0.
Then
(L ∪ L_n) ⊂ (B ∪ A_n) ⊂ (M ∪ M_n)
and
(M ∪ M_n) − (L ∪ L_n) ⊂ (M − L) ∪ (M_n − L_n).
Since μ(L_n) = μ̄(A_n) → 0, by the auto-continuity of μ we have
lim_n μ̄(B ∪ A_n) = lim_n μ(L ∪ L_n) = μ(L) = μ̄(B).
Hence μ̄ is auto-continuous on B̄.
5.2.6 Theorem: If (X, B, μ) is a complete fuzzy measure space, then B̄ = B and μ̄ = μ.

Proof: Let B ∈ B̄, and let L, M ∈ B be such that L ⊂ B ⊂ M and μ(M − L) = 0. Since (B − L) ⊂ (M − L), we have (B − L) ∈ B by the completeness of (X, B, μ). Hence B = [L ∪ (B − L)] ∈ B. Thus B̄ ⊂ B, and so B̄ = B; μ̄ = μ is proved likewise.
5.3 Convergence of sequences of measurable multifunctions

Let F[X] denote the class of measurable closed-valued multifunctions from X into R^p. Let {f_n, f} ⊂ F[X], B ∈ B and n ≥ 1.

(1) We say {f_n} converges to f almost everywhere on B, denoted f_n → f a.e., if there exists A ⊂ B, A ∈ B, such that μ(A) = 0 and, for every ω ∈ (B − A),
lim_n f_n(ω) = f(ω).

(2) We say {f_n} converges to f pseudo-almost everywhere on B, denoted f_n → f p.a.e., if there exists A ⊂ B, A ∈ B, such that
μ(B) = μ(B − A)
and, for every ω ∈ (B − A),
lim_n f_n(ω) = f(ω).

(3) We say f_n → f pseudo-almost everywhere in B if, for every C ⊂ B, C ∈ B, f_n → f p.a.e. on C.

(4) If μ is null-additive, then f_n → f a.e. implies that, for every x ∈ R^p, d(x, f_n) converges to d(x, f) almost everywhere, i.e.
f_n → f a.e. ⟹ d(x, f_n) → d(x, f) a.e. for every x ∈ R^p.

(5) If μ is pseudo-null-additive and μ(B) < ∞, then
f_n → f p.a.e. ⟹ d(x, f_n) → d(x, f) p.a.e. for every x ∈ R^p.
5.3.1 Definition: Let {f_n, f} ⊂ F[X], B ∈ B and n ≥ 1. We say {f_n} converges in fuzzy measure μ to f on B, denoted f_n →μ f, if, for every ε > 0 and every compact subset K of R^p,
lim_n μ[B(σ_n^{-1}(K))] = 0;
we say f_n →p.μ f on B if, for every ε > 0 and every compact subset K of R^p,
lim_n μ[B − σ_n^{-1}(K)] = μ(B).
If, for every C ⊂ B, C ∈ B, f_n →p.μ f on C, then we say {f_n} converges pseudo-in-fuzzy-measure to f in B.
Here
σ_n^{-1}(K) = {ω ∈ X : F_n(ω) ∩ K ≠ ∅}, B(σ_n^{-1}(K)) = {ω ∈ B : F_n(ω) ∩ K ≠ ∅},
and where
F_n = (f_n − f_ε) ∪ (f − (f_n)_ε), f_ε(ω) = {x ∈ R^p : d(x, f(ω)) < ε} and (f_n)_ε(ω) = {x ∈ R^p : d(x, f_n(ω)) < ε}.
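For ordinary real-valued functions the definition above reduces to lim_n μ({ω ∈ B : |f_n(ω) − f(ω)| ≥ ε}) = 0 for every ε > 0. The toy computation below (hypothetical, not from the thesis) tracks this quantity for a pointwise-convergent sequence under a distorted counting measure.

```python
X = list(range(10))

def mu(A):
    # a simple distorted counting measure: monotone and zero on the empty set
    return (len(A) / len(X)) ** 2

def f(w):
    return float(w)

def f_n(n, w):
    return w + 1.0 / (n * (w + 1))   # f_n -> f pointwise as n -> infinity

eps = 0.01

def bad(n):
    # the set {w in X : |f_n(w) - f(w)| >= eps}
    return {w for w in X if abs(f_n(n, w) - f(w)) >= eps}

values = [mu(bad(n)) for n in (1, 10, 100, 1000)]
assert values == sorted(values, reverse=True) and values[-1] == 0.0
print(values)
```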
5.3.2 Definition: Let {f_n, f} ⊂ F[X], B ∈ B and n ≥ 1. We say {f_n} converges to f almost uniformly on B, denoted f_n → f a.u., if there exists a sequence {B_m} ⊂ B with
lim_m μ(B_m) = 0
such that {f_n} converges to f uniformly on (B − B_m) for each m = 1, 2, …. We say {f_n} converges to f pseudo-almost uniformly on B, denoted f_n → f p.a.u., if there exists a sequence {B_m} ⊂ B with
lim_m μ(B − B_m) = μ(B)
such that {f_n} converges to f uniformly on (B − B_m), m = 1, 2, …. If, for every C ⊂ B, C ∈ B, f_n → f p.a.u. on C, then we say {f_n} converges pseudo-almost uniformly to f in B.

5.3.3 Theorem: Let {f_n, f} ⊂ F[X], B ∈ B and n ≥ 1.
(a) If μ(B) < ∞, then f_n → f a.e. on B implies f_n →μ f on B;
(b) f_n → f p.a.e. on B implies f_n →p.μ f on B;
(c) if μ is null-additive, then f_n → f a.e. on B implies f_n → f p.a.e. on B;
(d) if μ(B) < ∞, then f_n → f p.a.e. on B implies f_n → f a.e. on B.

Proof: Parts (c) and (d) follow from the definitions of the respective types of convergence. Now we only prove (b); (a) is similar to prove. Since f_n → f p.a.e. on B, there exists A ⊂ B, A ∈ B, such that μ(B − A) = μ(B) and lim_n f_n(ω) = f(ω) for every ω ∈ (B − A). Let ω ∈ (B − A); for any ε > 0 and any compact subset K of R^p there exists n(ω) such that F_n(ω) ∩ K = ∅ whenever n ≥ n(ω). Put
D_n(ε, K) = {ω ∈ (B − A) : F_m(ω) ∩ K = ∅ for all m ≥ n}.
Then D_n(ε, K) ⊂ (B − A) and
D_n(ε, K) ⊂ [B − σ_n^{-1}(K)], n = 1, 2, …,
so we have
μ(D_n) ≤ μ[B − σ_n^{-1}(K)] ≤ μ(B).
Since D_n(ε, K) ↑ (B − A) and μ is continuous from below, lim_n μ(D_n) = μ(B − A) = μ(B). Therefore we have
lim_n μ[B − σ_n^{-1}(K)] = μ(B).
5.3.4 Theorem: Let {f_n, f} ⊂ F[X], B ∈ B and n ≥ 1.
(a) If μ is auto-continuous from above and f_n →μ f on B, then there exists some subsequence {f_{n_i}} of {f_n} such that f_{n_i} → f a.e. on B.
(b) If, in addition, μ is null-additive, then f_{n_i} → f p.a.e. in B.
(c) If μ(B) < ∞ and f_n →p.μ f on B, then there exists some subsequence {f_{n_i}} of {f_n} such that f_{n_i} → f a.e. on B.
(d) If μ(B) < ∞ and f_n →p.μ f on B, then there exists some subsequence {f_{n_i}} of {f_n} such that f_{n_i} → f a.e. on B and f_{n_i} → f p.a.e. in B.

Proof: Let R^p be presented as
R^p = ⋃_{l=1}^∞ T_l,
where the T_l are compact and T_l ⊂ T_{l+1}, l = 1, 2, …. Since f_n →μ f on B, we have
lim_n μ[B(σ_n^{-1}(T_l))] = 0, l = 1, 2, …,
so there exist n_l, l = 1, 2, …, with n_l < n_{l+1}, such that
μ[B(σ_{n_l}^{-1}(T_l))] < 2^{-l}.
Let B_l = B(σ_{n_l}^{-1}(T_l)); then
lim_l μ(B_l) = 0.
Since μ is auto-continuous from above, it has property (S), so there exists a subsequence {B_{l_j}} of {B_l} such that
μ(⋂_{i=1}^∞ ⋃_{j=i}^∞ B_{l_j}) = 0.
Now we prove that for every ω ∈ [B − (⋂_{i=1}^∞ ⋃_{j=i}^∞ B_{l_j})],
lim_j f_{n_{l_j}}(ω) = f(ω).
For such an ω there exists i(ω) such that ω ∉ B_{l_j} whenever j ≥ i(ω), that is,
F_{n_{l_j}}(ω) ∩ T_{l_j} = ∅.
For any compact subset K of R^p there exists a positive integer i₀ such that 2^{-l_{i₀}} < ε and K ⊂ T_{l_{i₀}}, by the monotonicity of the sequence {T_l} of compact subsets of R^p. Therefore
F_{n_{l_j}}(ω) ∩ K = ∅ whenever j ≥ i(ω) ∨ i₀.
Since ω ∈ [B − (⋂_{i=1}^∞ ⋃_{j=i}^∞ B_{l_j})] is arbitrary, we thus have f_{n_{l_j}} → f a.e. on B. By using the null-additivity of μ, we also have f_{n_{l_j}} → f p.a.e. in B.
5.3.5 Theorem: Let {f_n, f} ⊂ F[X], B ∈ B and n ≥ 1.
(a) If f_n → f a.e. on B, then f_n → f a.u. on B and f_n → f p.a.u. on B;
(b) if f_n → f p.a.e. on B, then f_n → f a.u. on B;
(c) on B, f_n → f p.a.e. implies f_n → f p.a.u.

Proof: We only prove (c); (a) and (b) are similar to prove. Let
R^p = ⋃_{l=1}^∞ T_l,
where the T_l are compact and T_l ⊂ T_{l+1}, l = 1, 2, …. Since f_n → f p.a.e. on B, there exists A ⊂ B, A ∈ B, such that μ(B − A) = μ(B) and lim_n f_n(ω) = f(ω) for every ω ∈ (B − A). Put
B_m^l = ⋂_{n=m}^∞ {ω ∈ (B − A) : F_n(ω) ∩ T_l = ∅};
then B_m^l ↑ as m → ∞ and
⋃_{m=1}^∞ B_m^l = B − A.
Let σ > 0. Since lim_m μ(B_m^l) = μ(B − A) = μ(B), there exists, for each l, an index i_l such that
μ(B_{i_l}^l) > μ(B) − σ/2^l,
and hence, for G = ⋂_{l=1}^∞ B_{i_l}^l,
μ(G) ≥ μ(B) − σ.
For any ε > 0 and any compact subset K of R^p there exists a positive integer l₀ such that K ⊂ T_{l₀}, so that
G ⊂ B_{i_{l₀}}^{l₀} ⊂ ⋂_{n=i_{l₀}}^∞ {ω ∈ (B − A) : F_n(ω) ∩ K = ∅},
that is, G ∩ σ_n^{-1}(K) = ∅ whenever n ≥ N = i_{l₀}. Therefore {f_n} converges to f uniformly on G and, σ > 0 being arbitrary, f_n → f p.a.u. on B.
5.4
The main methodological idea, basic for the theory of fuzzy sets, states that elements may belong to some collection not absolutely but to some extent. We discuss the property of continuity of functions and investigate what it means for a function to be continuous only to some extent. In order to do this in mathematical terms, some estimations of this property are introduced. These estimations play the role of the membership function for the fuzzy sets of fuzzy continuous functions.
The following results on fuzzy measure spaces can now be stated.
(1) For any ε > 0 there exists a closed subset T of X such that μ(X − T) < ε.
(2) If f_n → f almost everywhere, then for any ε > 0 there exists a subset K such that μ(X − K) < ε and {f_n} converges to f uniformly on K.
(3) If f is a real measurable function on X, then for any ε > 0 there exists a closed subset T of X such that the restriction of f to T is continuous.
(4) If f is measurable, then the closed subset T in (3) can be chosen so that μ(X − T) ≤ ε.
(5) Let {f_n, f} ⊂ F[X], μ(B) < ∞ and n ≥ 1.
(a) If {f_n} converges in fuzzy measure to f, then for every subsequence {f_{n_i}} of {f_n} there exists a further subsequence {f_{n_{i_j}}} of {f_{n_i}} such that f_{n_{i_j}} → f a.e. on B.
(b) If f_n →p.μ f, then for every subsequence {f_{n_i}} of {f_n} there exists a further subsequence {f_{n_{i_j}}} of {f_{n_i}} such that f_{n_{i_j}} → f p.a.e. on B.
(6) If f_n → f a.e. (or in B), then d(x, f_n) → d(x, f) a.e. for every x ∈ R^p; and if f_n → f p.a.e., then d(x, f_n) → d(x, f) p.a.e. for every x ∈ R^p.
(7) Let {f_n, f} ⊂ F[X], μ(B) < ∞ and n ≥ 1.
(a) If f_n → f a.u., then d(x, f_n) → d(x, f) a.u. for every x ∈ R^p;
(b) if f_n → f p.a.u. (or in B), then d(x, f_n) → d(x, f) p.a.u. for every x ∈ R^p.
(8) Let μ be a fuzzy measure on a σ-algebra B, and let {f_n} satisfy f_n(ω) → f(ω) and f_n(ω) ≤ z(ω) for every ω ∈ X and every n, where z is integrable. Then
lim_n ∫ f_n dμ = ∫ f dμ.
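On a finite space the fuzzy (Sugeno) integral can be computed as ∫ f dμ = max_α min(α, μ({f ≥ α})) with α ranging over the values of f together with 0; this is the standard finite-space formula. The example below (hypothetical, not from the thesis) illustrates the convergence statement above: f_n increases to f and the integrals converge to ∫ f dμ.

```python
X = list(range(5))

def mu(A):
    return (len(A) / len(X)) ** 2      # a fuzzy measure on the subsets of X

def sugeno(g):
    # finite-space Sugeno integral: max over alpha in range(g) ∪ {0} of
    # min(alpha, mu({w : g(w) >= alpha}))
    alphas = sorted({g(w) for w in X} | {0.0})
    return max(min(a, mu({w for w in X if g(w) >= a})) for a in alphas)

def f(w):
    return (w + 1) / 5.0

integrals = []
for n in (1, 2, 10, 100, 1000):
    f_n = lambda w, n=n: f(w) * (1.0 - 1.0 / (2 * n))   # f_n increases to f
    integrals.append(sugeno(f_n))

assert integrals == sorted(integrals)          # monotone in n
assert abs(integrals[-1] - sugeno(f)) < 1e-3   # approaches the limit integral
print(sugeno(f), integrals)
```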
References
[1] A. De Luca and S. Termini, (1972), A definition of non-probabilistic entropy in the setting of fuzzy sets, Inform. and Control, 20, 301-312.
[2] A. Feinstein, (1958), Foundations of Information Theory, McGraw-Hill, New York.
[3] A. Hiroto, (1982), Ambiguity based on the concept of subjective entropy, in: M. M. Gupta and E. Sanchez, Eds., Fuzzy Information and Decision Processes, North-Holland, Amsterdam.
[4] A. P. Dempster, (1967), Upper and lower probabilities induced by a multivalued mapping, Annals of Mathematical Statistics, 38, 325-339.
[5] A. Wehrl, (1978), General properties of entropy, Rev. Mod. Phys., 50, 221-260.
[6] Ann Markusen, (2003), Fuzzy Concepts, Scanty Evidence, Policy Distance: The Case for Rigour and Policy Relevance in Critical Regional Studies. In: Regional Studies, volume 37, issue 6-7, pp. 701-717.
[7] Alexander S. Kechris, (1995), Classical Descriptive Set Theory, Springer-Verlag, Graduate Texts in Mathematics, vol. 156.
[8] B. R. Gaines, and L. Kohout, (1975), Possible automata. Proc. Int. Symp.
Multiple-Valued logics, Bloomington, IN, 183-196.
[9] Bhaskara Rao, K. P. S. and M. Bhaskara Rao, (1983), Theory of Charges: A Study of Finitely Additive Measures, London: Academic Press, ISBN 0-12-095780-9.
[10] C. E. Shannon, (1948), A mathematical theory of communication, Bell System Technical Journal, 27, 379-423, 623-656.
[11] C. Wu and M. Ha, (1994), On the regularity of the fuzzy measure on metric
fuzzy measure spaces, Fuzzy Sets and Systems, 66, 373-379.
[12] Claude E. Shannon and Warren Weaver, (1949), The Mathematical Theory of Communication, The University of Illinois Press, Urbana, Illinois. ISBN 0-252-72548-4.
[13] D. Cayrac, D. Dubois, M. Haziza and H. Prade, (1996), Handling
uncertainty with possibility theory and fuzzy sets in a satellite fault diagnosis
application. IEEE Trans. on Fuzzy Systems, 4, 251-269.
[14] D. Dubois and H. Prade, (1997), Bayesian conditioning in possibility
theory, Fuzzy Sets and Systems 92, 223-240.
[15] D. Dubois and H. Prade, (1996), What are fuzzy rules and how to use
them. Fuzzy Sets and Systems, 84, 169-185.
[16] D. Dubois and H. Prade, (1992), When upper probabilities are possibility
measures, Fuzzy Sets and Systems 49, 65-74.
[17] D. Dubois, H. Prade and S. Sandri, (1993), On possibility/probability
transformations. In R. Lowen, M. Roubens, editors. Fuzzy logic: State of Art,
Dordrecht: Kluwer Academic Publ., 103-112.
[18] D. H. Fremlin, (2000), Measure Theory (http://www.essex.ac.uk/maths/people/fremlin/mt.htm). Torres Fremlin.
[19] D. Harmanec and G. J. Klir, (1996), Principle of uncertainty invariance, International Fuzzy Systems and Intelligent Control Conference (IFSICC), 331-339.
[20] D. Ralescu and G. Adams, (1980), The fuzzy integral, J. Math. Anal. Appl.,
75, 562-570.
[21] De Campos and M. J. Bolanos, (1989), Representation of fuzzy measures through probabilities, Fuzzy Sets and Systems, 31, 23-36.
[22] Didier Dubois and Henri Prade, (2001), Possibility Theory, Probability Theory and Multiple-valued Logics: A Clarification, Annals of Mathematics and Artificial Intelligence, 32, 35-66.
[40] G. N. Lewis, (1927), The entropy of radiation. Proc. Natl. Acad. Sci.
U.S.A., 13, 307-314.
[41] G. Jumarie, (1975), Further advances on the general thermodynamics of
open systems via information theory, effective entropy, negative information,
Internat. J. Systems Sci. 6, 249-268.
[42] Gerald B. Folland, (1999), Real Analysis: Modern Techniques and Their Applications, John Wiley and Sons, second edition, ISBN 0-471-31716-0.
[43] Glenn Shafer, (1976), A Mathematical Theory of Evidence, Princeton
University Press, ISBN 0-608-02508-9.
[44] Gustave Choquet, (1953), Theory of Capacities, Annales de l'Institut Fourier, 5, 131-295.
[45] H. Suzuki, (1991), Atoms of fuzzy measures and fuzzy integrals, Fuzzy
Sets and Systems 41, 329-342.
[46] H. Suzuki, (1988), On fuzzy measures defined by fuzzy integrals. J. Math.
Anal. Appl. 132, 87-101.
[47] H. Tahani and J. Keller, (1990), Information Fusion in Computer Vision Using the Fuzzy Integral, IEEE Transactions on Systems, Man and Cybernetics, 20 (3), 733-741. doi:10.1109/21.57289.
[48] Ines Couso, Susana Montes and Pedro Gil, (2002), Stochastic convergence, uniform integrability and convergence in mean on fuzzy measure spaces, Fuzzy Sets and Systems, 129, 95-104.
[49] Irem Dikmen, M. Talat Birgonul and Sedat Han, (July 2007), Using fuzzy risk assessment to rate cost overrun risk in international construction projects, International Journal of Project Management, Vol. 25, no. 5, 494-505.
[50] J. F. Geer and G. J. Klir, (1992), A mathematical analysis of information-preserving transformations between probabilistic and possibilistic formulations of uncertainty, International Journal of General Systems, 20(2), 143-176.
[51] J. F. Geer and G. J. Klir, (1991), Discord in possibility theory, Intern. J. of General Systems, 19, no. 2, 119-132.
[52] J. L. Marichal and M. Roubens, (2000), Entropy of discrete fuzzy measures, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 8(6), 625-640.
[53] J. N. Kapur, G. Baciu and H. K. Kesavan (1995), The min max information
measure. International Journal of Systems Science, 26(1): 1-12.
[54] J. Aczel and Z. Daroczy, (1975), On measures of information and their
characterization, Academic Press, New York- San Francisco, London.
[55] J. Goguen, (1967), L-fuzzy sets, J. Math. Anal. Appl.,18,145-174.
[73] Mark Burgin, (1999), General approach to continuity measures, Fuzzy Sets
and Systems, 105, 225-231.
[74] Mark Stosberg, (16 December 1996), The Role of Fuzziness in Artificial Intelligence, Minds and Machines. Retrieved 19 April 2013.
[75] Masao Mukaidono, (2001), Fuzzy logic for beginners. Singapore: World
Scientific Publishing.
[76] Max Black, (1937), Vagueness: An exercise in logical analysis, Philosophy of Science, 4, 427-455. Reprinted in R. Keefe, P. Smith (eds.): Vagueness: A Reader, MIT Press, 1997, ISBN 978-0-262-61145-9.
[77] P. M. Murphy and D. W. Aha, UCI repository of machine learning databases, http://www.ics.uci.edu/mlearn/MLrepository.html, Irvine, CA: University of California.
[78] P. Miranda, M. Grabisch and P. Gil, (2002), p-symmetric fuzzy measures, Int. J. of Unc., Fuzziness and Knowledge-Based Systems, 10 (Supplement), 105-123.
[79] P. R. Halmos, (1962), Measure Theory, Van Nostrand, Princeton, N. J.; Van Nostrand Reinhold, New York, (1968).
[80] Q. Jiang and H. Suzuki, (1996), Fuzzy measures on metric spaces, Fuzzy
Sets and Systems, 83, 99-106.
[81] Q. Jiang and H. Suzuki, (1995), Lebesgue and Saks decompositions of
finite fuzzy measures, Fuzzy Sets and Systems, 75, 373-385.
[91] Richard Dietz & Sebastiano Moruzzi (eds.), (2009), Cuts and clouds.
Vagueness, Its Nature, and Its Logic, Oxford University Press.
[92] Roy T. Cook, (2009), A dictionary of philosophical logic, Edinburgh
University Press, 84.
[93] Russell Bertrand, (1950), Unpopular Essays, Allen & Unwin.
[94] S. Kullback, (1958), Information Theory and Statistics, Wiley, New York.
[95] S. Selvin, (1975), A Problem in Probability, American Statistician, 29(1), 67.
[96] S. Weber, (1984), Representation, the Choquet integral and null sets, J. Math. Anal. and Appl., 159 (2), 532-549.
[100] V. I. Bogachev, (2007), Measure Theory, Berlin: Springer, ISBN 978-3-540-34513-8.
[110] Z. Wang, (1984), The auto-continuity of set function and the fuzzy integral, J. Math. Anal. Appl., 99, 195-218.