
International Journal of Scientific & Engineering Research, Volume 1, Issue 3, December-2010 1

ISSN 2229-5518

General Pseudoadditivity of Kapur's Entropy Prescribed by the Existence of Equilibrium

Priti Gupta, Vijay Kumar
Abstract—In this paper, the composability of the total channel capacity of a system of channels composed of two independent subsystems of channels is discussed. The entropy of this system of channels is a function of the entropies of the subsystems. We show that the generalized entropies, namely the Tsallis entropy, the normalized Tsallis entropy and the Kapur entropy, are compatible with the composition law, are defined consistently with the condition of equilibrium, and satisfy pseudoadditivity.

Keywords— Entropy, Kapur's Entropy, Pseudoadditivity

—————————— ——————————

1 INTRODUCTION
In the process of transmission of signals, a communication system has to do with the amount of information, the capacity of the communication channel and the effects of noise. Communication is a continuous process, and a channel is the 'pipe' along which information is conveyed. There is a wide variety of communication channels available, from basic face-to-face conversation, through telecommunication channels like the telephone or e-mail, to computational channels like the medical record. Channels have attributes like capacity and noise, which determine their suitability for different tasks. When two parties exchange information across a channel at the same time, this is known as synchronous communication.

For example: Telephones are one of the commonest two-way synchronous channels. It is in the nature of synchronous communication that it is interruptive, and these interruptions may have a negative impact on individuals who carry high cognitive loads.

On the contrary, when individuals can be separated in time, they may use an asynchronous channel to support their interaction. Since there can be no simultaneous discussion, conversations occur through a series of message exchanges, which can range from Post-it notes left on a colleague's desk to sophisticated electronic information systems. One of the benefits of asynchronous communication is that it is not inherently interruptive; if a communication is not urgent, asynchronous channels may be a preferred way of communicating with otherwise busy individuals.

A communication system is thus a bundle of different components, and the utility of the overall system is determined by the appropriateness of all the components together. If even one element of the system bundle is inappropriate to the setting, the communication system can underperform. A communication system provides only a channel for mutual information exchange; it is not a priori dedicated to certain categories of information only.

Channels can be modeled physically by trying to calculate the physical processes which modify the transmission of signals.

For example: In wireless communications the channel is often modeled by a random attenuation (known as fading) of the transmitted signal, followed by additive noise. It can also be modeled by calculating the reflection off every object in the environment.

The attenuation term is a simplification of the underlying physical processes and captures the change in signal power over the course of the transmission. The noise in the model captures external interference and/or electronic noise in the receiver. If the attenuation term is complex, it also describes the relative time that a signal takes to get through the channel. The statistics of the random attenuation are decided by previous measurements or physical simulations. Channel models may be continuous channel models, in that there is no limit to how precisely their values may be defined.

It is now possible to assign multiple custom channels to a single ad unit. This feature enables us to track ad performance with greater flexibility and view more granular information. When generating ad code, up to five custom channels can be added to a specific instance of ad code. Multiple channels can be very useful when we want to track one ad unit across several different metrics simultaneously.

For example: Suppose we run a sports website and place a leader board at the top and bottom of every page. To track the performance of the ad placement, we create two custom channels, 'Top Leader board' and 'Bottom Leader board', and regenerate our ad code appropriately.
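The fading-plus-noise channel model described above can be illustrated with a short sketch. This is a toy illustration only: the Rayleigh attenuation statistics, the noise level and the function name are assumptions for demonstration, not part of the paper.

```python
import math
import random

def fading_channel(signal, noise_std=0.1, seed=42):
    """Toy flat-fading channel: y_i = h * x_i + n_i, where h is a random
    Rayleigh-distributed attenuation and n_i is additive Gaussian noise."""
    rng = random.Random(seed)
    # Rayleigh fading: magnitude of a zero-mean complex Gaussian channel gain
    h = math.hypot(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) / math.sqrt(2)
    return h, [h * x + rng.gauss(0.0, noise_std) for x in signal]

tx = [1.0, -1.0, 1.0, 1.0, -1.0]   # transmitted symbols
h, rx = fading_channel(tx)
print(f"attenuation h = {h:.3f}")
print("received:", [round(y, 3) for y in rx])
```

A fixed seed keeps the draw of the attenuation and noise reproducible, which is convenient when comparing detector designs against the same channel realization.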
IJSER © 2010
http://www.ijser.org

Also, suppose we want to compare our football pages and our baseball pages at the same time. With multiple custom channels, this is not a problem: just create two new custom channels called 'Football Pages' and 'Baseball Pages' and add them to the appropriate ad units. Now each leader board will be tagged with two custom channels that let us know the position it is in (top or bottom) and the type of page on which it appears (football or baseball). Moreover, ad units tagged with multiple custom channels will log impressions or clicks in each channel. As a result, we will see a larger number of impressions and clicks in channel reports than in aggregate reports.

Generally, you can hear static noise as a hiss on a radio or see it as snow on a television screen. Shannon realized that the noise added to one digital pulse would generally make its overall amplitude different from that of another, otherwise identical, pulse. Further, the noise amplitudes for two pulses are independent.

In the present work, we discuss the additivity (noise) of subsystems of channels. Here, we consider two subsystems of a channel. The additivity requirement for the system (channel) puts constraints on the symmetry of the system under consideration: it is indivisibly connected with the homogeneity of the system, which is a familiar additional assumption in ordinary circumstances. Channels exhibit complex systems/processes and are attracting great attention. A common feature of these systems is that they stay in non-equilibrium stationary states for significantly long periods. In these systems, channels are generally inhomogeneous.

In this paper, we discuss a general composition law for the system and show that it satisfies pseudoadditivity. Three generalized entropies, the Tsallis entropy, the normalized Tsallis entropy, and the Kapur entropy, are shown to be consistent with the equilibrium condition. Abe [9] discusses the thermodynamic system; here we take the system of channels with the same entropies. From a thermodynamic viewpoint, the heat disturbs the signal, making it noisy.

1.1 KAPUR'S ENTROPY

Another characterization, which obtains for a partition of the probability space into two segments, one for low-valued probabilities and the other for high-valued probabilities, gives the pseudoadditivity of sub-systems A and B as:

H^{(K)}(p_1, p_2, \ldots, p_w) = H^{(K)}(p_l, p_m) + \phi(p_l)\, H^{(K)}\left(\frac{p_1}{p_l}, \frac{p_2}{p_l}, \ldots, \frac{p_{w_l}}{p_l}\right) + \phi(p_m)\, H^{(K)}\left(\frac{p_{w_l+1}}{p_m}, \frac{p_{w_l+2}}{p_m}, \ldots, \frac{p_w}{p_m}\right)

where p_l = \sum_{i=1}^{w_l} p_i and p_m = \sum_{i=w_l+1}^{w} p_i with w_l + w_m = w, \phi is a weight function, and the sets \{p_i / p_l\} and \{p_i / p_m\} are conditional probabilities.

Also, p_i^{\alpha} \le p_i iff \alpha \ge 1 and p_i^{\alpha} \ge p_i iff \alpha \le 1.

Here, H^{(K)}(p_1, p_2, \ldots, p_w) is of the form of a homogeneous function.

2. COMPOSABILITY OF VARIOUS ENTROPIES PRESCRIBED BY THE EXISTENCE OF EQUILIBRIUM

Let X(A, B) be the total channel capacity of the system of channels, divided into two independent subsystems of channels, A and B. Let X(A, B) satisfy additivity:

X(A, B) = X(A) + X(B)

The joint probability of the total system of channels is

P_{ij}(A, B) = P_i(A) P_j(B)    (2.1)

where p_i(A) is the probability of finding the subsystem A in its i-th state and p_j(B) is the probability of finding the subsystem B in its j-th state.

The entropy of the total system, H(A, B) \equiv H(p(A, B)), is expressed in terms of the entropies of the subsystems, H(A) and H(B), i.e.,

H(A, B) = f(H(A), H(B))

where f is a symmetric bivariate function:

f(x, y) = f(y, x)    (2.2)
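As a sanity check (not part of the paper), the branching characterization of Section 1.1 can be verified numerically in the familiar Shannon limit, where the weights become \phi(p) = p and the rule reduces to Shannon's classical grouping axiom:

```python
import math

def shannon(p):
    """Shannon entropy (natural log), ignoring zero-probability states."""
    return -sum(x * math.log(x) for x in p if x > 0)

# A distribution partitioned into a low-valued segment l and a high-valued segment m
p = [0.05, 0.10, 0.15, 0.30, 0.40]
wl = 2                                 # the first w_l probabilities form segment l
pl, pm = sum(p[:wl]), sum(p[wl:])      # p_l and p_m

lhs = shannon(p)
rhs = (shannon([pl, pm])
       + pl * shannon([x / pl for x in p[:wl]])    # conditional probabilities of segment l
       + pm * shannon([x / pm for x in p[wl:]]))   # conditional probabilities of segment m

assert abs(lhs - rhs) < 1e-12
print("grouping rule verified:", round(lhs, 6))
```

The identity holds for any split point w_l, which is what makes the two-segment partition in Section 1.1 well defined.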


The given function f is a composite function.

The Boltzmann-Gibbs-Shannon entropy [3] is

H(p) = -\sum_{i=1}^{N} p_i \ln p_i    (2.3)

where N is the total number of states at a given scale. For independent variables A and B,

H(A, B) = H(A) + H(B)    (2.4)

From [8],

H(A, B) = H(A) + H(B) + \lambda(\alpha)\, H(A) H(B)

where \lambda(\alpha) is a function of the entropic index.

Renyi's entropy [1] is

H^{(R)}(p) = \frac{1}{1-\alpha} \ln \sum_{i=1}^{N} p_i^{\alpha}    (\alpha > 0)    (2.5)

and the Tsallis entropy [4] is

H^{(T)}(p) = \frac{1}{1-\alpha} \left( \sum_{i=1}^{N} p_i^{\alpha} - 1 \right)    (\alpha > 0)    (2.6)

H^{(R)}(p) and H^{(T)}(p) are composite functions; in particular, the Tsallis entropy satisfies pseudoadditivity as:

H^{(T)}(A, B) = H^{(T)}(A) + H^{(T)}(B) + (1-\alpha)\, H^{(T)}(A) H^{(T)}(B)    (2.7)

The normalized Tsallis entropy is

H^{(NT)}(p) = \frac{H^{(T)}(p)}{\sum_{i=1}^{N} p_i^{\alpha}} = \frac{1}{1-\alpha} \left( 1 - \frac{1}{\sum_{i=1}^{N} p_i^{\alpha}} \right)    (2.8)

Also,

H^{(NT)}(A, B) = H^{(NT)}(A) + H^{(NT)}(B) + (\alpha - 1)\, H^{(NT)}(A) H^{(NT)}(B)    (2.9)

which gives pseudoadditivity in the same form as (2.7).

Here, we notice two important points:

(i) As \alpha \to 1, Renyi's entropy, the Tsallis entropy and the normalized Tsallis entropy all converge to the Boltzmann-Gibbs-Shannon entropy.

(ii) For \alpha \in (0, 1), Renyi's entropy and the normalized Tsallis entropy are concave only, while for \alpha > 0 the Tsallis entropy is always concave.

The generalized entropy of Kapur of order \alpha and type \beta [7] is

H^{K}_{\alpha,\beta}(p) = \frac{1}{1-\alpha} \log_2 \frac{\sum_{i=1}^{N} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{N} p_i^{\beta}},    \alpha \ne 1;\ \alpha, \beta > 0    (2.10)

In the limiting cases, when \alpha \to 1 and \beta = 1, H^{K}_{\alpha,\beta}(p) reduces to H(p), and when \beta = 1, H^{K}_{\alpha,\beta}(p) reduces to H^{(R)}(p).

Also, H^{K}_{\alpha,\beta}(p) is a composite function which satisfies pseudoadditivity as:

H^{K}_{\alpha,\beta}(A, B) = H^{K}_{\alpha,\beta}(A) + H^{K}_{\alpha,\beta}(B) + \lambda(\alpha, \beta)\, H^{K}_{\alpha,\beta}(A) H^{K}_{\alpha,\beta}(B)    (2.10a)

The above entropies have the common feature "sum plus product":

f(x, y) = x + y + \lambda(\alpha)\, x y

where \lambda(\alpha) is a function of the entropic index.

For Renyi's entropy and the Boltzmann-Gibbs-Shannon entropy, \lambda(\alpha) = 0.
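These pseudoadditivity laws are easy to verify numerically for an independent joint distribution as in (2.1). The sketch below (distributions and \alpha chosen arbitrarily for illustration) checks (2.7), (2.9), the strict additivity of Renyi's entropy, and the \beta = 1 reduction of (2.10):

```python
import math
from itertools import product

def tsallis(p, alpha):
    """Tsallis entropy, eq. (2.6)."""
    return (sum(x**alpha for x in p) - 1) / (1 - alpha)

def norm_tsallis(p, alpha):
    """Normalized Tsallis entropy, eq. (2.8)."""
    return (1 - 1 / sum(x**alpha for x in p)) / (1 - alpha)

def renyi(p, alpha):
    """Renyi entropy, eq. (2.5) (natural log)."""
    return math.log(sum(x**alpha for x in p)) / (1 - alpha)

def kapur(p, alpha, beta):
    """Kapur entropy of order alpha and type beta, eq. (2.10) (base-2 log)."""
    num = sum(x**(alpha + beta - 1) for x in p)
    den = sum(x**beta for x in p)
    return math.log2(num / den) / (1 - alpha)

alpha = 1.7
pA = [0.2, 0.3, 0.5]                       # subsystem A
pB = [0.6, 0.4]                            # subsystem B
pAB = [a * b for a, b in product(pA, pB)]  # independent joint distribution, eq. (2.1)

# Tsallis pseudoadditivity (2.7): lambda(alpha) = 1 - alpha
lhs = tsallis(pAB, alpha)
rhs = (tsallis(pA, alpha) + tsallis(pB, alpha)
       + (1 - alpha) * tsallis(pA, alpha) * tsallis(pB, alpha))
assert abs(lhs - rhs) < 1e-12

# Normalized Tsallis pseudoadditivity (2.9): lambda(alpha) = alpha - 1
lhs_n = norm_tsallis(pAB, alpha)
rhs_n = (norm_tsallis(pA, alpha) + norm_tsallis(pB, alpha)
         + (alpha - 1) * norm_tsallis(pA, alpha) * norm_tsallis(pB, alpha))
assert abs(lhs_n - rhs_n) < 1e-12

# Renyi entropy is strictly additive: lambda(alpha) = 0
assert abs(renyi(pAB, alpha) - renyi(pA, alpha) - renyi(pB, alpha)) < 1e-12

# With beta = 1, Kapur's entropy reduces to Renyi's entropy (here in bits)
assert abs(kapur(pA, alpha, 1) * math.log(2) - renyi(pA, alpha)) < 1e-12
print("pseudoadditivity checks passed")
```

Note that the checks for (2.7) and (2.9) succeed for any \alpha > 0, \alpha \ne 1, since the joint sum \sum (p_i q_j)^{\alpha} factorizes for independent subsystems.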

For the Tsallis entropy, \lambda(\alpha) = 1 - \alpha.

For the normalized Tsallis entropy, \lambda(\alpha) = \alpha - 1.

Assuming that, for Kapur's entropy, \lambda(\alpha, \beta) = (1-\alpha)(1-\beta).

The additivity of the channel capacity is not obvious; this makes the channel highly non-trivial for non-extensive systems with long-range interactions.

The stationary state of the total system may be defined by the maximization of the total entropy with X(A, B) held fixed, i.e.,

\delta H(A, B) = \delta f(H(A), H(B)) = 0    (2.11)

\delta X(A) + \delta X(B) = 0    (2.12)

From these equations it follows that

\frac{\partial f(H(A), H(B))}{\partial H(A)} \frac{d H(A)}{d X(A)} = \frac{\partial f(H(A), H(B))}{\partial H(B)} \frac{d H(B)}{d X(B)}    (2.13)

In the equilibrium state,

\beta(A) = \beta(B)    (2.14)

where \beta denotes either side of (2.13); \beta is the frequency when X is the channel capacity of the system.

Separation of variables is only possible if the derivatives of the composite function satisfy

\frac{\partial f(H(A), H(B))}{\partial H(A)} = k(H(A), H(B))\, g(H(A))\, h(H(B))    (2.15)

\frac{\partial f(H(A), H(B))}{\partial H(B)} = k(H(A), H(B))\, h(H(A))\, g(H(B))    (2.16)

where k, g and h are some functions. Also, k is a symmetric function due to (2.2).

In order to establish (2.14), note that in most physical systems k is independent of the entropies of the subsystems, and the possibility of k being entropy dependent is limited in general. Henceforth, k will be taken to be unity. Equations (2.15) and (2.16) then become

\frac{\partial f(H(A), H(B))}{\partial H(A)} = g(H(A))\, h(H(B))    (2.17)

\frac{\partial f(H(A), H(B))}{\partial H(B)} = h(H(A))\, g(H(B))    (2.18)

It is obvious that

\frac{\partial^2 f(H(A), H(B))}{\partial H(A)\, \partial H(B)} = \frac{\partial^2 f(H(A), H(B))}{\partial H(B)\, \partial H(A)}    (2.19)

Differentiating (2.17) with respect to H(B) and (2.18) with respect to H(A) and substituting into (2.19), we get

\frac{1}{g(H(A))} \frac{d h(H(A))}{d H(A)} = \frac{1}{g(H(B))} \frac{d h(H(B))}{d H(B)} \equiv \mu    (2.20)

where \mu is a frequency and is known as the separation constant.

Putting the values of g(H(A)) and g(H(B)) into (2.17) and (2.18), we get

\frac{\partial f(H(A), H(B))}{\partial H(A)} = \frac{1}{\mu} \frac{d h(H(A))}{d H(A)}\, h(H(B))    (2.21)

\frac{\partial f(H(A), H(B))}{\partial H(B)} = h(H(A))\, \frac{1}{\mu} \frac{d h(H(B))}{d H(B)}    (2.22)
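Integrating (2.21) and (2.22) yields the composite form f = (1/\mu)[h(H(A)) h(H(B)) - 1] of eq. (2.25), and the boundary conditions force the linear solution h(H) = 1 + \mu H of eq. (2.28), which collapses f to the "sum plus product" law (2.29). A pure-Python spot check (sample entropy values and \mu chosen arbitrarily; the function names are illustrative):

```python
def h_lin(H, mu):
    """Linear solution h(H) = 1 + mu * H, eq. (2.28)."""
    return 1.0 + mu * H

def f_composite(HA, HB, mu):
    """Composite function prescribed by equilibrium, eq. (2.25)."""
    return (h_lin(HA, mu) * h_lin(HB, mu) - 1.0) / mu

def f_sum_plus_product(HA, HB, mu):
    """'Sum plus product' pseudoadditivity law, eq. (2.29)."""
    return HA + HB + mu * HA * HB

for HA, HB, mu in [(0.7, 1.3, -0.5), (2.0, 0.25, 0.1), (1.0, 1.0, 3.0)]:
    assert abs(f_composite(HA, HB, mu) - f_sum_plus_product(HA, HB, mu)) < 1e-12
    assert h_lin(0.0, mu) == 1.0                       # h(0) = 1: completely ordered state
    assert abs(f_composite(HA, 0.0, mu) - HA) < 1e-12  # eq. (2.26): only B ordered
print("eq. (2.25) with h(H) = 1 + mu*H matches eq. (2.29)")
```

The check also exercises the two boundary conditions used in the derivation: h(0) = 1 when both subsystems are completely ordered, and f(H(A), 0) = H(A) when only B is ordered.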

Integrating (2.21) with respect to H(A) and (2.22) with respect to H(B), we get

f(H(A), H(B)) = \frac{1}{\mu}\, h(H(A))\, h(H(B)) + c    (2.23)

where c is the constant of integration.

In order to get a convergent result in the limit \mu \to 0, without loss of generality we set c = -\frac{1}{\mu} and simultaneously impose the condition

\lim_{\mu \to 0} h(H) = 1, \quad \forall H    (2.24)

Thus,

f(H(A), H(B)) = \frac{1}{\mu} \left[ h(H(A))\, h(H(B)) - 1 \right]    (2.25)

Eq. (2.25) is the form of the composite function prescribed by the existence of equilibrium.

In order to find h, let us suppose that both A and B are in the completely ordered state, i.e., H(A, B) = 0, so that

h(0) = 1

Again, let us suppose that only B is in the completely ordered state, i.e.,

H(A, B) = H(A)    (2.26)

By (2.25), we get

H(A) = \frac{1}{\mu} \left[ h(H(A)) - 1 \right]    (2.27)

or

h(H) = 1 + \mu H    (2.28)

Putting (2.28) into (2.25), we get

H(A, B) = f(H(A), H(B)) = H(A) + H(B) + \mu\, H(A) H(B)    (2.29)

Eq. (2.29) is of the same form as eq. (2.10a). This shows the validity of the composition law of the system for Kapur's entropy, which satisfies pseudoadditivity.

3. CONCLUSION

Here, we have derived the most general pseudoadditivity rule for a composable entropy based only on the existence of equilibrium. We have shown how the pseudoadditivity of the three generalized entropies can be obtained as the simplest case. In the present work, composability of entropy has been taken as a basic premise. This concept puts a stringent constraint on the possible forms of entropies. In [10], the entropy arising from the idea of quantum groups is not composable. It does not seem possible to define the equilibrium states in the standard manner if an entropy does not satisfy composability.

However, it is worth noting that composability assumes divisibility of the total system into independent subsystems. Actually, the reliability of this independence also puts stringent constraints on the physical nature of the systems.

For example: If the total system contains long-range interactions between its microscopic components, the independence of the subsystems may hardly be reliable in general. In such a situation, a thorough generalization of the standard formalism may be required.

REFERENCES

[1] A. Renyi, 1961, On measures of entropy and information, Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 547-561.

[2] B. Lesche, 1982, Journal of Statistical Physics, 27, 419.

[3] C. E. Shannon, 1948, A Mathematical Theory of Communication, Bell System Technical Journal, 27, pp. 379-423.

[4] C. Tsallis, Non-extensive Statistical Mechanics and Thermodynamics: Historical Background and Present Status.

In: Abe, S., Okamoto, Y. (eds.), Nonextensive Statistical Mechanics and Its Applications, Springer-Verlag, Heidelberg, 2001.

[5] C. Tsallis, 1988, Possible generalization of Boltzmann-Gibbs statistics, Journal of Statistical Physics, 52, pp. 479-487.

[6] J. N. Kapur, H. K. Kesavan, Entropy Optimization Principles with Applications, Academic Press, Boston, 1994.

[7] J. N. Kapur, 1967, Generalized entropy of order \alpha and type \beta, The Mathematical Seminar, 4, pp. 78-84.

[8] P. T. Landsberg, V. Vedral, 1998, Physics Letters A, 247, 211.

[9] S. Abe, 2001, General pseudoadditivity of composable entropy prescribed by the existence of equilibrium, Physical Review E, 63, 061105.

[10] S. Abe, 1997, Physics Letters A, 224, 326.
