and Non-extensive Statistical Mechanics

A. M. MATHAI
[Emeritus Professor of Mathematics and Statistics, McGill University, Canada, and Director, Centre for Mathematical Sciences, India (Pala Campus, Arunapuram P.O., Palai-686 574, Kerala, India)]

and

Hans J. HAUBOLD
[Office of Outer Space Affairs, United Nations, Vienna International Center, P.O. Box 500, A-1400 Vienna, Austria]
Section 1 illustrates that, simply because the product probability property (PPP) holds, one need not expect additivity to hold, nor should one expect PPP to lead to extensivity. The types of non-extensivity associated with the various generalized entropies are pointed out even when PPP holds. The nature of non-extensivity that can be expected from a multivariate distribution, when PPP holds or when the random variables are statistically independent, is illustrated by taking a trivariate case.
trivial cases of the sure event S and the impossible event ∅ are deleted, this definition still becomes a resultant of some properties of positive numbers. Consider a sample space of n distinct elementary events. If symmetry in the outcomes is assumed, then we assign equal probabilities $\frac{1}{n}$ each to the elementary events. Let $C = A \cap B$. If A and B are independent, then $P(C) = P(A)P(B)$. Let
$$R_{k,\alpha}(P) = \frac{\ln\left(\sum_{i=1}^{k} p_i^{\alpha}\right)}{1-\alpha}, \qquad \alpha \neq 1, \ \alpha > 0 \qquad (4)$$

(Renyi entropy of order $\alpha$ of 1961)

$$H_{k,\alpha}(P) = \frac{\sum_{i=1}^{k} p_i^{\alpha} - 1}{2^{1-\alpha} - 1}, \qquad \alpha \neq 1, \ \alpha > 0 \qquad (5)$$

(Havrda-Charvat entropy of order $\alpha$ of 1967)

$$T_{k,\alpha}(P) = \frac{\sum_{i=1}^{k} p_i^{\alpha} - 1}{1-\alpha}, \qquad \alpha \neq 1, \ \alpha > 0 \qquad (6)$$

(Tsallis entropy of 1988)

$$M_{k,\alpha}(P) = \frac{\sum_{i=1}^{k} p_i^{2-\alpha} - 1}{\alpha - 1}, \qquad \alpha \neq 1, \ -\infty < \alpha < 2 \qquad (7)$$

(Mathai entropy of 2006)

$$M^{*}_{k,\alpha}(P) = \frac{\ln\left(\sum_{i=1}^{k} p_i^{2-\alpha}\right)}{\alpha - 1}, \qquad \alpha \neq 1, \ -\infty < \alpha < 2 \qquad (8)$$

(Mathai's additive entropy of 2006).
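As an informal numerical companion to (4)-(8), not part of the original development, the discrete measures can be coded directly; a minimal Python sketch, with $P$ an arbitrary example distribution and the limiting behaviour as $\alpha \to 1$ noted in the comments:

```python
import math

def renyi(p, a):
    # Renyi entropy, eq. (4): ln(sum p_i^a) / (1 - a)
    return math.log(sum(x ** a for x in p)) / (1 - a)

def havrda_charvat(p, a):
    # Havrda-Charvat entropy, eq. (5): (sum p_i^a - 1) / (2^(1-a) - 1)
    return (sum(x ** a for x in p) - 1) / (2 ** (1 - a) - 1)

def tsallis(p, a):
    # Tsallis entropy, eq. (6): (sum p_i^a - 1) / (1 - a)
    return (sum(x ** a for x in p) - 1) / (1 - a)

def mathai(p, a):
    # Mathai entropy, eq. (7): (sum p_i^(2-a) - 1) / (a - 1)
    return (sum(x ** (2 - a) for x in p) - 1) / (a - 1)

def shannon(p):
    # Shannon entropy with A = 1: -sum p_i ln p_i
    return -sum(x * math.log(x) for x in p)

P = [0.5, 0.3, 0.2]   # arbitrary example distribution
# As a -> 1, (4), (6) and (7) tend to the Shannon entropy,
# while (5) tends to the Shannon entropy divided by ln 2.
for f in (renyi, tsallis, mathai):
    print(f.__name__, abs(f(P, 1.0001) - shannon(P)) < 1e-3)
```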
Let us examine what happens to the above entropies in the case of a joint distribution. Let $p_{ij} > 0$, $i = 1, \ldots, m$, $j = 1, \ldots, n$ such that $\sum_{i=1}^{m}\sum_{j=1}^{n} p_{ij} = 1$. This is a bivariate situation of a discrete distribution. Then the entropy in the joint distribution is, for example,

$$M_{m,n,\alpha}(P,Q) = \frac{\sum_{i=1}^{m}\sum_{j=1}^{n} p_{ij}^{2-\alpha} - 1}{\alpha - 1}. \qquad (10)$$

If the PPP holds and if $p_{ij} = p_i q_j$, $p_1 + \cdots + p_m = 1$, $q_1 + \cdots + q_n = 1$, $p_i > 0$, $i = 1, \ldots, m$, $q_j > 0$, $j = 1, \ldots, n$, and if $P = (p_1, \ldots, p_m)$, $Q = (q_1, \ldots, q_n)$, then

$$(\alpha - 1)\, M_{m,\alpha}(P)\, M_{n,\alpha}(Q) = \frac{1}{\alpha - 1}\left(\sum_{i=1}^{m} p_i^{2-\alpha} - 1\right)\left(\sum_{j=1}^{n} q_j^{2-\alpha} - 1\right)$$
$$= \frac{1}{\alpha - 1}\left[\sum_{i=1}^{m}\sum_{j=1}^{n} p_i^{2-\alpha} q_j^{2-\alpha} - \sum_{i=1}^{m} p_i^{2-\alpha} - \sum_{j=1}^{n} q_j^{2-\alpha} + 1\right]$$
$$= M_{m,n,\alpha}(P,Q) - M_{m,\alpha}(P) - M_{n,\alpha}(Q).$$
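The identity just derived can be checked numerically; a minimal sketch (the distributions and the value of $\alpha$ are arbitrary examples) verifying that, under the PPP, $M_{m,n,\alpha}(P,Q) = M_{m,\alpha}(P) + M_{n,\alpha}(Q) + (\alpha-1)\,M_{m,\alpha}(P)\,M_{n,\alpha}(Q)$:

```python
def mathai_entropy(p, a):
    # Mathai entropy, eq. (7): (sum p_i^(2-a) - 1) / (a - 1)
    return (sum(x ** (2 - a) for x in p) - 1) / (a - 1)

a = 1.7                       # any a != 1 with a < 2
P = [0.6, 0.4]                # arbitrary example marginals
Q = [0.5, 0.3, 0.2]
joint = [pi * qj for pi in P for qj in Q]   # PPP: p_ij = p_i * q_j

lhs = mathai_entropy(joint, a)
rhs = (mathai_entropy(P, a) + mathai_entropy(Q, a)
       + (a - 1) * mathai_entropy(P, a) * mathai_entropy(Q, a))
print(abs(lhs - rhs) < 1e-12)
```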
If any one of the above mentioned generalized entropies in (4) to (8) is written as $F_{m,n,\alpha}(P,Q)$ then we have the relation

$$F_{m,n,\alpha}(P,Q) = F_{m,\alpha}(P) + F_{n,\alpha}(Q) + a(\alpha)\, F_{m,\alpha}(P)\, F_{n,\alpha}(Q) \qquad (12)$$

where $a(\alpha)$ is a constant determined by the particular entropy; for example, the computation above shows that $a(\alpha) = \alpha - 1$ for $M_{m,\alpha}$. (13)

When $a(\alpha) = 0$ the entropy is called additive and when $a(\alpha) \neq 0$ the entropy is called non-additive. As can be expected, when a logarithmic function is involved, as in the cases of $S_k(P)$, $R_{k,\alpha}(P)$, $M^{*}_{k,\alpha}(P)$, the entropy is additive and $a(\alpha) = 0$.
Consider $p_{ijk} > 0$, $i = 1, \ldots, m$, $j = 1, \ldots, n$, $k = 1, \ldots, r$ such that $\sum_{i=1}^{m}\sum_{j=1}^{n}\sum_{k=1}^{r} p_{ijk} = 1$. If the PPP holds mutually, that is, pair-wise as well as jointly, which then will imply that

$$p_{ijk} = p_i q_j s_k, \qquad \sum_{i=1}^{m} p_i = 1, \ \sum_{j=1}^{n} q_j = 1, \ \sum_{k=1}^{r} s_k = 1,$$
$$P = (p_1, \ldots, p_m), \quad Q = (q_1, \ldots, q_n), \quad S = (s_1, \ldots, s_r),$$

then

$$F_{m,n,r,\alpha}(P,Q,S) = F_{m,\alpha}(P) + F_{n,\alpha}(Q) + F_{r,\alpha}(S) + a(\alpha)\left[F_{m,\alpha}(P)F_{n,\alpha}(Q) + F_{m,\alpha}(P)F_{r,\alpha}(S) + F_{n,\alpha}(Q)F_{r,\alpha}(S)\right] + [a(\alpha)]^2 F_{m,\alpha}(P)F_{n,\alpha}(Q)F_{r,\alpha}(S) \qquad (14)$$

where $a(\alpha)$ is the same as in (13). The same procedure can be extended to any multivariable situation. If $a(\alpha) = 0$ we may call the entropy additive, and if $a(\alpha) \neq 0$ the entropy is non-additive, or we call the corresponding statistical mechanics extensive and non-extensive, respectively.
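Relation (14) can likewise be verified numerically for the Mathai entropy, for which $a(\alpha) = \alpha - 1$; a small sketch with arbitrary example marginals:

```python
def mathai_entropy(p, a):
    # Mathai entropy, eq. (7): (sum p_i^(2-a) - 1) / (a - 1)
    return (sum(x ** (2 - a) for x in p) - 1) / (a - 1)

a = 0.5                       # any a != 1 with a < 2
P = [0.6, 0.4]                # arbitrary example marginals
Q = [0.5, 0.3, 0.2]
S = [0.7, 0.2, 0.1]
# PPP holding mutually: p_ijk = p_i * q_j * s_k
joint = [pi * qj * sk for pi in P for qj in Q for sk in S]

FP, FQ, FS = (mathai_entropy(x, a) for x in (P, Q, S))
aa = a - 1                    # a(alpha) for the Mathai entropy
rhs = (FP + FQ + FS
       + aa * (FP * FQ + FP * FS + FQ * FS)
       + aa ** 2 * FP * FQ * FS)           # right side of eq. (14)
print(abs(mathai_entropy(joint, a) - rhs) < 1e-12)
```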
$$f(x) + b_{\alpha}(x)\, f\!\left(\frac{y}{1-x}\right) = f(y) + b_{\alpha}(y)\, f\!\left(\frac{x}{1-y}\right) \qquad (16)$$

for $x, y \in [0, 1)$ with $x + y \in [0, 1]$, with the boundary condition

$$f(0) = f(1) \qquad (17)$$

where
If statistical independence holds, then in the continuous case also we have the property in (12) and (14), and then non-additivity holds for the measures analogous to the ones in (3), (5), (6), (7), with $a(\alpha)$ remaining the same. Since the steps are parallel, a separate derivation is not given here.
2. Maximum Entropy Principle

If we have a multinomial population $P = (p_1, \ldots, p_k)$, $p_i > 0$, $i = 1, \ldots, k$, $p_1 + \cdots + p_k = 1$, or the scheme $P(A_i) = p_i$, $A_1 \cup \cdots \cup A_k = S$, $P(S) = 1$, $A_i \cap A_j = \emptyset$, $i \neq j$, then we know that the maximum uncertainty in the scheme, or the minimum information from the scheme, is obtained when we cannot give any preference to the occurrence of any particular event, or when the events are equally likely, or when $p_1 = p_2 = \cdots = p_k = \frac{1}{k}$. In this case, the Shannon entropy becomes

$$S_k(P) = S_k\left(\frac{1}{k}, \ldots, \frac{1}{k}\right) = -A \sum_{i=1}^{k} \frac{1}{k} \ln \frac{1}{k} = A \ln k \qquad (25)$$

and this is the maximum uncertainty or maximum Shannon entropy in this scheme. If the arbitrary functional $f$ is to be fixed by maximizing
$$\int U\, dx, \qquad U = [f(x)]^{2-\alpha} - \lambda\, [f(x)],$$

where $\lambda$ is a Lagrangian multiplier, then the Euler equation is the following:

$$\frac{\partial U}{\partial f} = 0 \ \Rightarrow\ (2-\alpha)\, f^{1-\alpha} - \lambda = 0 \ \Rightarrow\ f = \left[\frac{\lambda}{2-\alpha}\right]^{\frac{1}{1-\alpha}} = \text{constant}. \qquad (26)$$

Hence $f$ is the uniform density in this case, analogous to the equally likely situation in the multinomial case. If the first moment $E(x) = \int x f(x)\, dx$ is assumed to be a given quantity for all functional $f$, then $U$ will become the following for (19) to (21), and the Euler equation leads to the power law. That is,

$$\frac{\partial U}{\partial f} = 0 \ \Rightarrow\ (2-\alpha)\, f^{1-\alpha} - \lambda_1 - \lambda_2 x = 0 \ \Rightarrow\ f = c_1 \left[1 + \frac{\lambda_2}{\lambda_1} x\right]^{\frac{1}{1-\alpha}}. \qquad (27)$$
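The equally-likely maximum in (25) is easy to test empirically (an illustrative sketch, not part of the original text): no randomly generated distribution on $k$ points exceeds the Shannon entropy of the uniform one, which equals $\ln k$ for $A = 1$.

```python
import math
import random

def shannon(p):
    # Shannon entropy with A = 1: -sum p_i ln p_i
    return -sum(x * math.log(x) for x in p if x > 0)

random.seed(0)
k = 5
uniform_H = shannon([1 / k] * k)   # equals ln k, eq. (25) with A = 1

# Random distributions on k points never exceed the uniform entropy.
ok = True
for _ in range(1000):
    w = [random.random() for _ in range(k)]
    s = sum(w)
    p = [x / s for x in w]
    ok = ok and shannon(p) <= uniform_H + 1e-12
print(ok, abs(uniform_H - math.log(k)) < 1e-12)
```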
$$U = f^{2-\alpha} - \lambda_1 f + \lambda_2 x f$$

and

$$\frac{\partial U}{\partial f} = 0 \ \Rightarrow\ (2-\alpha)\, f^{1-\alpha} = \lambda_1 [1 + \lambda_3 x] \ \Rightarrow\ f = \frac{\mu_1}{[1 + \lambda_3 x]^{\frac{1}{\alpha-1}}},$$

that is,

$$f = \mu_1 [1 + \lambda_3 x]^{\frac{1}{1-\alpha}}$$

where $\lambda_3 = -\lambda_2/\lambda_1$ and $\mu_1 = [\lambda_1/(2-\alpha)]^{\frac{1}{1-\alpha}}$.
where $1 - s(1-\alpha)x^{\delta} > 0$. For $\alpha < 1$, that is $-\infty < \alpha < 1$, the right side of (31) remains as a generalized type-1 beta model with the corresponding normalizing constant $c_1$. For $\alpha > 1$, writing $1 - \alpha = -(\alpha - 1)$, the model in (31) goes to a generalized type-2 beta form, namely,

$$f_2 = c_2 \left[1 + s(\alpha - 1) x^{\delta}\right]^{-\frac{1}{\alpha-1}}. \qquad (32)$$

When $\alpha \to 1$ in (31) or in (32) we have an extended or stretched exponential form,

$$f_3 = c_3\, e^{-s x^{\delta}}. \qquad (33)$$

If $c_1$ in (30) is taken as positive then (30), for $\alpha < 1$, $\alpha > 1$, $\alpha \to 1$, will be increasing exponentially. Hence all possible forms are available from (30). The model in (31) is a special case of Mathai's pathway model and for a discussion of the matrix-variate pathway model see Mathai (2005).
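The passage from (31) to the stretched exponential (33) as $\alpha \to 1$ can be seen numerically; a minimal sketch, with $s = \delta = 1$ as arbitrary example values:

```python
import math

def f1(x, alpha, s=1.0, delta=1.0):
    # Generalized type-1 beta form of eq. (31), without the
    # normalizing constant: [1 - s(1-alpha) x^delta]^(1/(1-alpha))
    base = 1 - s * (1 - alpha) * x ** delta
    return base ** (1 / (1 - alpha))

def f3(x, s=1.0, delta=1.0):
    # Stretched exponential limit, eq. (33): exp(-s x^delta)
    return math.exp(-s * x ** delta)

# As alpha -> 1, (31) approaches (33).
x = 0.8
print(abs(f1(x, 1.000001) - f3(x)) < 1e-5)
```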
Instead of optimizing $M(f)$ of (22) under the conditions that $f(x) \geq 0$ for all $x$, $\int_a^b f(x)dx = 1$ and $\int_a^b x f(x)dx$ is fixed, let us optimize under the conditions

$$\int_a^b x^{(\gamma-1)(1-\alpha)} f(x)dx = \text{fixed}, \qquad \int_a^b x^{(\gamma-1)(1-\alpha)+\delta} f(x)dx = \text{fixed},$$

and, with the ratio of the Lagrangian multipliers written as $s(1-\alpha)$, $s > 0$, we have Mathai's pathway model for the real scalar case, namely

$$f(x) = c\, x^{\gamma-1} \left[1 - s(1-\alpha) x^{\delta}\right]^{\frac{1}{1-\alpha}}, \qquad \delta > 0, \ s > 0 \qquad (34)$$

where $c$ is the normalizing constant. For $\alpha < 1$, (34) gives a generalized type-1 beta form, for $\alpha > 1$ it gives a generalized type-2 beta form and for $\alpha \to 1$ we have a generalized gamma form. For $\alpha > 1$, (34) gives the superstatistics of Beck (2006) and Beck and Cohen (2003). For $\gamma = 1$, $\delta = 1$, (34) gives Tsallis statistics. All types of densities appearing in astrophysics problems and other areas are seen to be special cases of (34), a discussion of which may be seen from Mathai and Haubold (2006a). For example, (34) for $\delta = 2$, $\gamma = 3$, $\alpha \to 1$, $x > 0$ is the Maxwell-Boltzmann density.
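As an illustrative check of the special cases of (34), not part of the original text, the Maxwell-Boltzmann shape $x^2 e^{-s x^2}$ is recovered for $\delta = 2$, $\gamma = 3$, $\alpha \to 1$:

```python
import math

def pathway(x, alpha, gamma, delta, s=1.0):
    # Mathai's pathway model, eq. (34), without the normalizing constant c:
    # x^(gamma-1) [1 - s(1-alpha) x^delta]^(1/(1-alpha))
    return (x ** (gamma - 1)
            * (1 - s * (1 - alpha) * x ** delta) ** (1 / (1 - alpha)))

# delta = 2, gamma = 3, alpha -> 1 recovers the Maxwell-Boltzmann
# shape x^2 exp(-s x^2), up to the normalizing constant.
x, s = 0.9, 1.0
mb = x ** 2 * math.exp(-s * x ** 2)
print(abs(pathway(x, 1.000001, 3.0, 2.0, s) - mb) < 1e-5)
```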
3. Differential Equations

The functional part in (34), for a more general exponent, namely

$$g(x) = \frac{f(x)}{c} = x^{\gamma-1}\left[1 - s(1-\alpha)x^{\delta}\right]^{\frac{\eta}{1-\alpha}}, \qquad \alpha \neq 1, \ \eta > 0, \ \delta > 0, \ s > 0 \qquad (38)$$

is seen to satisfy the following differential equation for $\alpha \neq 1$:

$$x \frac{d}{dx} g(x) = (\gamma - 1)\, x^{\gamma-1}\left[1 - s(1-\alpha)x^{\delta}\right]^{\frac{\eta}{1-\alpha}} - s\eta\delta\, x^{\gamma+\delta-1}\left[1 - s(1-\alpha)x^{\delta}\right]^{\frac{\eta-(1-\alpha)}{1-\alpha}}. \qquad (39)$$

Then for $\eta = \frac{(\gamma-1)(\alpha-1)}{\delta}$, $\alpha \neq 1$, $\gamma > 1$, we have

$$x \frac{d}{dx} g(x) = (\gamma - 1)\, g(x) - s\eta\delta\, [g(x)]^{1 - \frac{(1-\alpha)}{\eta}}. \qquad (40)$$
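Equation (40) can be verified with a central finite difference (an illustrative sketch; the parameter values are arbitrary choices satisfying $\eta = (\gamma-1)(\alpha-1)/\delta$):

```python
# Check the differential equation (40) for the functional part g(x) of the
# pathway model at alpha = 2, delta = 1, eta = 2, s = 1, which forces
# gamma = 3 so that eta = (gamma - 1)(alpha - 1)/delta holds.
alpha, delta, eta, s, gamma = 2.0, 1.0, 2.0, 1.0, 3.0

def g(x):
    # g(x) = x^(gamma-1) [1 - s(1-alpha) x^delta]^(eta/(1-alpha)), eq. (38)
    return (x ** (gamma - 1)
            * (1 - s * (1 - alpha) * x ** delta) ** (eta / (1 - alpha)))

x, h = 0.7, 1e-6
lhs = x * (g(x + h) - g(x - h)) / (2 * h)   # x dg/dx, central difference
rhs = (gamma - 1) * g(x) - s * eta * delta * g(x) ** (1 - (1 - alpha) / eta)
print(abs(lhs - rhs) < 1e-6)
```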
References

Beck, C. (2006). Stretched exponentials from superstatistics. Physica A, 365, 96-101.

Mathai, A.M. and Haubold, H.J. (2006). Pathway model, Tsallis statistics, superstatistics and a generalized measure of entropy. Physica A, (in press).

Mathai, A.M. and Rathie, P.N. (1975). Basic Concepts in Information Theory and Statistics: Axiomatic Foundations and Applications, Wiley Halsted, New York and Wiley Eastern, New Delhi.

Mathai, A.M. and Rathie, P.N. (1976). Recent contributions to axiomatic definitions of information and statistical measures through functional equations. In Essays in Probability and Statistics (eds: S. Ikeda et al), Shinko

Renyi, A. (1961). On measures of entropy and information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability.