Yuan Luo
December 3, 2009
Weak Typicality
Then
$$\lim_{n\to\infty}\Pr\left\{\left|-\frac{1}{n}\log P(X)-H(q)\right|<\epsilon\right\}
\le \lim_{n\to\infty}\Pr\left\{\left|-\frac{1}{n}\log P(X)-H(p)\right|>\epsilon\right\}=0.$$
Then
$$\lim_{n\to\infty}\Pr\left\{\left|-\frac{1}{n}\log P(X)-H(q)\right|<\epsilon\right\}
\ge \lim_{n\to\infty}\Pr\left\{\left|-\frac{1}{n}\log P(X)-H(p)\right|<\epsilon\right\}=1.$$
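The convergence used above is the weak AEP. As a numerical illustration, the following sketch simulates an i.i.d. source and checks that $-\frac{1}{n}\log_2 P(X)$ approaches $H(p)$; the two-symbol distribution below is a made-up example, not part of the problem.

```python
import math
import random

# Illustrative two-symbol source; the distribution p is a made-up example.
p = {"a": 0.3, "b": 0.7}
H_p = -sum(v * math.log2(v) for v in p.values())  # entropy H(p) in bits

random.seed(0)
symbols, weights = zip(*p.items())

def empirical_rate(n):
    """Draw one length-n i.i.d. sequence x and return -(1/n) log2 P(x)."""
    x = random.choices(symbols, weights=weights, k=n)
    return -sum(math.log2(p[s]) for s in x) / n

# The deviation |-(1/n) log2 P(X) - H(p)| shrinks as n grows (weak AEP).
for n in (10, 100, 10000):
    print(n, abs(empirical_rate(n) - H_p))
```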
Problem 2 (p70.6) Let p and q be two probability distributions on the same
alphabet with the same support. Prove that for any $\epsilon > 0$,
$$\left|\left\{x\in\mathcal{X}^n : \left|-\frac{1}{n}\log Q(x)-(H(p)+D(p\|q))\right|<\epsilon\right\}\right| \le 2^{n(H(p)+D(p\|q)+\epsilon)}$$
and
$$\lim_{n\to\infty}\Pr\left\{\left|-\frac{1}{n}\log Q(X)-(H(p)+D(p\|q))\right|<\epsilon\right\}=1,$$
where X is composed of i.i.d. random variables whose generic random variable is drawn according to the distribution p, and Q, also denoted by $q^n$, is the corresponding n-dimensional distribution.
Note that $x = (x_1, \ldots, x_n)$.

Proof. Denote $W = \{x\in\mathcal{X}^n : |-\frac{1}{n}\log Q(x)-(H(p)+D(p\|q))| < \epsilon\}$. For
$x \in W$, we have
$$Q(x) > 2^{-n(H(p)+D(p\|q)+\epsilon)}.$$
Then it follows from $\sum_{x\in W} Q(x) \le 1$ that $|W| \le 2^{n(H(p)+D(p\|q)+\epsilon)}$.
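The counting argument above can be checked by brute force on a small made-up example (the binary p and q, n = 12, and $\epsilon$ = 0.1 below are arbitrary illustrative choices): enumerate all sequences, collect W, and compare $|W|$ with $2^{n(H(p)+D(p\|q)+\epsilon)}$.

```python
import itertools
import math

# Hypothetical binary distributions with the same support (illustration only).
p = {0: 0.5, 1: 0.5}
q = {0: 0.25, 1: 0.75}

H_p = -sum(v * math.log2(v) for v in p.values())          # H(p)
D_pq = sum(p[a] * math.log2(p[a] / q[a]) for a in p)      # D(p||q)

n, eps = 12, 0.1
# W = {x in X^n : |-(1/n) log2 Q(x) - (H(p) + D(p||q))| < eps}
W = [
    x for x in itertools.product(p, repeat=n)
    if abs(-sum(math.log2(q[a]) for a in x) / n - (H_p + D_pq)) < eps
]
bound = 2 ** (n * (H_p + D_pq + eps))
print(len(W), "<=", bound)
assert len(W) <= bound  # the counting bound from the proof
```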
The second part of this problem is easily verified by using the weak law of large numbers together with
$$E_P\left[-\frac{1}{n}\log Q(X)\right]-H(p) = -\frac{1}{n}\sum_x P(x)\log Q(x) + \frac{1}{n}\sum_x P(x)\log P(x) = \frac{1}{n}D(P\|Q) = D(p\|q),$$
where $P = p^n$.
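The expectation identity above can be verified numerically; the three-letter distributions below are hypothetical, chosen only for illustration. Since the coordinates are i.i.d., $E_P[-\frac{1}{n}\log Q(X)]$ equals the per-symbol expectation $E_p[-\log q(X)]$ for every n.

```python
import math

# Hypothetical three-letter distributions, chosen only for illustration.
p = {"a": 0.2, "b": 0.3, "c": 0.5}
q = {"a": 0.4, "b": 0.4, "c": 0.2}

H_p = -sum(p[a] * math.log2(p[a]) for a in p)         # H(p)
D_pq = sum(p[a] * math.log2(p[a] / q[a]) for a in p)  # D(p||q)

# Per-symbol expectation of -log2 q(X) under p; by i.i.d.-ness this equals
# E_P[-(1/n) log2 Q(X)] for every n.
expectation = sum(p[a] * -math.log2(q[a]) for a in p)
print(expectation, H_p + D_pq)
assert math.isclose(expectation, H_p + D_pq)
```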
Problem 3 (p70.6) Universal source coding ...

1. Proof.
$$\lim_{n\to\infty}\Pr\{X(s)\in A_n(S)\} \ \ge\ \lim_{n\to\infty}\Pr\{X(s)\in W^n_{[X^{(s)}]\epsilon}\} = 1.$$
2. Proof.
$$\begin{aligned}
|A_n(S)| = \Big|\bigcup_{s\in S} W^n_{[X^{(s)}]\epsilon}\Big|
&\le \sum_{s\in S}\big|W^n_{[X^{(s)}]\epsilon}\big|\\
&\le \sum_{s\in S} 2^{n(H(X^{(s)})+\epsilon)} \qquad (n\ \text{sufficiently large})\\
&\le \sum_{s\in S} 2^{n(H+\epsilon)}\\
&= |S|\,2^{n(H+\epsilon)}\\
&\le 2^{n(H+\epsilon')} \qquad (n\ \text{sufficiently large}).
\end{aligned}$$
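The chain of bounds in Part 2 can be checked exhaustively for a small hypothetical family of two binary sources; the distributions, n, $\epsilon$, and $\epsilon'$ below are illustrative choices, not from the problem.

```python
import itertools
import math

# Hypothetical family of two binary i.i.d. sources (illustrative choices).
family = {"s1": {0: 0.5, 1: 0.5}, "s2": {0: 0.1, 1: 0.9}}
n, eps, eps2 = 10, 0.2, 0.4  # eps2 plays the role of epsilon'

def entropy(d):
    return -sum(v * math.log2(v) for v in d.values())

def typical_set(d):
    """Weakly typical set W^n_{[X]eps} of the i.i.d. source with marginal d."""
    return {
        x for x in itertools.product(d, repeat=n)
        if abs(-sum(math.log2(d[a]) for a in x) / n - entropy(d)) < eps
    }

sets = {s: typical_set(d) for s, d in family.items()}
A_n = set().union(*sets.values())                 # A_n(S): union over s in S
H_bar = max(entropy(d) for d in family.values())  # the largest entropy H
union_bound = sum(len(w) for w in sets.values())  # sum of |W^n_{[X^(s)]eps}|
size_bound = len(family) * 2 ** (n * (H_bar + eps))  # |S| 2^{n(H+eps)}
final_bound = 2 ** (n * (H_bar + eps2))              # 2^{n(H+eps')}
print(len(A_n), union_bound, size_bound, final_bound)
assert len(A_n) <= union_bound <= size_bound <= final_bound
```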
3. Proof. Consider any sequence of subsets $B_n \subseteq \mathcal{X}^n$ such that
$$\lim_{n\to\infty}\Pr\{X(s)\in B_n\} = 1$$
for any $s \in S$. Then for the source $X(s) \in \mathcal{F}$ with the largest entropy H, by using the converse part of Shannon's source coding theorem, we have, for n sufficiently large,
$$|B_n| \ge (1-\delta)\,2^{n(H-\epsilon)} > 2^{-n\epsilon}\,2^{n(H-\epsilon)} = 2^{n(H-2\epsilon)}.$$
In particular, Part 1 gives $\lim_{n\to\infty}\Pr\{X(s)\in A_n(S)\} = 1$ for any $s \in S$, so the same lower bound applies to $|A_n(S)|$, and hence $\frac{1}{n}\log|A_n(S)| \ge H - 2\epsilon$ for n sufficiently large.