[Figure: diagram defining the bits X1, X2, . . . , X7 of the vector X, with example bit values.]
Now we make an error (or not) in one of the bits (or none). Let Y = X ⊕ e, where
e is equally likely to be (1, 0, . . . , 0), (0, 1, 0, . . . , 0), . . . , (0, 0, . . . , 0, 1), or (0, 0, . . . , 0),
and e is independent of X.
(b) What is the entropy of Y?
(c) What is H(X|Y)?
(d) What is I(X; Y)?
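For a concrete check of parts (b)-(d), the sketch below assumes the standard (7,4) Hamming-code setup; since the figure above is garbled, the parity rules for X5, X6, X7 and the uniformity of X over the 16 codewords are assumptions, not read from the problem. The code builds the joint distribution of (X, Y) and evaluates the three quantities numerically.

import itertools
from collections import defaultdict
from math import log2

# Assumed setup (the figure above is garbled): X1..X4 are i.i.d. Bernoulli(1/2) data bits
# and X5, X6, X7 are parity bits of a (7,4) Hamming code, so X is uniform over 16 codewords.
def codeword(d1, d2, d3, d4):
    return (d1, d2, d3, d4,
            d1 ^ d2 ^ d3,   # X5: assumed parity rule
            d1 ^ d3 ^ d4,   # X6: assumed parity rule
            d2 ^ d3 ^ d4)   # X7: assumed parity rule

codewords = [codeword(*d) for d in itertools.product((0, 1), repeat=4)]
# e is one of the 7 single-bit flips or the all-zero pattern, each with probability 1/8.
errors = [tuple(int(i == j) for i in range(7)) for j in range(7)] + [(0,) * 7]

# Joint pmf of (X, Y) with Y = X xor e and e independent of X.
p_xy = defaultdict(float)
for x in codewords:
    for e in errors:
        y = tuple(a ^ b for a, b in zip(x, e))
        p_xy[(x, y)] += (1 / len(codewords)) * (1 / len(errors))

def H(pmf):
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

p_x = defaultdict(float)
p_y = defaultdict(float)
for (x, y), p in p_xy.items():
    p_x[x] += p
    p_y[y] += p

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy)
print("H(X)   =", H_X)                 # 4 bits under the assumed setup
print("H(Y)   =", H_Y)                 # every (codeword, error) pair gives a distinct y
print("H(X|Y) =", H_XY - H_Y)          # 0: a single bit error is always correctable
print("I(X;Y) =", H_X + H_Y - H_XY)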
2. Entropy of functions.
(a) Show that H(g(X)) ≤ H(X), by justifying the following steps:
H(X, g(X)) = H(X) + H(g(X) | X) = H(X);
H(X, g(X)) = H(g(X)) + H(X | g(X)) ≥ H(g(X)).
Thus H(g(X)) ≤ H(X).
(b) Show that if Z = g(Y), then H(X|Y) ≤ H(X|Z).
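Both inequalities are easy to sanity-check numerically. The sketch below is an illustration only: it fixes an arbitrary pmf for X and an arbitrary, hypothetical function g, and verifies H(g(X)) ≤ H(X) on that example.

from math import log2

def entropy(pmf):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# An arbitrary example distribution for X (hypothetical values, chosen only for illustration).
p_x = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}

def g(x):
    return x % 2   # a non-injective g, so it can only merge probability mass

# pmf of g(X): push the probability of each x onto g(x).
p_gx = {}
for x, p in p_x.items():
    p_gx[g(x)] = p_gx.get(g(x), 0.0) + p

print("H(X)    =", entropy(p_x))    # 1.75 bits
print("H(g(X)) =", entropy(p_gx))   # about 0.95 bits, never more than H(X)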
3. Data Processing Inequality.
If X, Y, Z form a Markov triplet (X → Y → Z), show that:
(a) H(X|Y ) = H(X|Y, Z) and H(Z|Y ) = H(Z|X, Y )
(b) H(X|Y) ≤ H(X|Z)
(c) I(X; Y) ≥ I(X; Z) and I(Y; Z) ≥ I(X; Z)
(d) I(X; Z|Y ) = 0
The following definition may be useful:
Definition 1: The conditional mutual information of random variables X and Y given
Z is defined by
I(X; Y | Z) = H(X | Z) - H(X | Y, Z)
            = ∑_{x,y,z} P(x, y, z) log [ P(x, y | z) / ( P(x | z) P(y | z) ) ].
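Definition 1 translates directly into a short computation. As a sketch, the code below builds a made-up Markov triplet X → Y → Z (the pmf and channels are placeholder numbers, not taken from the problem), evaluates I(X; Z | Y) from the formula above, and checks that I(X; Z) does not exceed I(X; Y).

from itertools import product
from math import log2

# A made-up Markov triplet X -> Y -> Z over binary alphabets (illustrative numbers only).
p_x = {0: 0.6, 1: 0.4}
p_y_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}   # channel from X to Y
p_z_y = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.1, 1: 0.9}}   # channel from Y to Z

# Joint pmf P(x, y, z) = P(x) P(y|x) P(z|y); this product form is exactly the Markov assumption.
P = {(x, y, z): p_x[x] * p_y_x[x][y] * p_z_y[y][z] for x, y, z in product((0, 1), repeat=3)}

def marg(P, idx):
    """Marginal pmf over the coordinate positions in idx (0 = X, 1 = Y, 2 = Z)."""
    out = {}
    for k, p in P.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

Pxy, Pxz, Pyz = marg(P, (0, 1)), marg(P, (0, 2)), marg(P, (1, 2))
Px, Py, Pz = marg(P, (0,)), marg(P, (1,)), marg(P, (2,))

def I2(Pab, Pa, Pb):
    """Mutual information of a pair of coordinates from its joint pmf and the two marginals."""
    return sum(p * log2(p / (Pa[(a,)] * Pb[(b,)])) for (a, b), p in Pab.items() if p > 0)

# I(X; Z | Y) via Definition 1, using P(x,z|y) / (P(x|y) P(z|y)) = P(x,y,z) P(y) / (P(x,y) P(y,z)).
I_xz_y = sum(p * log2(p * Py[(y,)] / (Pxy[(x, y)] * Pyz[(y, z)]))
             for (x, y, z), p in P.items() if p > 0)

print("I(X;Z|Y) =", I_xz_y)             # essentially 0, as in part (d)
print("I(X;Y)   =", I2(Pxy, Px, Py))
print("I(X;Z)   =", I2(Pxz, Px, Pz))    # no larger than I(X;Y), as in part (c)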
The following identities may be useful:
∑_{n=1}^∞ r^n = r/(1 - r),    ∑_{n=1}^∞ n r^n = r/(1 - r)^2.
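For reference, the second identity follows from the first by a standard step (this note is a reminder, not part of the assignment): differentiating ∑_{n=1}^∞ r^n = r/(1 - r) with respect to r, valid for |r| < 1, gives ∑_{n=1}^∞ n r^(n-1) = 1/(1 - r)^2, and multiplying both sides by r yields ∑_{n=1}^∞ n r^n = r/(1 - r)^2.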
Let X and Y have the joint distribution p(x, y) given by the following table:

            y = 0    y = 1
  x = 0      1/4      1/4
  x = 1      1/2       0
Find
(a) H(X), H(Y ).
(b) H(X|Y ), H(Y |X).
(c) H(X, Y ).
(d) I(X; Y ).
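All six quantities can be checked mechanically once the joint pmf is written down. The sketch below uses the table as reconstructed above (the original layout was garbled, so the placement of the entries is an assumption) and computes each part from the definitions.

from math import log2

# Joint pmf p(x, y) as read from the table above; the cell placement is an assumption,
# since the original layout was garbled.
p = {(0, 0): 1 / 4, (0, 1): 1 / 4, (1, 0): 1 / 2, (1, 1): 0.0}

def H(pmf):
    return -sum(q * log2(q) for q in pmf.values() if q > 0)

p_x = {x: sum(q for (a, b), q in p.items() if a == x) for x in (0, 1)}
p_y = {y: sum(q for (a, b), q in p.items() if b == y) for y in (0, 1)}

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p)
print("H(X)   =", H_X)
print("H(Y)   =", H_Y)
print("H(X|Y) =", H_XY - H_Y)
print("H(Y|X) =", H_XY - H_X)
print("H(X,Y) =", H_XY)
print("I(X;Y) =", H_X + H_Y - H_XY)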
H(X2 | X1) / H(X1).
8. Two looks.
Here is a statement about pairwise independence and joint independence. Let X, Y1 ,
and Y2 be binary random variables. If I(X; Y1 ) = 0 and I(X; Y2 ) = 0, does it follow
that I(X; Y1 , Y2 ) = 0?
(a) Yes or no? Prove or provide a counterexample.
(b) If I(X; Y1 ) = 0 and I(X; Y2 ) = 0 in the above problem, does it follow that
I(Y1 ; Y2 ) = 0?
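One way to explore both parts is to compute the relevant mutual informations directly for any candidate joint distribution of (X, Y1, Y2). The sketch below does that for a placeholder uniform distribution (the numbers are not a suggested answer); substituting a candidate construction shows immediately whether the conditions I(X; Y1) = I(X; Y2) = 0 can coexist with I(X; Y1, Y2) > 0, and what happens to I(Y1; Y2).

from itertools import product
from math import log2

def H(pmf):
    """Entropy in bits of a dict of probabilities."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marg(P, idx):
    """Marginal of the joint pmf P over the coordinate positions in idx."""
    out = {}
    for k, p in P.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def report(P):
    """Print the mutual informations relevant to parts (a) and (b) for a pmf over (x, y1, y2)."""
    Hx, Hy1, Hy2 = H(marg(P, (0,))), H(marg(P, (1,))), H(marg(P, (2,)))
    print("I(X;Y1)    =", Hx + Hy1 - H(marg(P, (0, 1))))
    print("I(X;Y2)    =", Hx + Hy2 - H(marg(P, (0, 2))))
    print("I(X;Y1,Y2) =", Hx + H(marg(P, (1, 2))) - H(P))
    print("I(Y1;Y2)   =", Hy1 + Hy2 - H(marg(P, (1, 2))))

# Placeholder joint pmf over (x, y1, y2); replace it with any candidate construction to test it.
P = {xyz: 1 / 8 for xyz in product((0, 1), repeat=3)}
report(P)   # for the uniform pmf all of these are 0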
In words, the typical set A^(n) (defined in class) is essentially the smallest set (on an exponential scale) among the sets that have non-negligible probability.
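The statement can be seen in a small experiment. The sketch below is an illustration with assumed parameters (n, p, and eps are arbitrary choices, not given in the problem): it enumerates all binary sequences of length n for an i.i.d. Bernoulli(p) source, collects the eps-typical ones, and compares the size of that set with 2^{n(H ± eps)} and with the size of the smallest set that captures 90% of the probability.

from itertools import product
from math import log2

# Assumed parameters, chosen only to make the experiment small enough to enumerate.
n, p, eps = 16, 0.3, 0.1

H = -(p * log2(p) + (1 - p) * log2(1 - p))   # entropy of a Bernoulli(p) bit

def prob(seq):
    """Probability of a particular binary sequence under an i.i.d. Bernoulli(p) source."""
    k = sum(seq)
    return (p ** k) * ((1 - p) ** (n - k))

seqs = list(product((0, 1), repeat=n))
probs = {s: prob(s) for s in seqs}

# The eps-typical set: sequences whose per-symbol log-likelihood is within eps of H.
typical = [s for s in seqs if abs(-log2(probs[s]) / n - H) <= eps]
p_typical = sum(probs[s] for s in typical)

# Smallest set capturing 90% of the probability: take sequences greedily, most probable first.
target, acc, smallest = 0.9, 0.0, []
for s in sorted(seqs, key=lambda t: probs[t], reverse=True):
    if acc >= target:
        break
    smallest.append(s)
    acc += probs[s]

print("H(p)                       =", H)
print("|typical set|              =", len(typical))
print("2^{n(H-eps)}, 2^{n(H+eps)} =", 2 ** (n * (H - eps)), 2 ** (n * (H + eps)))
print("P(typical set)             =", p_typical)
print("|smallest 90% set|         =", len(smallest))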