
EEL710_2013 (Prof. R. Bose)

TUTORIAL 1

07/08/2013

# Date & time of submission: 23/08/2013 (1:00 AM).


# Solutions to practice problems are not to be submitted, but they cover some important topics.

Problem 1: (Inequalities) Let X, Y, and Z be jointly distributed random variables. Prove the following inequalities and find the conditions for equality.
(a) H(X, Y | Z) ≥ H(X | Z).
(b) I(X, Y; Z) ≥ I(X; Z).
(c) H(X, Y, Z) − H(X, Y) ≤ H(X, Z) − H(X).
(d) I(X; Z | Y) ≥ I(Z; Y | X) − I(Z; Y) + I(X; Z).
*As an application of some of the inequalities of information theory, consider an additive-noise fading channel Y = XV + Z, where Z is additive noise, V is a random variable representing fading, and Z and V are independent of each other and of X. You can argue that knowledge of the fading factor V improves capacity by showing that I(X; Y | V) ≥ I(X; Y).
Problem 2: (Mutual Information) A channel has an input ensemble X consisting of the numbers +1 and −1, used with probabilities PX(+1) = PX(−1) = 1/2. The output y is the sum of the input x and an independent noise random variable Z with probability density pZ(z) = 1/4 for −2 ≤ z ≤ 2 and pZ(z) = 0 elsewhere. In other words, the conditional probability density of y conditioned on x is given by pY|X(y|x) = 1/4 for −2 ≤ y − x ≤ 2 and pY|X(y|x) = 0 elsewhere.
(a) Find and sketch the output probability density for the channel.
(b) Find I(X; Y).
(c) Suppose the output is transformed into a discrete processed output u defined by u = 1 for y > 1; u = 0 for −1 ≤ y ≤ 1; u = −1 for y < −1. Find I(X; U).
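A quick numerical sanity check for part (b), assuming nothing beyond the densities stated above (an illustrative sketch, not part of the required solution): discretize y, form the output density, and evaluate I(X; Y) = h(Y) − h(Y|X) in bits.

```python
# Numerical check for Problem 2(b): grid the output, build its density from
# p_Z, and evaluate I(X;Y) = h(Y) - h(Y|X) in bits. Illustrative only.
import numpy as np

dy = 1e-4
y = np.arange(-3, 3, dy) + dy / 2           # grid covering the support of Y

def p_y_given_x(y, x):
    # p_{Y|X}(y|x) = 1/4 on [x-2, x+2], 0 elsewhere
    return np.where(np.abs(y - x) <= 2, 0.25, 0.0)

p_y = 0.5 * p_y_given_x(y, +1) + 0.5 * p_y_given_x(y, -1)

def h(p):
    """Differential entropy (bits) by Riemann sum, skipping zero-density points."""
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz])) * dy

h_y = h(p_y)
h_y_given_x = np.log2(4.0)                  # uniform on an interval of length 4
print("I(X;Y) ≈", h_y - h_y_given_x, "bits")  # expect 0.5
```

The same grid extends to part (c) by binning y into the three regions that define u.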
Problem 3: (Capacity_DMC) Find the capacity and an optimizing input probability assignment for the DMCs below (represented in graphical and mathematical form).

[Figure: transition diagram for DMC 1 on the binary alphabet {0, 1}, and transition matrices for DMC 2 and DMC 3 with entries involving p/2 and 1 − p/2; the exact diagrams and matrices are not recoverable from this copy.]
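Since the matrices above are only partially legible, a generic numerical fallback may help: a minimal Blahut-Arimoto sketch, assuming the channel is supplied as a row-stochastic matrix W[x, y] = P(y|x). The example matrix is a stand-in (a BSC with crossover p/2), not necessarily one of the DMCs in the lost figure; substitute whichever matrices you read off it.

```python
# Minimal Blahut-Arimoto sketch for DMC capacity (illustrative; the matrix W
# below is a stand-in, not necessarily one of the DMCs in the figure).
import numpy as np

def d_kl_rows(W, p_y):
    """D( W(.|x) || p_y ) for every input row x, in bits."""
    with np.errstate(divide="ignore", invalid="ignore"):
        logratio = np.where(W > 0, np.log2(W / p_y), 0.0)
    return np.sum(W * logratio, axis=1)

def blahut_arimoto(W, iters=2000):
    """W[x, y] = P(y|x), row-stochastic. Returns (capacity in bits, input pmf)."""
    q = np.full(W.shape[0], 1.0 / W.shape[0])   # start from the uniform input
    for _ in range(iters):
        q *= np.exp2(d_kl_rows(W, q @ W))       # multiplicative BA update
        q /= q.sum()
    return float(q @ d_kl_rows(W, q @ W)), q

p = 0.2                                          # illustrative crossover value
W = np.array([[1 - p / 2, p / 2],
              [p / 2, 1 - p / 2]])               # a BSC with crossover p/2
C, q_opt = blahut_arimoto(W)
print(C, q_opt)                                  # expect C = 1 - H_b(p/2), uniform q
```

For asymmetric channels the same routine also reports the optimizing input, which is handy for checking the hand-derived assignment.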
Problem 4: (Capacity_Gaussian Channel) Consider the Gaussian noise channel shown in the figure below, with power constraint P, where the signal takes two different paths and the received noisy signals are added together at the antenna.

(a) Find the capacity of this channel if Z1 and Z2 are jointly normal with covariance matrix

K_Z = [ σ²   ρσ²
        ρσ²  σ²  ]

(b) What is the capacity for ρ = 0, ρ = 1, and ρ = −1?
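One way to organize part (a), sketched under the reading that both paths carry the same transmitted symbol X, so the antenna sees a single effective Gaussian channel (an outline, not the full derivation):

```latex
% Outline only, assuming both paths carry the same symbol X.
\begin{align*}
Y &= (X + Z_1) + (X + Z_2) = 2X + (Z_1 + Z_2),\\
\operatorname{Var}(Z_1 + Z_2) &= 2\sigma^2(1 + \rho),\\
C &= \frac{1}{2}\log\!\left(1 + \frac{4P}{2\sigma^2(1 + \rho)}\right)
   = \frac{1}{2}\log\!\left(1 + \frac{2P}{\sigma^2(1 + \rho)}\right).
\end{align*}
```

The ρ → −1 limit deserves a moment's thought in part (b): the two noise terms cancel and the capacity grows without bound.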

Problem 5: (Entropy) Information Theory, Coding and Cryptography by R. Bose, problem no. 2.10 (i) & (ii).
Problem 6: (Information Conveyed) A single unbiased die is tossed once. If the face of the die is 1, 2, 3, or 4, an unbiased coin is tossed once. If the face of the die is 5 or 6, the coin is tossed twice. Find the information conveyed about the face of the die by the number of heads obtained.
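A small enumeration makes a handy numerical cross-check here (illustrative only): build the joint pmf of the die face F and the number of heads N, then evaluate I(F; N) from its definition.

```python
# Numerical cross-check for Problem 6 (illustrative, not the required solution):
# enumerate the joint pmf of (die face F, number of heads N) and compute I(F; N).
import math
from fractions import Fraction

joint = {}                                    # (face, heads) -> probability
for face in range(1, 7):
    tosses = 1 if face <= 4 else 2            # faces 1-4: one toss; 5-6: two tosses
    for heads in range(tosses + 1):
        joint[(face, heads)] = (Fraction(1, 6)
                                * Fraction(math.comb(tosses, heads), 2 ** tosses))

p_f = {f: sum(p for (ff, _), p in joint.items() if ff == f) for f in range(1, 7)}
p_n = {n: sum(p for (_, nn), p in joint.items() if nn == n) for n in range(3)}

I = sum(p * math.log2(p / (p_f[f] * p_n[n])) for (f, n), p in joint.items())
print(f"I(F; N) = {I:.4f} bits")
```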
Problem 7: (Exponential Noise Channel) Yi = Xi + Zi, where Zi is i.i.d. exponentially distributed noise with mean μ. Assume that we have a mean constraint on the signal (i.e., E[Xi] ≤ λ). Show that the capacity of such a channel is

C = log(1 + λ/μ).
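A hedged outline of the converse half, working in nats (the achievability half, exhibiting an input distribution that makes Y exactly exponential, is the part the problem really asks you to supply):

```latex
% Converse outline only: the exponential maximizes differential entropy
% among nonnegative random variables with a given mean.
\begin{align*}
h(Y \mid X) &= h(Z) = 1 + \log \mu,\\
h(Y) &\le 1 + \log(\lambda + \mu)
  \quad\text{(since } Y \ge 0 \text{ and } E[Y] \le \lambda + \mu\text{)},\\
I(X;Y) &= h(Y) - h(Y \mid X) \le \log\!\left(1 + \frac{\lambda}{\mu}\right).
\end{align*}
```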

Problem 8: (Huffman code) A DMS outputs the following eight symbols {a, b, c, d, e, f, g, h} with respective probabilities {0.3, 0.2, 0.15, 0.1, 0.1, 0.08, 0.05, 0.02}. Determine the Huffman code and efficiency by taking two and three symbols at a time. (*For the definition of efficiency refer to Pg 22, Information Theory, Coding and Cryptography by R. Bose.)
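For checking the single-symbol tree and the pair/triple extensions, a minimal Huffman-length sketch may help (an illustrative helper; `huffman_lengths` is a hypothetical name, and efficiency is taken here as H(X)/L̄, which should be checked against the Pg 22 definition in Bose):

```python
# Minimal Huffman sketch (illustrative helper, not from the text): returns the
# codeword length of each symbol; efficiency computed as H(X) / average length.
import heapq
import math
from itertools import count

def huffman_lengths(probs):
    """probs: {symbol: probability}. Returns {symbol: codeword length}."""
    tiebreak = count()                        # keeps the heap from comparing dicts
    heap = [(p, next(tiebreak), {s: 0}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)       # merge the two least likely nodes;
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}  # leaves gain one bit
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

probs = dict(zip("abcdefgh", [0.3, 0.2, 0.15, 0.1, 0.1, 0.08, 0.05, 0.02]))
lengths = huffman_lengths(probs)
H = -sum(p * math.log2(p) for p in probs.values())    # source entropy, bits/symbol
L = sum(probs[s] * lengths[s] for s in probs)          # average codeword length
print(f"H = {H:.4f}, L = {L:.4f}, efficiency = {H / L:.4f}")

# Two symbols at a time: run the same routine on the product distribution
# (64 super-symbols) and divide the average length by 2 before comparing.
pairs = {a + b: probs[a] * probs[b] for a in probs for b in probs}
```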

Practice Problems (PP)


PP1: (Z-Channel) Information Theory, Coding and Cryptography by R. Bose, problem no. 2.8.
PP2: (Lempel-Ziv) Information Theory, Coding and Cryptography by R. Bose, problem no. 1.20.
PP3: Let X be an ensemble of M points a1, ..., aM and let PX(aM) = α. Show that


H(X) = α log(1/α) + (1 − α) log(1/(1 − α)) + (1 − α) H(Y),

where Y is an ensemble of M − 1 points a1, ..., aM−1 with probabilities PY(aj) = PX(aj)/(1 − α); 1 ≤ j ≤ M − 1. Show that

H(X) ≤ α log(1/α) + (1 − α) log(1/(1 − α)) + (1 − α) log(M − 1)

and determine the condition for equality.


PP4: A DMC is called additive modulo K if it has input and output alphabet {0, 1, ..., K − 1} and the output y is given in terms of the input x and a noise variable Z by y = x ⊕ z. The noise variable takes on values in {0, 1, ..., K − 1} and is statistically independent of the input, and the addition x ⊕ z is taken modulo K (i.e., as x + z or x + z − K, whichever lies between 0 and K − 1).
(a) Show that I(X; Y) = H(Y) − H(Z).
(b) Find the capacity in terms of H(Z) and find the maximizing input probability assignment.
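A quick numerical illustration of parts (a) and (b) (the Dirichlet draw below is just an arbitrary example noise pmf): with the uniform input, I(X; Y), H(Y) − H(Z), and log K − H(Z) should all coincide.

```python
# Numerical illustration of PP4: for a modulo-K additive channel with uniform
# input, I(X;Y) equals H(Y) - H(Z), which equals log K - H(Z). Illustrative only.
import numpy as np

K = 4
rng = np.random.default_rng(0)
p_z = rng.dirichlet(np.ones(K))           # arbitrary noise pmf on {0, ..., K-1}
p_x = np.full(K, 1.0 / K)                  # uniform input

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# joint pmf P(x, y) with y = (x + z) mod K
joint = np.zeros((K, K))
for x in range(K):
    for z in range(K):
        joint[x, (x + z) % K] += p_x[x] * p_z[z]

p_y = joint.sum(axis=0)
I = sum(joint[x, y] * np.log2(joint[x, y] / (p_x[x] * p_y[y]))
        for x in range(K) for y in range(K) if joint[x, y] > 0)
print(I, H(p_y) - H(p_z), np.log2(K) - H(p_z))   # all three should agree
```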
PP5: (Band-Limited Channel) Information Theory, Coding and Cryptography by R. Bose, problem no. 2.5.
PP6: (Entropy of functions of a random variable) Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of X by justifying the following steps:

H(X, g(X)) = H(X) + H(g(X) | X) = H(X);
H(X, g(X)) = H(g(X)) + H(X | g(X)) ≥ H(g(X)).

Thus H(g(X)) ≤ H(X).
