
Roll No.

Bipin Tripathi Kumaon Institute of Technology, Dwarahat


M.Tech. (2nd semester), Digital Communication
Session: 2015-2016
Coding for Reliable Communication (DC-101)

Time: 3 Hours                                                          MM: 100

Q1- Attempt any four of the following:                                 (5x4=20)

(a) Why are fixed-length codes inefficient for alphabets whose letters are not equiprobable? Discuss this in relation to Morse code.
(b) What is the shortest possible code length, in average bits per symbol, that could be achieved for a six-letter alphabet whose symbols have the following probability distribution? {1/2, 1/4, 1/8, 1/16, 1/32, 1/32}
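A minimal Python check for part (b), using only the probabilities stated above: the Shannon entropy of the distribution is the lower bound on average code length, and here it is attained exactly because every probability is a power of 1/2.

    import math

    probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
    H = -sum(p * math.log2(p) for p in probs)   # Shannon entropy, bits/symbol
    print(H)                                    # 1.9375 bits/symbol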
(c) Calculate the probability that if somebody is tall, that person is male. Assume that the probability of being male is p(M) = 0.5 and likewise for being female, p(F) = 0.5. Suppose that 20% of males are tall and 6% of females are tall. If you know that somebody is male, how much information (in bits) do you gain by learning that he is also tall? How much do you gain by learning that a female is tall?
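A sketch of the Bayes and surprisal computations for part (c), using only the numbers given in the question:

    import math

    p_M = p_F = 0.5
    p_T_given_M, p_T_given_F = 0.20, 0.06

    p_T = p_M * p_T_given_M + p_F * p_T_given_F   # total probability of "tall"
    p_M_given_T = p_M * p_T_given_M / p_T         # Bayes' rule: ~0.769

    # Information gained (surprisal) on learning the event, in bits
    gain_male_tall = -math.log2(p_T_given_M)      # ~2.32 bits
    gain_female_tall = -math.log2(p_T_given_F)    # ~4.06 bits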
(d) Consider a noisy analog communication channel of bandwidth 1 MHz, which is perturbed by additive white Gaussian noise of total spectral power N0 = 1. Continuous signals are transmitted across this channel with average transmitted power P = 1,000. Give a numerical estimate, in bits per second, of the capacity of this noisy channel. Then repeat the estimate for a channel of the same bandwidth whose signal-to-noise ratio P/N0 is four times better.
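A numerical sketch for part (d) using the Shannon-Hartley law C = W*log2(1 + SNR), reading the question's N0 = 1 as the total noise power so that the signal-to-noise ratio is P/N0 = 1000:

    import math

    W = 1e6                              # bandwidth, Hz
    P, N = 1000.0, 1.0                   # signal power and total noise power
    C1 = W * math.log2(1 + P / N)        # ~9.97e6 bit/s
    C2 = W * math.log2(1 + 4 * P / N)    # SNR four times better: ~1.20e7 bit/s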
(e) A 12-bit Hamming codeword 101110000110, containing 8 bits of data, is received from a noisy channel. What was the original data if even parity is used?
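A sketch of the syndrome decoding in part (e), assuming the usual convention that the parity bits sit at positions 1, 2, 4 and 8, numbered from the left:

    word = "101110000110"                 # received codeword, position 1 leftmost
    bits = [int(b) for b in word]

    # Each parity bit at position 2^i checks every position whose index has
    # bit i set; with even parity each such group must sum to 0 mod 2.
    syndrome = 0
    for i in range(4):
        mask = 1 << i
        s = sum(bits[pos - 1] for pos in range(1, 13) if pos & mask) % 2
        syndrome |= s << i

    if syndrome:                          # nonzero syndrome = error position
        bits[syndrome - 1] ^= 1           # here syndrome = 2: flip position 2

    data = [bits[p - 1] for p in range(1, 13) if p & (p - 1)]  # drop parity bits
    print("".join(map(str, data)))        # recovered data: 11000110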
Q2- Attempt any two of the following:                                  (10x2=20)

(a) Consider two independent integer-valued random variables, X and Y. Variable X takes on only the values of the eight integers {1, 2, ..., 8}, and does so with uniform probability. Variable Y may take the value of any positive integer k, with probabilities P{Y = k} = 2^(-k), k = 1, 2, 3, ....
(i) Which random variable has greater uncertainty? Calculate both entropies H(X) and H(Y).
(ii) What is the joint entropy H(X, Y) of these random variables, and what is their mutual information I(X; Y)?
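A quick check for part (a): both entropies follow directly from the stated distributions, and independence makes the joint entropy additive.

    import math

    H_X = math.log2(8)                            # uniform over 8 values: 3 bits
    H_Y = sum(k * 2**-k for k in range(1, 200))   # sum of k/2^k -> 2 bits
    H_XY = H_X + H_Y                              # independence: H(X,Y) = 5 bits
    I_XY = H_X + H_Y - H_XY                       # mutual information = 0 bits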

(b) Consider a binary symmetric communication channel (fig. 1), whose input source is the alphabet X = {0, 1} with probabilities {0.5, 0.5} and whose output alphabet is Y = {0, 1}.
1. What is the entropy of the source, H(X)?
2. What is the probability distribution of the outputs, p(Y), and the entropy of this output distribution, H(Y)?
3. What is the joint probability distribution for the source and the output, p(X, Y), and what is the joint entropy, H(X, Y)?
4. What is the mutual information of this channel, I(X; Y)?
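Since fig. 1 is not reproduced here, the sketch below assumes a crossover probability eps, taken as 0.1 purely for illustration; with a uniform input the symmetric channel gives a uniform output.

    import math

    def h2(p):                        # binary entropy function
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    eps = 0.1                         # assumed crossover probability (fig. 1)
    H_X = 1.0                         # equiprobable binary source
    H_Y = 1.0                         # uniform output by symmetry
    H_XY = H_X + h2(eps)              # H(X,Y) = H(X) + H(Y|X)
    I_XY = H_X + H_Y - H_XY           # = 1 - h2(eps) ~ 0.531 bits at eps = 0.1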
(c) Two binary channels are connected in cascade, as shown in fig. 2.
1. Find the overall channel matrix of the resultant channel and draw the equivalent channel diagram.
2. Find P(z1) and P(z2) when P(x1) = P(x2) = 0.5.
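Fig. 2 is likewise unavailable, so the transition matrices below are illustrative assumptions only; the point of the sketch is that cascading multiplies the channel matrices.

    import numpy as np

    A = np.array([[0.8, 0.2],         # assumed P(y|x), rows = inputs
                  [0.3, 0.7]])
    B = np.array([[0.9, 0.1],         # assumed P(z|y)
                  [0.2, 0.8]])

    overall = A @ B                   # cascade: P(z|x) = sum_y P(y|x) P(z|y)
    p_z = np.array([0.5, 0.5]) @ overall   # P(z1), P(z2), equiprobable inputs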

Q3- Attempt any two of the following:                                  (10x2=20)

(a) A DMS X has five symbols with probabilities 0.4, 0.19, 0.16, 0.15, and 0.1. Construct a Shannon-Fano code for X and calculate the efficiency of the code. Construct another Shannon-Fano code and compare the results. Repeat for the Huffman code and compare the results.
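A sketch of the Huffman construction for part (a) with the given probabilities; the symbol names a-e are placeholders. The entropy is about 2.15 bits and the average Huffman length is 2.2 bits, giving roughly 97.7% efficiency.

    import heapq, math

    probs = {"a": 0.4, "b": 0.19, "c": 0.16, "d": 0.15, "e": 0.1}

    # Min-heap of (probability, tiebreak, subtree); merge the two smallest.
    heap = [(p, i, s) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    n = len(heap)
    while len(heap) > 1:
        p1, _, t1 = heapq.heappop(heap)
        p2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, n, (t1, t2)))
        n += 1

    codes = {}
    def assign(tree, prefix=""):
        if isinstance(tree, str):
            codes[tree] = prefix
        else:
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
    assign(heap[0][2])

    L = sum(probs[s] * len(codes[s]) for s in probs)      # average length: 2.2
    H = -sum(p * math.log2(p) for p in probs.values())    # entropy: ~2.15 bits
    print(codes, H / L)                                   # efficiency ~0.977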
(b) Consider Shannon's third theorem, the Channel Capacity Theorem, for a continuous communication channel having bandwidth W Hertz, perturbed by additive white Gaussian noise of power spectral density N0, and average transmitted power P.
1. Is there any limit to the capacity of such a channel if you increase its signal-to-noise ratio P/(N0*W) without limit? If so, what is that limit?
2. Is there any limit to the capacity of such a channel if you can increase its bandwidth W in Hertz without limit, while not changing N0 or P? If so, what is that limit?
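A numerical sketch for part 2 of (b): with P and N0 held fixed, C = W*log2(1 + P/(N0*W)) grows toward the finite limit (P/N0)*log2(e) as W increases.

    import math

    P, N0 = 1.0, 1.0
    for W in (1e3, 1e6, 1e9):
        C = W * math.log2(1 + P / (N0 * W))
        print(W, C)                   # approaches (P/N0)*log2(e) ~ 1.4427 bit/s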
(c) Consider a (7,4) linear block code with the given parity-check matrix H. Construct the code words for this (7,4) code, find the minimum Hamming distance, and show that G*H^T = 0.
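The matrix H referred to in part (c) is not reproduced in this text, so the sketch below assumes the standard systematic (7,4) Hamming form H = [P^T | I3] with G = [I4 | P]; with that assumed H the minimum distance comes out as 3.

    import itertools
    import numpy as np

    P = np.array([[1, 1, 0],          # assumed parity submatrix
                  [0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 1]])
    G = np.hstack([np.eye(4, dtype=int), P])      # generator matrix [I4 | P]
    H = np.hstack([P.T, np.eye(3, dtype=int)])    # parity-check matrix [P^T | I3]

    assert not ((G @ H.T) % 2).any()              # G H^T = 0 (mod 2)

    codewords = [(np.array(m) @ G) % 2
                 for m in itertools.product([0, 1], repeat=4)]
    d_min = min(int(c.sum()) for c in codewords if c.any())
    print(d_min)                                  # minimum Hamming distance: 3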

Q4- Attempt any two of the following:                                  (10x2=20)

(a) Show that the code C = {0000, 0101, 1010, 1111} is a linear cyclic code. Find the generator
polynomial g(x) for C and show that every code polynomial is a multiple of g(x).
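Part (a) can be checked mechanically: closure under bitwise XOR gives linearity, closure under rotation gives the cyclic property, and the lowest-degree nonzero codeword 0101, read as x^2 + 1, is the generator polynomial.

    C = {"0000", "0101", "1010", "1111"}

    def xor(a, b):
        return "".join(str(int(x) ^ int(y)) for x, y in zip(a, b))

    assert all(xor(a, b) in C for a in C for b in C)   # linear: sums stay in C
    assert all(w[1:] + w[0] in C for w in C)           # cyclic: shifts stay in C

    # g(x) = x^2 + 1 (codeword 0101); 1010 = x*g(x) and 1111 = (x+1)*g(x),
    # so every nonzero code polynomial is a multiple of g(x).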
(b) Explain the Markov statistical model for an information source.
(c) Define entropy and the types of entropy, and give the relation between them. Draw the entropy curve for a binary memoryless source.
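For the curve in part (c), tabulating the binary entropy function H(p) = -p*log2(p) - (1-p)*log2(1-p) shows the expected shape: zero at p = 0 and p = 1, with a peak of 1 bit at p = 0.5.

    import math

    for p in (0.01, 0.1, 0.25, 0.5, 0.75, 0.9, 0.99):
        H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
        print(f"p = {p:.2f}  H(p) = {H:.3f} bits")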
Q5- Write short notes on any four of the following:                    (5x4=20)

(a) Measure of information

(b) Burst error correcting code

(c) Special Channels

(d) Differential Entropy

(e) Code Tree and Trellis diagram
