(a) Define entropy and its types, and give the relation between them. Draw the entropy curve
for a binary memoryless source.
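The entropy curve asked for is the binary entropy function H(p). A minimal sketch that tabulates the curve (the values, not the drawing itself):

```python
from math import log2

def H_b(p):
    """Binary entropy function H(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # 0 * log2(0) is taken as 0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# The curve is 0 at p = 0 and p = 1, and peaks at 1 bit when p = 0.5
for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"p = {p:4.2f}  H = {H_b(p):.4f} bit")
```

The symmetry H(p) = H(1-p) is visible in the printed values.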
(b) What is Mutual Information? Discuss the properties of Mutual Information.
(c) Two binary channels are connected in cascade, as shown in Fig. 2.
1. Find the overall channel matrix of the resultant channel and draw the resultant
equivalent channel diagram.
2. Find P(z1) and P(z2) when P(x1) = 0.6, P(x2) = 0.4.
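Since Fig. 2 is not reproduced here, the two channel matrices below are illustrative assumptions; the method is what matters: the overall matrix of a cascade is the product of the individual channel matrices, and the output distribution is the input distribution times that product.

```python
import numpy as np

# Hypothetical channel matrices (Fig. 2 is not reproduced here);
# rows are inputs, columns are outputs, each row sums to 1.
P_y_given_x = np.array([[0.8, 0.2],
                        [0.3, 0.7]])
P_z_given_y = np.array([[0.9, 0.1],
                        [0.2, 0.8]])

# Overall matrix of the cascade: P(z|x) = P(y|x) @ P(z|y)
P_z_given_x = P_y_given_x @ P_z_given_y

# Output distribution for the given input probabilities
p_x = np.array([0.6, 0.4])
p_z = p_x @ P_z_given_x
print(P_z_given_x)
print(p_z)  # [P(z1), P(z2)]
```

With the assumed matrices this gives P(z1) = 0.62, P(z2) = 0.38; with the actual matrices from Fig. 2 the same two lines of algebra apply.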
(a) A DMS X has five symbols with probabilities 0.4, 0.19, 0.16, 0.15, 0.1. Construct a
Shannon-Fano code for X, and calculate the efficiency of the code. Construct another
Shannon-Fano code and compare the results. Repeat for Huffman code and compare the
results.
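For the Huffman part of the question, the average length (and hence the efficiency H/L) can be checked numerically. A sketch using a heap, tracking only each symbol's code length, which is all the efficiency calculation needs:

```python
import heapq
from math import log2

probs = [0.4, 0.19, 0.16, 0.15, 0.1]

# Build a Huffman tree; each merge of two subtrees adds one bit
# to the code length of every symbol under them.
heap = [(p, [i]) for i, p in enumerate(probs)]  # (prob, symbols under node)
lengths = [0] * len(probs)
heapq.heapify(heap)
while len(heap) > 1:
    p1, s1 = heapq.heappop(heap)
    p2, s2 = heapq.heappop(heap)
    for s in s1 + s2:
        lengths[s] += 1
    heapq.heappush(heap, (p1 + p2, s1 + s2))

H = -sum(p * log2(p) for p in probs)            # source entropy (bits/symbol)
L = sum(p * l for p, l in zip(probs, lengths))  # average codeword length
print(H, L, H / L)  # efficiency = H / L
```

For these probabilities the Huffman lengths come out as (1, 3, 3, 3, 3), giving L = 2.2 bits/symbol and an efficiency of about 97.7%, a useful reference when comparing against the two Shannon-Fano codes.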
(b) Discuss Shannon's third theorem. Find the channel capacity of an ideal AWGN channel
with infinite bandwidth.
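The limit asked for follows from C(B) = B log2(1 + S/(N0 B)) tending to (S/N0) log2 e as B grows. A numerical illustration with assumed values for the signal power S and noise spectral density N0:

```python
from math import log2, log

S, N0 = 1.0, 1e-3           # illustrative signal power and noise PSD (assumed)
limit = S / (N0 * log(2))   # C_inf = (S/N0) * log2(e) ≈ 1.44 * S/N0

for B in (1e3, 1e4, 1e5, 1e6):
    C = B * log2(1 + S / (N0 * B))   # Shannon-Hartley capacity at bandwidth B
    print(f"B = {B:>9.0f} Hz   C = {C:10.2f} bit/s")
print(f"limit C_inf = {limit:.2f} bit/s")
```

The printed capacities saturate: increasing the bandwidth beyond a point buys almost nothing, because the noise power N0*B grows with it.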
(c) The parity check matrix of a particular (7,4) linear block code is given by

          [ 1 1 1 0 1 0 0 ]
    [H] = [ 1 1 0 1 0 1 0 ]
          [ 1 0 1 1 0 0 1 ]
i. Find the generator matrix.
ii. List all the code vectors.
iii. What is the minimum distance between the code words?
iv. How many errors can be detected? How many errors can be corrected?
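Since H above is in the systematic form [Pᵀ | I₃], the generator matrix is G = [I₄ | P]. A sketch that builds G, enumerates all 2⁴ code vectors, and reads off the minimum distance:

```python
import numpy as np
from itertools import product

H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])

# H = [P^T | I3]  =>  G = [I4 | P]
P = H[:, :4].T
G = np.hstack([np.eye(4, dtype=int), P])
assert not (G @ H.T % 2).any()   # sanity check: G H^T = 0 over GF(2)

# All 2^4 = 16 code vectors
codewords = [tuple(np.array(m) @ G % 2) for m in product([0, 1], repeat=4)]

# For a linear code, d_min = minimum weight of a nonzero codeword
d_min = min(sum(c) for c in codewords if any(c))
print(len(codewords), d_min)
# d_min = 3 => detects up to d_min - 1 = 2 errors, corrects floor((d_min-1)/2) = 1
```

This reproduces parts i–iv: 16 code vectors, d_min = 3, so 2 detectable and 1 correctable error.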
(a) Show that the code C = {0000, 0101, 1010, 1111} is a linear cyclic code. Find the generator
polynomial g(x) for C and show that every code polynomial is a multiple of g(x).
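A sketch that verifies the three claims mechanically: C is closed under addition (linear), closed under cyclic shifts (cyclic), and every code polynomial is divisible by g(x) = 1 + x², the lowest-degree nonzero code polynomial (reading bit cᵢ as the coefficient of xⁱ, an assumed convention):

```python
C = {"0000", "0101", "1010", "1111"}

def xor(a, b):   # componentwise GF(2) addition
    return "".join(str(int(x) ^ int(y)) for x, y in zip(a, b))

def shift(c):    # one cyclic shift
    return c[-1] + c[:-1]

# Linear: closed under addition; cyclic: closed under shifts
assert all(xor(a, b) in C for a in C for b in C)
assert all(shift(c) in C for c in C)

def divisible_by_g(c):
    """Long division of c(x) by g(x) = 1 + x^2 over GF(2); True if remainder 0."""
    bits = [int(b) for b in c]
    for i in range(len(bits) - 1, 1, -1):
        if bits[i]:              # cancel the x^i term with x^(i-2) * g(x)
            bits[i] ^= 1
            bits[i - 2] ^= 1
    return bits[0] == 0 and bits[1] == 0

assert all(divisible_by_g(c) for c in C)
print("C is linear, cyclic, and every codeword is a multiple of g(x) = 1 + x^2")
```

For instance 0101 ↔ x + x³ = x·(1 + x²) and 1111 ↔ (1 + x)(1 + x²), matching the divisibility check.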
(b) Explain Markov statistical model for information source.
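A Markov source's entropy rate is the transition entropy of each state weighted by the stationary probability of being in that state. A minimal two-state sketch with assumed transition probabilities:

```python
import numpy as np

# Illustrative two-state Markov source (assumed transition probabilities)
T = np.array([[0.9, 0.1],
              [0.4, 0.6]])   # T[i][j] = P(next state j | current state i)

# Stationary distribution: solve pi = pi @ T via the eigenvector of T^T
# for eigenvalue 1
eigvals, eigvecs = np.linalg.eig(T.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

# Entropy rate: H = sum_i pi_i * H(row i of T)
H_rate = -sum(pi[i] * sum(p * np.log2(p) for p in T[i] if p > 0)
              for i in range(len(pi)))
print(pi, H_rate)
```

For this chain pi = (0.8, 0.2) and the entropy rate is about 0.57 bit/symbol, below the 1 bit of a memoryless binary source: the state memory reduces the average uncertainty per symbol.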
(c) Find the channel capacity of the binary erasure channel.
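For the binary erasure channel with erasure probability p, I(X;Y) = (1 - p) H_b(q) where q = P(x1), so the capacity C = 1 - p is reached at q = 1/2. A numerical sketch that maximizes I(X;Y) over a grid of input distributions:

```python
from math import log2

def mutual_info_bec(q, p):
    """I(X;Y) for a BEC: input P(x1) = q, erasure probability p."""
    def h(t):   # term -t * log2(t), with 0 * log2(0) taken as 0
        return -t * log2(t) if t > 0 else 0.0
    # Outputs: x1 w.p. q(1-p), erasure w.p. p, x2 w.p. (1-q)(1-p)
    H_Y = h(q * (1 - p)) + h(p) + h((1 - q) * (1 - p))
    H_Y_given_X = h(p) + h(1 - p)   # same for either input value
    return H_Y - H_Y_given_X

p = 0.2   # illustrative erasure probability
best_q = max((q / 100 for q in range(1, 100)),
             key=lambda q: mutual_info_bec(q, p))
print(best_q, mutual_info_bec(best_q, p))  # q = 0.5 gives C = 1 - p = 0.8
```

The grid search confirms the analytical answer: the maximum sits at the uniform input, and the capacity equals the fraction of symbols that are not erased.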
(e) Companding