
Department of Electrical Engineering

Indian Institute of Technology Jodhpur


EE 321: Contemporary Communication Systems
2016-17 Second Semester (January - May 2017)

Tutorial 2
Source Encoder

Question T2.1
An information source has six possible outputs with probabilities as shown in the table. Codes A, B, C,
D, E and F, as given in the table, are considered.
1) Which of these codes are uniquely decodable?
2) Which are instantaneous decodable codes?
3) Find the average codeword length L for all the uniquely decodable codes.

Output  P(si)   A     B        C        D      E      F
s1      1/2     000   0        0        0      0      0
s2      1/4     001   01       10       10     10     100
s3      1/16    010   011      110      110    1100   101
s4      1/16    011   0111     1110     1110   1101   110
s5      1/16    100   01111    11110    1101   1110   111
s6      1/16    101   011111   111110   1011   1111   001
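Parts (1) and (2) can be checked mechanically. The instantaneous (prefix) property is a simple pairwise test; unique decodability in general needs the Sardinas-Patterson algorithm. A Python sketch of both:

```python
def is_prefix_free(code):
    """Instantaneous (prefix) code: no codeword is a prefix of another."""
    return not any(u != v and v.startswith(u) for u in code for v in code)

def is_uniquely_decodable(code):
    """Sardinas-Patterson test: the code is uniquely decodable iff no set
    of dangling suffixes ever contains a full codeword."""
    C = set(code)

    def dangling(A, B):
        # suffixes w such that a + w = b for some a in A, b in B
        return {b[len(a):] for a in A for b in B
                if b.startswith(a) and len(b) > len(a)}

    S = dangling(C, C)        # suffixes left when one codeword starts another
    seen = set()
    while S and not (S & C):
        seen |= S
        S = (dangling(C, S) | dangling(S, C)) - seen
    return not (S & C)
```

For example, a code such as {0, 01, 011} fails the prefix test (0 starts 01) yet passes Sardinas-Patterson, so a code can be uniquely decodable without being instantaneous.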

Question T2.2
Consider a Huffman code over four symbols, A, B, C, and D. Which of these are valid Huffman
encodings? Give a brief explanation for your decisions.
1) A : 0, B : 11, C : 101, D : 100.
2) A : 1, B : 01, C : 00, D : 010.
3) A : 00, B : 01, C : 110, D : 111.
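One way to screen the candidates: a binary Huffman code is always prefix-free and corresponds to a full binary tree, so its lengths satisfy the Kraft equality (the sum of 2^(-li) is exactly 1, with no unused leaves). A small sketch of this check:

```python
from fractions import Fraction

def could_be_huffman(code):
    """Necessary properties of a binary Huffman code: prefix-free, and
    complete (every tree node has both children used, so the Kraft sum
    over the codeword lengths equals exactly 1)."""
    prefix_free = not any(u != v and v.startswith(u) for u in code for v in code)
    kraft = sum(Fraction(1, 2 ** len(w)) for w in code)
    return prefix_free and kraft == 1
```

A code that fails either test cannot come out of the Huffman procedure; for instance, a Kraft sum below 1 means some codeword (an unused branch) was wasted.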

Question T2.3
A zero-memory source emits seven messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, and 1/64,
respectively. Obtain the compact binary codes and find the average length of the code word. Determine
the efficiency and the redundancy of the code.
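The binary Huffman construction can be sketched with a priority queue; returning only the codeword lengths is enough to compute the average length L, the efficiency, and the redundancy (a sketch, using this question's probabilities as the example input):

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Binary Huffman construction: repeatedly merge the two least probable
    nodes; every merge adds one bit to each symbol in the merged subtrees."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/64]
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
H = -sum(p * log2(p) for p in probs)
efficiency = H / L
```

Note that when every probability is a power of 1/2 (a dyadic source), the Huffman lengths coincide with -log2(pi), so L equals the entropy H and the redundancy is zero.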

Question T2.4
A zero-memory source emits seven messages with probabilities 1/3, 1/3, 1/9, 1/9, 1/27, 1/27, and 1/27,
respectively. Obtain the compact 3-ary codes and find the average length of the code word. Determine
the efficiency and the redundancy of the code.

Question T2.5
A zero-memory source emits six messages with probabilities 0.3, 0.25, 0.15, 0.13, 0.1, and 0.07,
respectively. Find the 4-ary Huffman code. Determine its average word length, efficiency and redundancy.
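For D-ary Huffman codes (T2.4 and T2.5) the only twist is padding with zero-probability dummy symbols so that every merge combines exactly D nodes and the final tree is full; a sketch:

```python
import heapq

def dary_huffman_lengths(probs, D):
    """D-ary Huffman: pad with dummies so the symbol count n satisfies
    (n - 1) mod (D - 1) == 0, then repeatedly merge the D least probable
    nodes; each merge adds one digit to every symbol beneath it."""
    n = len(probs)
    pad = (-(n - 1)) % (D - 1)                # number of dummy symbols needed
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heap += [(0.0, [n + k]) for k in range(pad)]
    heapq.heapify(heap)
    lengths = [0] * (n + pad)
    while len(heap) > 1:
        group, ptot = [], 0.0
        for _ in range(D):
            p, syms = heapq.heappop(heap)
            ptot += p
            group += syms
        for s in group:
            lengths[s] += 1
        heapq.heappush(heap, (ptot, group))
    return lengths[:n]                        # drop the dummy symbols
```

For T2.5's six messages with D = 4, the padding rule adds one dummy, so the first merge combines the dummy with the three least probable real messages.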

Question T2.6
A zero-memory source emits s1 and s2 with probabilities 0.7 and 0.3, respectively. Find the optimum
(Huffman) binary code for this source as well as its second and third order extensions. Determine the
code efficiencies in each case.
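The gain from encoding source extensions can be computed numerically. A self-contained sketch (for a memoryless source, the block probabilities of the n-th extension are products of symbol probabilities, and the n-th extension's entropy is n times the source entropy):

```python
import heapq
from itertools import product
from math import log2, prod

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code for `probs`."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit below it
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return sum(p * l for p, l in zip(probs, lengths))

def extension_efficiency(probs, n):
    """Efficiency n*H / L_n of the Huffman code for the n-th extension
    of a memoryless source."""
    ext = [prod(block) for block in product(probs, repeat=n)]
    H = -sum(p * log2(p) for p in probs)     # entropy per source symbol
    return n * H / huffman_avg_length(ext)
```

For a binary source the first-order code is forced to one bit per symbol, so the efficiency simply equals H; encoding extensions lets the average length per source symbol approach H from above.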

Question T2.7
A source emits three equiprobable messages randomly and independently.
1) Find the source entropy.
2) Find a compact ternary code, the average length of the code word, the code efficiency and the
redundancy.
3) Repeat (2) for a compact binary code.
4) To improve the efficiency of the binary code we now encode the second extension of the source.
Find a compact binary code, the average length of the code word, the code efficiency and the
redundancy.

Question T2.8
Consider an information source with four symbols, A, B, C, D, and associated symbol probabilities
pA ≥ pB ≥ pC ≥ pD. Write down a single condition (equation or inequality) that is both necessary
and sufficient to guarantee that the Huffman code constructed over these symbols encodes
each symbol using exactly two bits. Explain your answer.

Question T2.9
Consider an information source with four symbols, A, B, C, D, and associated symbol probabilities
pA ≥ pB ≥ pC ≥ pD. Assume that Huffman coding is used to compress the source data and that two
different tree constructions with the following properties are possible:
First tree: the length of the longest path from the root to a symbol (the longest codeword) is 3.
Second tree: the length of the longest path from the root to a symbol is 2.
Derive one constraint relating the symbol probabilities which ensures that both tree constructions
are feasible.
1) Write down the constraint.
2) Show the resulting codes.

Question T2.10
Explain whether the following statement is True or False. Recall that a codeword in LZW is an index into the
string table. Suppose the sender adds two strings with corresponding codewords c1 and c2, in that order,
to its string/code table. Then it may transmit c2 for the first time before it transmits c1.

Question T2.11
Describe the contents of the string table created when encoding a very long string of all a's using the
simple version of the LZW encoder. If the decoder has received E encoded symbols (i.e., string table
indices) from the encoder, how many a's has it been able to decode?
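The run-of-a's behaviour is easy to reproduce with a minimal LZW encoder (a sketch; the table maps strings to codes, starting from the 256 single-byte entries):

```python
def lzw_encode(data):
    """Simple LZW encoder: extend the current string while it is in the
    table; otherwise emit its code and add the extended string."""
    table = {chr(i): i for i in range(256)}
    string, out = "", []
    for ch in data:
        if string + ch in table:
            string += ch
        else:
            out.append(table[string])
            table[string + ch] = len(table)   # next free index: 256, 257, ...
            string = ch
    if string:
        out.append(table[string])             # flush the final string
    return out
```

Encoding "a" * 15, for instance, emits the codes 97, 256, 257, 258, 259 while the table gains "aa", "aaa", "aaaa", "aaaaa": on a long enough run the k-th emitted code covers k a's, which suggests the pattern to look for in the question.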

Question T2.12
Consider the pseudo-code for the LZW decoder given below:
initialize TABLE[0 to 255] = code for individual bytes
CODE = read next code from encoder
STRING = TABLE[CODE]
output STRING
while there are still codes to receive:
    CODE = read next code from encoder
    if TABLE[CODE] is not defined:
        ENTRY = STRING + STRING[0]
    else:
        ENTRY = TABLE[CODE]
    output ENTRY
    add STRING + ENTRY[0] to TABLE
    STRING = ENTRY
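A direct Python transcription of this pseudocode may help when tracing the codes by hand (a sketch; the undefined-code branch is the classic case where the encoder uses a table entry one step before the decoder has defined it):

```python
def lzw_decode(codes):
    """LZW decoder following the pseudocode above."""
    table = {i: chr(i) for i in range(256)}
    it = iter(codes)
    string = table[next(it)]
    out = [string]
    for code in it:
        # Undefined code: the encoder is exactly one table entry ahead.
        entry = table.get(code, string + string[0])
        out.append(entry)
        table[len(table)] = string + entry[0]   # next free index: 256, 257, ...
        string = entry
    return "".join(out)
```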
Suppose that this decoder has received the following five codes from the LZW encoder (these are the
first five codes from a longer compression run):
97  (index of a in the translation table)
98  (index of b in the translation table)
257 (index of the second addition to the translation table)
256 (index of the first addition to the translation table)
258 (index of the third addition to the translation table)
After it has finished processing the fifth code, what are the entries in TABLE and what is the cumulative
output of the decoder?
