Information Theory
Shriram Nandakumar
Department of Electronics & Communication,
Amrita School of Engineering,
Amrita Vishwa Vidyapeetham University,
Amritapuri Campus.
Thanks to.
Gilbert Strang - My role model in teaching. Accessible to all on the Internet through his video lectures!
Shri. R. Srinivasa Varadhan - Mathematics teacher.
Shri. R. Varadhan - Chemistry teacher.
Mrs. V. Uma Maheswari - Introduced me to the world of Signal Processing in a lucid way.
Prof. Dimitris A. Pados - Taught me Information Theory.
Dr. Sundararaman Gopalan - Inspires me on how to stay simple & be rational.
Dr. Nithin Nagaraj - My role model for doing good research.
Finally... My Parents!
PROBABILITY PRIMER
Conditional Probability
How to view Conditional Probability?
Independence of Events
Independent Events
Important view: conditioning an event A on some other event B doesn't alter P(A).
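That view can be checked by direct enumeration. The following sketch uses a hypothetical example (two fair dice, not from the slides): A = "first die is even", B = "the sum is 7". It verifies that P(A | B) = P(A), i.e. conditioning on B does not alter P(A).

```python
from fractions import Fraction

# Sample space: all 36 equally likely outcomes of two fair dice
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, 36)

def prob(event):
    """Exact probability of an event (a predicate on outcomes)."""
    return sum(p for o in outcomes if event(o))

A = lambda o: o[0] % 2 == 0        # first die is even
B = lambda o: o[0] + o[1] == 7     # the sum is 7

p_A = prob(A)
p_B = prob(B)
p_A_and_B = prob(lambda o: A(o) and B(o))
p_A_given_B = p_A_and_B / p_B      # conditional probability P(A | B)

print(p_A, p_A_given_B)            # both 1/2: conditioning on B leaves P(A) unchanged
print(p_A_and_B == p_A * p_B)      # True: A and B are independent
```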
Probability Primer
Independence DOESN'T mean that one event has no effect on the other(s).
Independence of Events
Independence of 3 events (can be extended to N events)
If the 4th condition is not satisfied, A, B, C are only pair-wise independent.
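The gap between pairwise and full independence can be seen in the classic XOR example (my illustration, not from the slides): take two fair bits and let A = {X1 = 1}, B = {X2 = 1}, C = {X1 XOR X2 = 1}. All three pairwise conditions hold, but the 4th condition fails.

```python
from fractions import Fraction
from itertools import product

# Sample space: two independent fair bits, each outcome has probability 1/4
outcomes = list(product([0, 1], repeat=2))
p = Fraction(1, 4)

def prob(event):
    return sum(p for o in outcomes if event(o))

A = lambda o: o[0] == 1
B = lambda o: o[1] == 1
C = lambda o: (o[0] ^ o[1]) == 1   # XOR of the two bits

# The three pairwise conditions hold...
pairs = [(A, B), (A, C), (B, C)]
print(all(prob(lambda o, e=e, f=f: e(o) and f(o)) == prob(e) * prob(f)
          for e, f in pairs))       # True: pairwise independent

# ...but the 4th condition P(A∩B∩C) = P(A)P(B)P(C) fails:
p_ABC = prob(lambda o: A(o) and B(o) and C(o))
print(p_ABC, prob(A) * prob(B) * prob(C))   # 0 versus 1/8
```

Knowing any two of the events pins down the third, so the triple is not mutually independent even though every pair is.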
Bayes theorem
How to view Bayes Theorem?
Please start viewing it as a tool for INFERENCE.
Assess the validity of an event when some other
event has been observed.
Assess your hypothesis in light of new evidence.
Prior Probability
Posterior Probability
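The prior-to-posterior update can be sketched numerically. The numbers here are assumed for illustration (not from the slides): a bag holds a fair coin and a biased coin with P(head) = 3/4; we pick one uniformly at random, toss it once, and observe a head.

```python
from fractions import Fraction

# Prior belief about which coin we hold, before any evidence
prior = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}
# Likelihood of the observed evidence (a head) under each hypothesis
likelihood_head = {"fair": Fraction(1, 2), "biased": Fraction(3, 4)}

# Bayes' theorem: posterior = prior * likelihood / evidence
evidence = sum(prior[c] * likelihood_head[c] for c in prior)   # P(head) = 5/8
posterior = {c: prior[c] * likelihood_head[c] / evidence for c in prior}

print(posterior)   # posterior: fair -> 2/5, biased -> 3/5
```

Seeing a head shifts belief toward the biased coin, exactly the "assess your hypothesis in light of new evidence" view above.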
Variance:
Random variables:
X - The coin we pick: {0, 1}
Y - The number of heads: {0, 1, ..., N}.
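A minimal sketch of this experiment, with assumed biases (my choice, not from the slides): coin X = 0 has P(head) = 1/2, coin X = 1 has P(head) = 3/4, and we toss the picked coin N = 3 times.

```python
from fractions import Fraction
from math import comb

N = 3
bias = {0: Fraction(1, 2), 1: Fraction(3, 4)}   # assumed per-coin head probabilities
p_x = Fraction(1, 2)                             # uniform prior over the two coins

# Joint PMF: P(X = x, Y = y) = P(X = x) * Binomial(y; N, bias[x])
joint = {(x, y): p_x * comb(N, y) * bias[x]**y * (1 - bias[x])**(N - y)
         for x in bias for y in range(N + 1)}

print(sum(joint.values()) == 1)   # True: a valid PMF sums to 1

# Conditional of Y given X = 1: divide the joint by the marginal P(X = 1)
p_y_given_x1 = [joint[(1, y)] / p_x for y in range(N + 1)]
print(p_y_given_x1)               # the Binomial(3, 3/4) distribution
```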
Joint Distribution
[Figure: bar plot of the joint PMF of X and Y]
Conditionals of Y (conditioned on X)
[Figure: bar plots of P(Y | X = x) for each value of x]
Joint PMF
[Figure: bar plot of the joint PMF]
Conditional of Y (conditioned on X)
[Figure: bar plots of P(Y | X = x) for each value of x]
Joint PMF
[Figure: bar plot of the joint PMF]
Comparison of conditionals
[Figure: the conditional PMFs of Y plotted side by side]
How to find if X is uniform? Are the biases the same?
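One way to answer the question above: marginalize the joint PMF over y and check whether P(X) puts equal mass on every x. The joint used here is an assumed example (two coins, biases 1/2 and 3/4, tossed N = 3 times), not taken from the slides.

```python
from fractions import Fraction
from math import comb

N = 3
bias = {0: Fraction(1, 2), 1: Fraction(3, 4)}   # assumed biases
joint = {(x, y): Fraction(1, 2) * comb(N, y) * b**y * (1 - b)**(N - y)
         for x, b in bias.items() for y in range(N + 1)}

# Marginal of X: sum the joint over all values of y
marginal_x = {x: sum(p for (xx, _), p in joint.items() if xx == x)
              for x in bias}

print(marginal_x)                            # both entries are 1/2 here
print(len(set(marginal_x.values())) == 1)    # True: X is uniform
```

Whether the biases are the same is read off the conditionals instead: the coins have equal bias exactly when P(Y | X = 0) and P(Y | X = 1) coincide.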
A solidifying example!
Application Example
Coding Theory (a simple error-correcting code for the Binary Symmetric Channel)
Repetition Codes
Posterior P(s | r) ∝ Likelihood P(r | s) × Prior P(s)
Guess ŝ = 1 if P(s = 1 | r) > P(s = 0 | r); guess ŝ = 0 vice-versa.
For equally likely hypotheses s = 0 and s = 1, maximizing the a posteriori probability is equivalent to maximizing the likelihood.
With Majority Vote Decoding (MVD), where f is the channel's flip probability:
Pr(error) = Prob of 3 flips + Prob of 2 flips
= f³ (less likely) + 3f²(1 − f) (more likely)
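A quick simulation sketch of the R3 repetition code over a binary symmetric channel, with an assumed flip probability f = 0.1 (parameters are mine for illustration). The simulated bit-error rate should approach 3f²(1 − f) + f³.

```python
import random

f = 0.1                      # assumed BSC flip probability
rng = random.Random(0)       # fixed seed for reproducibility

def transmit(bit, n_trials=200_000):
    """Send `bit` through the R3 code n_trials times; return error rate."""
    errors = 0
    for _ in range(n_trials):
        # Encode: repeat the bit 3 times. Channel: flip each copy w.p. f.
        received = [bit ^ (rng.random() < f) for _ in range(3)]
        decoded = int(sum(received) >= 2)    # majority vote
        errors += decoded != bit
    return errors / n_trials

ber = transmit(0)
analytic = 3 * f**2 * (1 - f) + f**3   # 2 flips (more likely) + 3 flips
print(ber, analytic)                    # both close to 0.028
```

Note the trade-off on the next slide: the error probability drops from f to about 3f², but the rate falls to 1/3.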
Repetition Codes
Rate vs Error Probability
2 Important Properties
Why?
Redundancy
Measures the fractional difference between H(X) and its maximum possible value, log₂|A_X|.
Redundancy = 1 − H(X) / log₂|A_X|
What is the redundancy in a fair-coin toss?
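The definition above is easy to check numerically; this sketch evaluates it for a fair coin and, for contrast, a biased one (the 0.9/0.1 bias is my example).

```python
from math import log2

def entropy(pmf):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * log2(p) for p in pmf if p > 0)

def redundancy(pmf):
    """1 - H(X) / log2 |A_X|, the fractional shortfall from maximum entropy."""
    return 1 - entropy(pmf) / log2(len(pmf))

print(redundancy([0.5, 0.5]))    # 0.0: a fair coin has no redundancy
print(redundancy([0.9, 0.1]))    # ~0.53: a biased coin is redundant
```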
Joint Entropy
Conditional Entropy
Quick Example:
You have 3 coins. One is fair, the 2nd is double-headed & the 3rd is double-tailed. You are blindfolded & made to pick a coin and toss it. You look at one of its faces & see a head. Does seeing one of the faces reduce the uncertainty regarding the coin you have picked? Evaluate all entropies.
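The example can be worked numerically (the labels C for the coin and the "head" observation are mine): pick C uniformly from {fair, double-head, double-tail}, apply Bayes' theorem to the observed head, and compare entropies before and after.

```python
from fractions import Fraction
from math import log2

p_c = Fraction(1, 3)   # uniform prior over the three coins
p_head_given_c = {"fair": Fraction(1, 2),
                  "double-head": Fraction(1),
                  "double-tail": Fraction(0)}

def H(pmf):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(float(p) * log2(float(p)) for p in pmf if p > 0)

p_head = sum(p_c * p_head_given_c[c] for c in p_head_given_c)       # 1/2
posterior = {c: p_c * p_head_given_c[c] / p_head for c in p_head_given_c}

print(H([p_c] * 3))                   # H(C) = log2(3), about 1.585 bits
print(H(list(posterior.values())))    # H(C | head) = H(1/3, 2/3), about 0.918 bits
# Seeing a head removes about 2/3 of a bit of uncertainty about the coin.
```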
Mutual Information
Various Relationships
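The standard relationships can be verified numerically. This sketch checks, on a small joint PMF of my choosing, that I(X;Y) = H(X) + H(Y) − H(X,Y) agrees with I(X;Y) = H(Y) − H(Y|X).

```python
from math import log2

# Assumed joint PMF over X, Y in {0, 1} (illustrative numbers only)
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.4, (1, 1): 0.1}

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Marginals by summing the joint over the other variable
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

H_xy = H(joint.values())
I = H(p_x.values()) + H(p_y.values()) - H_xy       # I(X;Y), first form

# Chain rule: H(Y|X) = H(X,Y) - H(X), so H(Y) - H(Y|X) gives the second form
H_y_given_x = H_xy - H(p_x.values())
print(I, H(p_y.values()) - H_y_given_x)             # the two expressions agree
```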
References
"A Mathematical Theory of Communication" - Claude E. Shannon's classic 1948 paper.
"Information Theory, Inference & Learning Algorithms" - David J. C. MacKay, Cambridge University Press.
"Intuitive Probability and Random Processes using MATLAB" - Steven M. Kay, Springer.
"A Light Discussion and Derivation of Entropy" - tutorial paper, Jonathan Shlens, Systems Neurobiology Laboratory, Salk Institute for Biological Studies.