Muhammad Shahzad
School of Electrical Engineering & Computer Science
National University of Sciences & Technology (NUST)
Pakistan
Course Information
Lecture Timings:
Tues: 5:00pm-5:50pm
Wednesday: 6:00pm-7:50pm
Office Hours
Monday: 4:00pm-5:00pm
Wednesday: 4:00pm-5:00pm
Office is located on the first floor, room A-213
Textbook
Course Outline
Syllabus
Introduction to Probability Theory
Functions of Random Variables
Limits and Inequalities
Stochastic Processes
Prediction and Estimation
Markov Chains and Processes (time permitting)
Assorted Topics (time permitting)
Grading
Final Exam: 40%
Midterm Exam: 30%
Quizzes: 20%
Homework Assignments: 10%
Policies
Quizzes may be announced or unannounced and will take place at
the start of Wednesday classes
Credits and Acknowledgements
I would like to thank Dr. Ali Khayam for providing these lecture
slides
Why this course?
Stochastic theory is an extension of probability theory
What will we cover in this lecture?
This lecture is intended to be an introduction to elementary
probability theory
We will cover:
Random Experiments and Random Variables
Axioms of Probability
Mutual Exclusivity
Conditional Probability
Independence
Law of Total Probability
Bayes Theorem
Definition of Probability
Probability: [m-w.org]
1 : the quality or state of being probable
2 : something (as an event or circumstance) that is probable
3 a (1) : the ratio of the number of outcomes in an exhaustive
set of equally likely outcomes that produce a given event to
the total number of possible outcomes (2) : the chance that a
given event will occur b : a branch of mathematics concerned
with the study of probabilities
4 : a logical relation between statements such that evidence
confirming one confirms the other to some degree
Definition of a Random Experiment
A random experiment comprises:
A procedure
An outcome
Procedure (e.g., flipping a coin)
Outcome (e.g., the value observed [head, tail] after flipping the coin)
Sample Space (the set of all possible outcomes)
Definition of a Random Experiment: Outcomes,
Events and the Sample Space
An outcome cannot be further decomposed into other outcomes
{s1 = the value 1}, …, {s6 = the value 6}
[Figure: Venn diagram of the sample space S = {s1, …, s6}]
Definition of a Random Experiment: Outcomes,
Events and the Sample Space
Example of a Random Experiment:
Experiment: Roll a fair die once and record the number of dots on the
top face
S = {1, 2, 3, 4, 5, 6}
A = the outcome is even = {2, 4, 6}
B = the outcome is greater than 4 = {5, 6}
Axioms of Probability
Probability of any event A is non-negative:
Pr{A} ≥ 0
Mutual Exclusivity
For mutually exclusive events A1, A2, …, AN, we have:
Pr{A1 U A2 U … U AN} = Pr{A1} + Pr{A2} + … + Pr{AN}
Exercise: Find Pr{A1 U A2} and Pr{A1} + Pr{A2} in the fair die example.
Mutual Exclusivity
In general, we have:
Pr{A1 U A2} = Pr{A1} + Pr{A2} − Pr{A1 ∩ A2}
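As a quick check, the rule can be verified by enumeration. A minimal sketch in Python, using the fair-die events from the earlier example (A1 = the outcome is even, A2 = the outcome is greater than 4):

```python
from fractions import Fraction

# Fair die: six equally likely outcomes.
S = {1, 2, 3, 4, 5, 6}
A1 = {2, 4, 6}   # the outcome is even
A2 = {5, 6}      # the outcome is greater than 4

def pr(event):
    """Probability of an event when all outcomes are equally likely."""
    return Fraction(len(event), len(S))

# Inclusion-exclusion: Pr{A1 U A2} = Pr{A1} + Pr{A2} - Pr{A1 ∩ A2}
lhs = pr(A1 | A2)
rhs = pr(A1) + pr(A2) - pr(A1 & A2)
print(lhs, rhs)  # 2/3 2/3
```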
Conditional Probability
Given that event B has already occurred, what is the probability
that event A will occur?
The fact that event B has already occurred reduces the sample
space for A
[Figure: once B = {s1, s5, s6} has occurred, outcomes s2, s3, s4 cannot occur]
Conditional Probability
Given that event B has already occurred, we define a new
conditional sample space that only contains B's outcomes
The new event space for A is the intersection of A and B:
EA|B = A ∩ B
S|B = {s1, s5, s6}
EA|B = A ∩ B = {s6}
Conditional Probability
Consider again the experiment where we roll a fair die and
record the number of dots on the top face
For this experiment, what is Pr{A|B} in the example above?
Pr{A|B} = Pr{s6|B}= 1/3
We need to normalize all probabilities in a conditional sample
space with Pr{B}
S|B = {s1, s5, s6}
Conditional Probability
We need to normalize all probabilities in a conditional sample
space with Pr{B}
Pr{s1|B} = Pr{s1}/Pr{B} = (1/6)/(1/2) = 1/3
Pr{s5|B} = Pr{s5}/Pr{B} = (1/6)/(1/2) = 1/3
Pr{s6|B} = Pr{s6}/Pr{B} = (1/6)/(1/2) = 1/3
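The normalization step above can be sketched in code; B = {s1, s5, s6} follows the slide, with the outcomes labeled 1 through 6:

```python
from fractions import Fraction

# Fair die: every outcome si has probability 1/6.
probs = {s: Fraction(1, 6) for s in range(1, 7)}
B = {1, 5, 6}  # conditioning event from the slide, Pr{B} = 1/2

pr_B = sum(probs[s] for s in B)

# Each surviving outcome is renormalized by Pr{B}: (1/6) / (1/2) = 1/3
cond = {s: probs[s] / pr_B for s in B}
print(pr_B, cond[6])  # 1/2 1/3
```

Note that the renormalized probabilities sum to 1, as any valid sample space requires.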
Conditional Probability
The probability of an event A in the conditional sample space is:
Pr{A|B} = Pr{A ∩ B} / Pr{B}
Independence
Two events are independent if they do not provide any
information about each other
Pr{A|B} = Pr{A}
In other words, the fact that B has already happened does not
affect the probability of As outcomes
Independence
Note that independence is equivalently characterized by:
Pr{A ∩ B} = Pr{A|B} Pr{B} = Pr{A} Pr{B}
Independence
In general, for n mutually independent events A1, …, An, we have:
Pr{A1 ∩ A2 ∩ … ∩ An} = Pr{A1} Pr{A2} … Pr{An}
Independence: Example
Are events A and C independent?
Assume that all outcomes are equally likely
[Figure: Venn diagram over S = {s1, …, s6} defining events A and C]
Independence: Example
Are events A and C independent?
Yes: Pr{A ∩ C} = Pr{s5} = 1/6
Pr{A} Pr{C} = (3/6) × (2/6) = 1/6
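Since the original Venn diagram does not survive in this text, the sets below are illustrative choices matching the slide's counts (|A| = 3, |C| = 2, A ∩ C = {s5}); the independence test itself is general:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # equally likely outcomes s1..s6
A = {4, 5, 6}            # assumed 3-outcome event containing s5
C = {1, 5}               # assumed 2-outcome event overlapping A only in s5

def pr(event):
    return Fraction(len(event), len(S))

# A and C are independent iff Pr{A ∩ C} = Pr{A} Pr{C}
independent = pr(A & C) == pr(A) * pr(C)
print(independent)  # True: 1/6 == (3/6) * (2/6)
```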
Independence: Example
Are events A and B independent?
Assume that all outcomes are equally likely
Independence: Example
Are events A and B independent?
NO: Pr{A ∩ B} = Pr{s5} = 1/6
Pr{A} Pr{B} = (3/6) × (3/6) = 1/4 ≠ 1/6
Independence: Example
Are events A and B independent?
Assume that all outcomes are equally likely
[Figure: Venn diagram over S = {s1, …, s6} with mutually exclusive events A and B]
Independence: Example
Are events A and B independent?
NO: Pr{A ∩ B} = Pr{∅} = 0
Pr{A} Pr{B} = (2/6) × (3/6) = 1/6 ≠ 0
Recall that A and B are mutually exclusive
Example: Mutual Exclusivity and Independence
Experiment: Roll a fair die twice and record the number of dots on
the top face:
S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
(2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
(3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
(4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
(5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
(6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }
Example: Mutual Exclusivity and Independence
Find the probability of the following events:
A1 = first roll gives an odd number
A2 = second roll gives an odd number
C = the sum of the two rolls is odd
Example: Mutual Exclusivity and Independence
Pr{A1} = Pr{first roll gives an odd number} = 18/36 = 1/2
Pr{A2} = Pr{second roll gives an odd number} = 18/36 = 1/2
C = the sum of the two rolls is odd
Example: Mutual Exclusivity and Independence
C = the sum of the two rolls is odd
Let
C1 = first roll is odd and second is even = A1 ∩ A2ᶜ
C2 = first roll is even and second is odd = A1ᶜ ∩ A2
C = (A1 ∩ A2ᶜ) U (A1ᶜ ∩ A2)
Pr{C} = Pr{A1} Pr{A2ᶜ} + Pr{A1ᶜ} Pr{A2}
      = Pr{A1}(1 − Pr{A2}) + (1 − Pr{A1}) Pr{A2}
      = 1/2
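The derivation above can be confirmed by enumerating all 36 outcomes of the two-roll experiment:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely (first roll, second roll) pairs.
S = list(product(range(1, 7), repeat=2))

A1 = {s for s in S if s[0] % 2 == 1}           # first roll odd
A2 = {s for s in S if s[1] % 2 == 1}           # second roll odd
C  = {s for s in S if (s[0] + s[1]) % 2 == 1}  # sum of the rolls is odd

def pr(event):
    return Fraction(len(event), len(S))

# Pr{C} = Pr{A1}(1 - Pr{A2}) + (1 - Pr{A1}) Pr{A2}
print(pr(C), pr(A1) * (1 - pr(A2)) + (1 - pr(A1)) * pr(A2))  # 1/2 1/2
```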
Four Rules of Thumb from what we have
studied so far
1. Whenever you see two events in an OR relationship (i.e., event A or
event B), the combined event is their union, {A U B}
Example: On a binary channel, find the probability of error?
An error occurs when
A: a 0 is transmitted and a 1 is received OR
B: a 1 is transmitted and a 0 is received
Thus probability of error is: Pr{A U B}
[Figure: binary channel; T0→R0 with Pr{R0|T0}, T1→R1 with Pr{R1|T1}, plus crossover transitions T0→R1 and T1→R0]
Four Rules of Thumb from what we have
studied so far
2. Whenever you see two events in an AND relationship (i.e., both
event A and event B), the combined event is their intersection, {A ∩ B}
Example: On a binary channel, find the probability that a 0 is transmitted and a
1 is received?
An error occurs when
A: a 0 is transmitted AND
B: a 1 is received
Thus probability of above event is: Pr{A ∩ B}
Four Rules of Thumb from what we have
studied so far
3. Whenever you see two events which have an OR relationship (i.e., A U B),
check if they are mutually exclusive. If so, set Pr{A U B} = Pr{A} + Pr{B}
Example: On a binary channel, find the probability of error?
An error occurs when
A: a 0 is transmitted and a 1 is received OR
B: a 1 is transmitted and a 0 is received
Thus probability of error is: Pr{error} = Pr{A U B}
Are A and B mutually exclusive?
Four Rules of Thumb from what we have
studied so far
3. Whenever you see two events which have an OR relationship (i.e., A U B), check if
they are mutually exclusive. If so, set Pr{A U B} = Pr{A} + Pr{B}
Example: On a binary channel, find the probability of error?
An error occurs when
A: a 0 is transmitted and a 1 is received OR
B: a 1 is transmitted and a 0 is received
Thus probability of error is: Pr{error} = Pr{A U B}
YES!
A and B are mutually exclusive; transmission of a 0 precludes the possibility of
transmission of a 1, and vice versa. Therefore, we can set
Pr{error} = Pr{A U B} = Pr{A} + Pr{B}
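With numbers, the error probability is a sum of two intersection terms; the values below are taken from the numeric example later in this lecture (Pr{T0} = 0.45, crossover probabilities 0.06 and 0.09):

```python
# Channel parameters (from the numeric example later in the lecture).
p_T0 = 0.45
p_R1_given_T0 = 0.06   # Pr{R1|T0}: crossover probability
p_R0_given_T1 = 0.09   # Pr{R0|T1}: crossover probability

p_A = p_T0 * p_R1_given_T0          # Pr{A}: 0 sent and 1 received
p_B = (1 - p_T0) * p_R0_given_T1    # Pr{B}: 1 sent and 0 received

# A and B are mutually exclusive, so their probabilities add.
p_error = p_A + p_B
print(round(p_error, 4))  # 0.0765
```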
Four Rules of Thumb from what we have
studied so far
4. Whenever you see two events in an AND relationship (i.e., A ∩ B), check if
they are independent. If so, set Pr{A ∩ B} = Pr{A} Pr{B}
Example: On a binary channel, find the probability that a 0 is transmitted and a 1 is
received?
A: a 0 is transmitted AND
B: a 1 is received
Probability of above event is: Pr{A ∩ B}
Are A and B independent?
Four Rules of Thumb from what we have
studied so far
4. Whenever you see two events in an AND relationship (i.e., A ∩ B), check if
they are independent. If so, set Pr{A ∩ B} = Pr{A} Pr{B}
Example: On a binary channel, find the probability that a 0 is transmitted and a 1 is
received?
A: a 1 is received AND
B: a 0 is transmitted
Probability of above event is: Pr{A ∩ B}
Are A and B independent?
NO! (See Appendix of this lecture for a more detailed explanation)
Pr{A|B} = Pr{R1|T0} ≠ Pr{A} = Pr{R1}
Four Rules of Thumb from what we have
studied so far
4. Whenever you see two events in an AND relationship (i.e., A ∩ B), check if
they are independent. If so, set Pr{A ∩ B} = Pr{A} Pr{B}
Example 2: On a binary channel, find the probability that a 0 is transmitted and a 1 is
received?
A: at time n+1, a 1 is received when a 0 is transmitted AND
B: at time n, a 0 is received when a 1 is transmitted
Probability of above event is: Pr{A ∩ B}
Are A and B independent?
Four Rules of Thumb from what we have
studied so far
4. Whenever you see two events in an AND relationship (i.e., A ∩ B), check if
they are independent. If so, set Pr{A ∩ B} = Pr{A} Pr{B}
Example 2: On a binary channel, find the probability that a 0 is transmitted and a 1 is
received?
A: at time n+1, a 1 is received when a 0 is transmitted AND
B: at time n, a 0 is received when a 1 is transmitted
Probability of above event is: Pr{A ∩ B}
Are A and B independent?
YES!
Pr{A|B} = Pr{R1|T0} = Pr{A}
=> Pr{A ∩ B} = Pr{A} Pr{B} = Pr{R1|T0} Pr{R0|T1}
Partition of a Sample Space
B1, B2, …, BN form a partition of a sample space S if:
S = B1 U B2 U … U BN
Bi ∩ Bj = ∅, i ≠ j
[Figure: Venn diagram of S = {s1, …, s6} partitioned into B1, B2, B3, B4]
Total Probability
If B1, B2, …, BN form a partition, then for any event A:
(A ∩ Bi) ∩ (A ∩ Bj) = ∅, i ≠ j
=> A = (A ∩ B1) U (A ∩ B2) U … U (A ∩ BN)
Total Probability
Thus event A can be expressed as the union of mutually exclusive
events:
A = (A ∩ B1) U (A ∩ B2) U … U (A ∩ BN)
=> Pr{A} = Pr{A ∩ B1} + Pr{A ∩ B2} + … + Pr{A ∩ BN}
Total Probability
If B1, B2, …, BN form a partition, then for any event A:
Pr{A} = Pr{A ∩ B1} + Pr{A ∩ B2} + … + Pr{A ∩ BN}
Total Probability
Using the definition of conditional probability:
Pr{A|Bi} = Pr{A ∩ Bi} / Pr{Bi}
=> Pr{A ∩ Bi} = Pr{A|Bi} Pr{Bi}
The Law of Total Probability
The Law of Total Probability states:
If B1, B2, …, BN form a partition, then for any event A:
Pr{A} = Pr{A|B1} Pr{B1} + Pr{A|B2} Pr{B2} + … + Pr{A|BN} Pr{BN}
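A minimal sketch of the law, using an assumed fair-die partition and event (chosen for illustration, not taken from the slides):

```python
from fractions import Fraction

probs = {s: Fraction(1, 6) for s in range(1, 7)}   # fair die
partition = [{1, 2}, {3, 4}, {5, 6}]                # assumed partition B1, B2, B3
A = {2, 3, 5}                                       # assumed event

def pr(event):
    return sum(probs[s] for s in event)

# Pr{A} = sum over i of Pr{A|Bi} Pr{Bi}
total = sum(pr(A & Bi) / pr(Bi) * pr(Bi) for Bi in partition)
print(total, pr(A))  # 1/2 1/2
```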
Bayes Theorem
Based on the Law of Total Probability, Thomas Bayes decided to
look at the probability of a partition given a particular event, the
so-called inverse probability
Bayes Theorem
Based on the Law of Total Probability, Thomas Bayes decided to
look at the probability of a partition given a particular event, the
so-called inverse probability
Pr{Bi|A} = Pr{A ∩ Bi} / Pr{A}
Since Pr{A ∩ Bi} = Pr{A|Bi} Pr{Bi}, we obtain
Pr{Bi|A} = Pr{A|Bi} Pr{Bi} / Pr{A}
Bayes Theorem
Pr{Bi|A} = Pr{A|Bi} Pr{Bi} / Pr{A}
From the Law of Total Probability, we have:
Pr{A} = Pr{A|B1} Pr{B1} + Pr{A|B2} Pr{B2} + … + Pr{A|BN} Pr{BN}
Bayes Rule: Pr{Bi|A} = Pr{A|Bi} Pr{Bi} / (Pr{A|B1} Pr{B1} + … + Pr{A|BN} Pr{BN})
Bayes Theorem
B1, B2, …, BN are known as a priori events
these events are known before the experiment
Pr{Bi} is known as a priori probability
Pr{Bi|A} is known as a posteriori probability
Experiment is performed; Event A is observed; now what is the
probability that Bi has occurred?
Bayes Theorem: Example
Bayes Theorem is best understood through the classical example
of a memoryless binary channel
What we already know about this channel is:
A priori probabilities: Pr{T0}, Pr{T1}
Channel probabilities: Pr{R0|T0}, Pr{R1|T0} = 1 - Pr{R0|T0}, Pr{R1|T1},
Pr{R0|T1} = 1 - Pr{R1|T1}
Bayes Theorem: Example
Given that we know Pr{T0} and Pr{T1}, we want to find:
Pr{Ti|Ri}: The probability that Ti was transmitted given that Ri has been
received, i = 0, 1; or the probability of successful symbol transmission
Pr{Ti|Rj}: The probability that Ti was transmitted given that Rj has been
received, i ≠ j; or the probability of symbol error
Bayes Theorem: Example
Pr{correct transmission}: The probability that Ti was transmitted
given that Ri has been received, i = 0, 1; or the probability of
successful symbol transmission
Pr{correct transmission} = Pr{(T0 ∩ R0) U (T1 ∩ R1)}
= Pr{T0|R0} Pr{R0} + Pr{T1|R1} Pr{R1}
Bayes Theorem: Example
Let's first focus on finding Pr{T0|R0}
From Bayes Rule, we know that
Pr{T0|R0} = Pr{R0|T0} Pr{T0} / Pr{R0}
Bayes Theorem: Example
Let's now focus on finding Pr{R0} in terms of what we already
know
From the Law of Total Probability, we have
Pr{R0} = Pr{R0|T0} Pr{T0} + Pr{R0|T1} Pr{T1}
Bayes Theorem: Example
Pr{T0|R0} = Pr{R0|T0} Pr{T0} / (Pr{R0|T0} Pr{T0} + Pr{R0|T1} Pr{T1})
Bayes Theorem: Example
A Priori Probabilities
Pr{T0} = 0.45
Pr{T1} = 1 - Pr{T0} = 0.55
Channel probabilities:
Pr{R0|T0} = 0.94
Pr{R1|T0} = 1 - Pr{R0|T0} = 0.06
Pr{R1|T1} = 0.91
Pr{R0|T1} = 1 - Pr{R1|T1} = 0.09
Bayes Theorem: Example
Then:
Pr{T0|R0} = Pr{R0|T0} Pr{T0} / (Pr{R0|T0} Pr{T0} + Pr{R0|T1} Pr{T1})
          = 0.94 × 0.45 / (0.94 × 0.45 + 0.09 × 0.55) = 0.8952
Pr{T1|R1} = Pr{R1|T1} Pr{T1} / (Pr{R1|T1} Pr{T1} + Pr{R1|T0} Pr{T0})
          = 0.91 × 0.55 / (0.91 × 0.55 + 0.06 × 0.45) = 0.9488
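The same computation as a short script, using the slide's numbers:

```python
# A priori and channel probabilities from the slide.
p_T0, p_T1 = 0.45, 0.55
p_R0_T0, p_R1_T1 = 0.94, 0.91
p_R1_T0 = 1 - p_R0_T0   # 0.06
p_R0_T1 = 1 - p_R1_T1   # 0.09

# Law of Total Probability for each received symbol.
p_R0 = p_R0_T0 * p_T0 + p_R0_T1 * p_T1
p_R1 = p_R1_T1 * p_T1 + p_R1_T0 * p_T0

# Bayes rule: a posteriori probabilities.
p_T0_R0 = p_R0_T0 * p_T0 / p_R0
p_T1_R1 = p_R1_T1 * p_T1 / p_R1
print(round(p_T0_R0, 4), round(p_T1_R1, 4))  # 0.8952 0.9488
```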
Lecture 1: Appendix A
Background on a Memoryless Binary
Communication Channel
Memoryless: Bit transmission at time n+i, i>0 has no dependence on bit
transmission at time n
Binary: Only two symbols are transmitted, represented by T0 and T1
Prior Probabilities: Probabilities of T0 and T1 are calculated ahead of time
from the data; Pr{T1}= 1 - Pr{T0}
Crossover Probabilities: Pr{R0|T1} and Pr{R1|T0} are called crossover or
bit-error probabilities. These probabilities are also calculated ahead of time
by sending training signals on the channel; Pr{R0|T0} = 1 - Pr{R1|T0},
Pr{R0|T1} = 1 - Pr{R1|T1}
Example 1
On a binary channel, find the probability that a 0 is transmitted
and a 1 is received?
A: a 1 is received AND
B: a 0 is transmitted
Probability of above event is: Pr{A ∩ B}
Are A and B independent?
Example 1
What is the sample space of our experiment?
Example 1
What is the sample space of our experiment?
Sample Space, S = {(T0 ∩ R0), (T0 ∩ R1), (T1 ∩ R0), (T1 ∩ R1)}
Example 1
A: a 1 is received AND B: a 0 is transmitted
Are A and B independent?
Example 1
A: a 1 is received AND B: a 0 is transmitted
Is Pr{A ∩ B} = Pr{A} Pr{B}?
Example 1
A: a 1 is received AND B: a 0 is transmitted
Is Pr{R1|T0}Pr{T0} = Pr{R1}Pr{T0} ?
We can cancel Pr{T0} from both sides (assuming Pr{T0} > 0), so we are
left with the following question:
Is Pr{R1|T0} = Pr{R1}?
Example 1
A: a 1 is received AND B: a 0 is transmitted
Is Pr{R1|T0} = Pr{R1} ?
It can be intuitively deduced that the above equality does not hold in
general. Mathematically, we can show this by computing Pr{R1}:
Pr{R1} = Pr{(T0 is txd AND R1 is recd) OR (T1 is txd AND R1 is recd)}
Pr{R1} = Pr{(T0 ∩ R1) U (T1 ∩ R1)}
Example 1
A: a 1 is received AND B: a 0 is transmitted
Is Pr{R1|T0} = Pr{R1|T0}Pr{T0} + Pr{R1|T1}Pr{T1} ?
Example 1
A: a 1 is received AND B: a 0 is transmitted
Final Result: equality would require Pr{T0} = 1 (a degenerate source) or
Pr{R1|T0} = Pr{R1|T1} (a useless channel), so in general A and B are NOT
independent
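Numerically, using the main example's channel numbers (reused here for illustration), the gap between Pr{R1|T0} and Pr{R1} is large:

```python
# Channel numbers from the main example in this lecture.
p_T0, p_T1 = 0.45, 0.55
p_R1_T0, p_R1_T1 = 0.06, 0.91

# Total probability of receiving a 1.
p_R1 = p_R1_T0 * p_T0 + p_R1_T1 * p_T1
print(round(p_R1, 4))       # 0.5275
print(p_R1_T0 == p_R1)      # False: conditioning on T0 changes Pr{R1}
```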