
Dr. Omer Mohsin Mubarak
Assistant Professor
Department of Electronics Engineering
Iqra University, Islamabad
omer.mubarak@iqraisb.edu.pk
EEN 321 Digital Communications
Fall 2013
Probability
Probability
The sample space S is defined as the set of all possible outcomes of an experiment.
A sample space may be finite or infinite, and may be a discrete or continuous set.
An event A is defined as a subset of S; every element of A belongs to S.
The complement of A consists of all elements of S that are not part of A:

$$A^c = S - A$$
Probability
Two events are said to be mutually exclusive if they do not have any common element.
If events A and B are mutually exclusive, B can also be said to be a subset of the complement of A.
Consider a die-tossing experiment with S = {1, 2, 3, 4, 5, 6} and the event A consisting of the even outcomes, A = {2, 4, 6}.
If the die is fair, i.e., the probability of getting any number is the same (1/6), the probability of A is 3 × (1/6), i.e.

$$P(A) = 0.5$$
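As a quick check, here is a minimal Python sketch (not from the slides) that computes P(A) by counting equally likely outcomes:

```python
# Probability of an event on a fair die, computed by counting outcomes.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}             # sample space
A = {x for x in S if x % 2 == 0}   # event: even outcomes, A = {2, 4, 6}

P_A = Fraction(len(A), len(S))     # each outcome has probability 1/6
print(P_A, float(P_A))             # 1/2 0.5
```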
Probability
For a range of events A_1, A_2, A_3, ..., and an event A given as the union of all these events, i.e.,

$$A = \bigcup_i A_i$$

the probability of A can then be given as:

$$P(A) \le \sum_i P(A_i)$$

The maximum probability of A in this case is the sum of the probabilities of all events A_i; this will be the case when there is no common element among all the A_i, or in other words, A_i and A_j are mutually exclusive for all i ≠ j.
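For instance, a small sketch (illustrative, not from the slides) with two overlapping die events shows the inequality is strict when the events share elements:

```python
# Union bound on a fair die: P(A1 | A2) <= P(A1) + P(A2),
# with equality only when A1 and A2 are mutually exclusive.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A1, A2 = {2, 4, 6}, {4, 5, 6}              # overlapping events
P = lambda E: Fraction(len(E), len(S))     # equally likely outcomes

print(P(A1 | A2))        # 5/6
print(P(A1) + P(A2))     # 1, so the union's probability is strictly smaller
```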
Joint probability
Consider two experiments with sample spaces $S_1 = \{x_1, x_2, x_3, \ldots, x_M\}$ and $S_2 = \{y_1, y_2, y_3, \ldots, y_N\}$.
The probability of occurrence of a specific pair of events $x_i$ and $y_j$ is denoted as $P(x_i, y_j)$, where

$$0 \le P(x_i, y_j) \le 1$$

Marginal probabilities are used to remove the effect of the other event(s) from a joint probability.
If all events in $S_2$ are mutually exclusive, the marginal probability for $x_i$ is given as:

$$P(x_i) = \sum_{j=1}^{N} P(x_i, y_j)$$
Joint probability
If all events in $S_1$ are mutually exclusive, the marginal probability for $y_j$ is given as:

$$P(y_j) = \sum_{i=1}^{M} P(x_i, y_j)$$

If all outcomes of $S_1$ and $S_2$ are mutually exclusive,

$$\sum_{j=1}^{N} \sum_{i=1}^{M} P(x_i, y_j) = 1$$
Example
Let x and y be two random variables with joint probability function f(x, y).
The joint and marginal probabilities for this example are summarized below.
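As an illustration, here is a small Python sketch with assumed joint values (chosen to be consistent with the conditional-probability example later, where P(x=3, y=1) = 0.30 and P(y=1) = 0.80), showing how the marginals are obtained by summing joint probabilities:

```python
# Assumed joint probabilities f(x, y); these values are illustrative only.
joint = {
    (1, 1): 0.25, (2, 1): 0.25, (3, 1): 0.30,   # row y=1 sums to 0.80
    (1, 2): 0.05, (2, 2): 0.05, (3, 2): 0.10,   # row y=2 sums to 0.20
}

# Marginals: P(x) = sum over y of f(x, y); P(y) = sum over x of f(x, y).
P_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (1, 2, 3)}
P_y = {y: sum(p for (_, yj), p in joint.items() if yj == y) for y in (1, 2)}

print(P_x)                  # ~{1: 0.3, 2: 0.3, 3: 0.4}
print(P_y)                  # ~{1: 0.8, 2: 0.2}
print(sum(joint.values()))  # ~1.0, all joint probabilities sum to one
```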
Conditional probability
The probability of occurrence of an event A, given that another event B has occurred, is known as conditional probability.
It is mathematically given as the ratio of the joint probability to the marginal probability, i.e.,

$$P(A|B) = \frac{P(A, B)}{P(B)}$$

Comparing $P(A|B)$ and

$$P(B|A) = \frac{P(A, B)}{P(A)}$$

the conditional probability can also be given as:

$$P(B|A) = \frac{P(B)\, P(A|B)}{P(A)}$$
Conditional probability
The joint probability can then also be given as:

$$P(A, B) = P(A|B)\, P(B)$$

That is, the probability that both A and B occur is given as the probability of occurrence of B times the probability of occurrence of A given that B has already occurred.
Similarly, the joint probability of three events is given as:

$$P(A, B, C) = P(C|A, B)\, P(A|B)\, P(B)$$
Chain rule
Consider n events $A_1, A_2, A_3, A_4, \ldots, A_n$.
The joint probability of all of these is given as:

$$P(A_1, A_2, A_3, \ldots, A_n) = P(A_1)\, P(A_2|A_1)\, P(A_3|A_1, A_2) \cdots P(A_n|A_1, A_2, \ldots, A_{n-1}) = \prod_{i=1}^{n} P(A_i \mid A_1, A_2, \ldots, A_{i-1})$$

This is called the chain rule.
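As a small illustration (assumptions mine, not from the slides), the chain rule gives the probability of drawing three aces in a row from a 52-card deck:

```python
# Chain rule: P(A1, A2, A3) = P(A1) * P(A2 | A1) * P(A3 | A1, A2),
# here for drawing three aces without replacement from a 52-card deck.
from fractions import Fraction

p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p)  # 1/5525
```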
Example
Consider the joint probabilities from the earlier example.
The conditional probability of x = 3 given y = 1 is:

$$P(x=3 \mid y=1) = \frac{P(x=3, y=1)}{P(y=1)} = \frac{0.30}{0.80} = 0.375$$
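The same computation in Python, using exact fractions:

```python
# Conditional probability P(x=3 | y=1) = P(x=3, y=1) / P(y=1).
from fractions import Fraction

p_x3_y1 = Fraction(30, 100)   # joint probability P(x=3, y=1) = 0.30
p_y1 = Fraction(80, 100)      # marginal probability P(y=1) = 0.80

p_cond = p_x3_y1 / p_y1
print(p_cond, float(p_cond))  # 3/8 0.375
```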
Statistical independence
Two events A and B are said to be independent if the occurrence of one event does not change the probability of the other:

$$P(A|B) = P(A)$$

This also implies:

$$P(A, B) = P(A)\, P(B)$$

Using the chain rule, for n independent events:

$$P(A_1, A_2, A_3, A_4, \ldots, A_n) = \prod_{i=1}^{n} P(A_i)$$
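A quick check on the assumed joint table from the earlier sketch (illustrative values, not from the slides): x and y there are not independent, since the product of the marginals does not equal the joint probability:

```python
# Independence test: x and y are independent iff P(x, y) == P(x) * P(y).
P_x3, P_y1 = 0.40, 0.80   # marginals from the assumed joint table
P_x3_y1 = 0.30            # assumed joint probability P(x=3, y=1)

print(P_x3 * P_y1)        # ~0.32 != 0.30, so x and y are dependent
```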
Random variable
Random variable
A sample space can be mapped to a set of real numbers, i.e., every outcome can be associated with a real number.
Such a mapping is called a random variable.
A random variable is a rule that assigns a numerical value to each possible outcome of a probabilistic experiment.
For example, in coin tossing the sample space is {Head, Tail}.
These outcomes can be mapped to a random variable X, such that Head is mapped to 1 and Tail is mapped to -1.
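A minimal sketch (mine, not from the slides) of this mapping:

```python
# Mapping the coin-toss sample space {Head, Tail} to a random variable X.
import random

def X(outcome: str) -> int:
    """Random-variable mapping: Head -> 1, Tail -> -1."""
    return 1 if outcome == "Head" else -1

outcome = random.choice(["Head", "Tail"])  # one run of the experiment
print(outcome, X(outcome))
```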
Random variable
X = {-1, 1} is an example of a discrete random variable, which can only take one of a set of pre-defined values.
A continuous random variable can take any value within a specific range.
Consider a discrete random variable $X = \{x_1, x_2, x_3, \ldots, x_n\}$; the corresponding probabilities for each value are given as $p_1, p_2, p_3, \ldots, p_n$ such that:

$$0 \le p_i \le 1 \quad \text{for } i = 1, 2, \ldots, n$$

$$\sum_{i=1}^{n} p_i = 1$$
Cumulative distribution function
The cumulative distribution function (CDF) $F_X(x)$ of a random variable X is defined as the probability of the random variable being smaller than or equal to x:

$$F_X(x) = P(X \le x)$$

It is a non-decreasing function of x.
It is also known as the probability distribution.
In the coin-tossing example, X = {-1, 1} with probability 0.5 of getting either of the two values.
The CDF in this case is:

$$F_X(x) = \begin{cases} 0 & \text{for } x < -1 \\ 0.5 & \text{for } -1 \le x < 1 \\ 1 & \text{for } x \ge 1 \end{cases}$$
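A short sketch (mine) of this step CDF:

```python
# CDF of the coin-toss random variable X in {-1, 1}, each with P = 0.5.
def F_X(x: float) -> float:
    if x < -1:
        return 0.0
    elif x < 1:
        return 0.5
    return 1.0

for x in (-2, -1, 0, 1, 2):
    print(x, F_X(x))  # steps: 0.0, 0.5, 0.5, 1.0, 1.0
```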
Probability density function (PDF)
The CDF of a discrete random variable increases in steps, while that of a continuous random variable usually increases smoothly.
The derivative of the CDF is called the probability density function (PDF) $p_X(x)$:

$$p_X(x) = \frac{dF_X(x)}{dx}$$

Thus, the CDF can also be given as the integral (or area under the curve) of the PDF from $-\infty$ to x:

$$F_X(x) = \int_{-\infty}^{x} p_X(y)\, dy$$
Probability density function (PDF)
The probability of a random variable lying between two values $x_1$ and $x_2$ is given in terms of the PDF and CDF as:

$$P(x_1 < X \le x_2) = \int_{x_1}^{x_2} p_X(y)\, dy = F_X(x_2) - F_X(x_1)$$
Uniform density function
The uniform density function is defined as a function with constant PDF.
A constant PDF over the range from a to b implies a PDF of 1/(b-a) in that range and a CDF that increases linearly with slope 1/(b-a).
[Figure: uniform PDF (left) and CDF (right)]
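A minimal sketch (mine) of the uniform PDF and CDF on [a, b]:

```python
# Uniform density on [a, b]: constant PDF 1/(b-a), piecewise-linear CDF.
def uniform_pdf(x: float, a: float, b: float) -> float:
    return 1.0 / (b - a) if a <= x <= b else 0.0

def uniform_cdf(x: float, a: float, b: float) -> float:
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)   # slope 1/(b-a) inside [a, b]

print(uniform_pdf(1.0, 0.0, 2.0))  # 0.5
print(uniform_cdf(1.0, 0.0, 2.0))  # 0.5
```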
Gaussian random variable
A Gaussian random variable has a PDF given as:

$$p_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x - m_x)^2}{2\sigma^2}}$$

where $m_x$ is the mean and $\sigma^2$ is the variance.
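A quick sketch (mine) evaluating this PDF:

```python
# Gaussian PDF with mean m and standard deviation sigma.
import math

def gaussian_pdf(x: float, m: float = 0.0, sigma: float = 1.0) -> float:
    return math.exp(-((x - m) ** 2) / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

print(gaussian_pdf(0.0))  # ~0.3989, the peak of the standard normal
```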
Multiple random variables
Multiple random variables
Consider random variables $X_1$ and $X_2$; the joint cumulative distribution function is given as:

$$F_{X_1 X_2}(x_1, x_2) = P(X_1 \le x_1, X_2 \le x_2)$$

The joint probability density function is given as:

$$p_{X_1 X_2}(x_1, x_2) = \frac{\partial^2}{\partial x_1\, \partial x_2}\, P(X_1 \le x_1, X_2 \le x_2)$$

The CDF can be given in terms of the PDF as:

$$F_{X_1 X_2}(x_1, x_2) = \int_{-\infty}^{x_1} \int_{-\infty}^{x_2} p_{X_1 X_2}(y_1, y_2)\, dy_2\, dy_1$$
Multiple random variables
The marginal density functions are given as:

$$p_{X_1}(x_1) = \int_{-\infty}^{\infty} p_{X_1 X_2}(x_1, y_2)\, dy_2$$

$$p_{X_2}(x_2) = \int_{-\infty}^{\infty} p_{X_1 X_2}(y_1, x_2)\, dy_1$$

The conditional cumulative probability distribution is given as:

$$F_{X_1 | X_2}(x_1 | x_2) = P(X_1 \le x_1 \mid X_2 = x_2) = \frac{\int_{-\infty}^{x_1} p_{X_1 X_2}(y_1, x_2)\, dy_1}{p_{X_2}(x_2)}$$
Multiple random variables
The conditional density function is given as:

$$p_{X_1 | X_2}(x_1 | x_2) = \frac{p_{X_1 X_2}(x_1, x_2)}{p_{X_2}(x_2)}$$

This implies that the joint probability density function is:

$$p_{X_1 X_2}(x_1, x_2) = p_{X_2}(x_2)\, p_{X_1 | X_2}(x_1 | x_2)$$

In the case of independent random variables:

$$p_{X_1 X_2}(x_1, x_2) = p_{X_1}(x_1)\, p_{X_2}(x_2)$$

$$F_{X_1 X_2}(x_1, x_2) = F_{X_1}(x_1)\, F_{X_2}(x_2)$$
Function of random variable
A function of a random variable is also a random variable.
Consider a random variable X such that:

$$Y = aX + b, \quad a > 0$$

The CDF and PDF of the random variable Y can be given in terms of those of the random variable X:

$$F_Y(y) = P(Y \le y) = P(aX + b \le y) = P\!\left(X \le \frac{y - b}{a}\right) = F_X\!\left(\frac{y - b}{a}\right)$$

$$p_Y(y) = \frac{dF_Y(y)}{dy} = \frac{1}{a}\, p_X\!\left(\frac{y - b}{a}\right)$$
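A small Monte Carlo sketch (mine) checking the CDF relation $F_Y(y) = F_X\left(\frac{y-b}{a}\right)$ for a uniform X:

```python
# Check F_Y(y) = F_X((y - b) / a) for Y = a*X + b with a > 0, by sampling.
import random

a, b = 2.0, 1.0
xs = [random.random() for _ in range(100_000)]  # X ~ Uniform(0, 1)
ys = [a * x + b for x in xs]                    # Y = aX + b

y = 2.0
F_Y = sum(v <= y for v in ys) / len(ys)  # empirical CDF of Y at y
F_X = (y - b) / a                        # exact Uniform(0,1) CDF at (y-b)/a
print(F_Y, F_X)                          # both close to 0.5
```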
Moments
The mean or expected value of a continuous random variable is given as:

$$m_X = E(X) = \int_{-\infty}^{\infty} x\, p_X(x)\, dx$$

It is also called the 1st moment of the random variable X.
It can also be defined for a discrete random variable as:

$$m_X = E(X) = \sum_i x_i\, p_i$$

The n-th moment of a random variable is given as:

$$E(X^n) = \int_{-\infty}^{\infty} x^n\, p_X(x)\, dx \qquad E(X^n) = \sum_i x_i^n\, p_i$$
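A quick discrete sketch (mine), using the coin-toss random variable from earlier:

```python
# Moments of the discrete coin-toss variable X in {-1, 1}, each with p = 0.5.
values, probs = [-1, 1], [0.5, 0.5]

def moment(n: int) -> float:
    """n-th moment: E(X^n) = sum of x_i^n * p_i."""
    return sum((x ** n) * p for x, p in zip(values, probs))

print(moment(1))  # 0.0, the mean
print(moment(2))  # 1.0, the second moment
```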
Central moment
The n-th central moment is the n-th moment computed after removing the bias or mean value from the random variable.
For a continuous random variable, it can be given as:

$$E\big((X - m_x)^n\big) = \int_{-\infty}^{\infty} (x - m_x)^n\, p_X(x)\, dx$$

Similarly, for a discrete random variable:

$$E\big((X - m_x)^n\big) = \sum_i (x_i - m_x)^n\, p_i$$

The second central moment is known as the variance of the signal:

$$\begin{aligned}
\sigma_x^2 = E\big((X - m_x)^2\big) &= \int (x - m_x)^2\, p_X(x)\, dx \\
&= \int x^2\, p_X(x)\, dx - 2 m_x \int x\, p_X(x)\, dx + m_x^2 \int p_X(x)\, dx \\
&= E(X^2) - 2 m_x^2 + m_x^2 = E(X^2) - m_x^2
\end{aligned}$$
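A short numeric check (mine) of the identity $\sigma_x^2 = E(X^2) - m_x^2$ on a small discrete distribution:

```python
# Verify Var(X) = E(X^2) - E(X)^2 for a discrete distribution.
values, probs = [0, 1, 2], [0.2, 0.5, 0.3]

m = sum(x * p for x, p in zip(values, probs))        # mean E(X) = 1.1
ex2 = sum(x * x * p for x, p in zip(values, probs))  # E(X^2) = 1.7
var = sum((x - m) ** 2 * p for x, p in zip(values, probs))

print(var, ex2 - m * m)  # both ~0.49
```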
Joint moments
The (k, n)-th joint moment of two random variables $X_1$ and $X_2$ is:

$$E(X_1^k X_2^n) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1^k\, x_2^n\, p_{X_1 X_2}(x_1, x_2)\, dx_1\, dx_2$$

The joint central moment is defined as:

$$E\big((X_1 - m_{x_1})^k (X_2 - m_{x_2})^n\big) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x_1 - m_{x_1})^k (x_2 - m_{x_2})^n\, p_{X_1 X_2}(x_1, x_2)\, dx_1\, dx_2$$

Covariance is defined as the joint central moment for k = n = 1:

$$\mathrm{Cov}(X_1, X_2) = E\big((X_1 - m_{x_1})(X_2 - m_{x_2})\big) = E(X_1 X_2) - m_{x_1} m_{x_2}$$
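A short sketch (mine) checking $\mathrm{Cov}(X_1, X_2) = E(X_1 X_2) - m_{x_1} m_{x_2}$ on a small data set:

```python
# Cov(X1, X2) = E(X1*X2) - m1*m2, checked on four paired samples.
x1 = [1.0, 2.0, 3.0, 4.0]
x2 = [2.0, 1.0, 4.0, 3.0]
n = len(x1)

m1, m2 = sum(x1) / n, sum(x2) / n             # means
e12 = sum(a * b for a, b in zip(x1, x2)) / n  # E(X1*X2)
cov = sum((a - m1) * (b - m2) for a, b in zip(x1, x2)) / n

print(cov, e12 - m1 * m2)  # both 0.75
```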
Covariance
Covariance is a measure of how much two random variables change together, i.e., the effect of a change in one variable on the other.
A positive covariance implies that greater values of one variable mainly correspond to greater values of the other, while a negative covariance implies that greater values of one mainly correspond to lower values of the other.
Variance is a special case of covariance in which both variables are the same.
Covariance matrix
Consider a vector X consisting of n random variables and a vector m consisting of the mean of each random variable:

$$\mathbf{X} = [X_1\ X_2\ \cdots\ X_n] \qquad \mathbf{m} = [m_{x_1}\ m_{x_2}\ \cdots\ m_{x_n}]$$

The covariance matrix is an n×n matrix whose (i, j) elements are given as:

$$M_{i,j} = \mathrm{Cov}(X_i, X_j) = E\big((X_i - m_{x_i})(X_j - m_{x_j})\big)$$

It can be computed as $E\big((\mathbf{X} - \mathbf{m})^t (\mathbf{X} - \mathbf{m})\big)$.
The diagonal entries of the covariance matrix are the variances of the elements of the vector X.
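A minimal numpy sketch (mine) forming the covariance matrix as the average outer product of centered sample vectors:

```python
# Covariance matrix as E((X - m)^t (X - m)), estimated from samples.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(size=(1000, 3))      # 1000 draws of a 3-element X
m = samples.mean(axis=0)                  # mean vector
centered = samples - m                    # X - m for each draw

M = centered.T @ centered / len(samples)  # average outer product
print(np.diag(M))                         # diagonal = per-element variances
print(np.allclose(M, np.cov(samples.T, bias=True)))  # True
```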
Jointly Gaussian random vector
A vector X consisting of n random variables is called a jointly Gaussian random vector if its joint probability density function is given as:

$$p_X(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}\, (\det \mathbf{M})^{1/2}}\, \exp\!\left(-\frac{1}{2}\, (\mathbf{x} - \mathbf{m})\, \mathbf{M}^{-1} (\mathbf{x} - \mathbf{m})^t\right)$$

where m is the mean vector and M is the n×n covariance matrix.
The Gaussian distribution is a special case of the jointly Gaussian random vector with n = 1.
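A closing sketch (mine) evaluating this density directly and comparing with scipy's implementation:

```python
# Jointly Gaussian PDF evaluated from the formula, then cross-checked.
import numpy as np
from scipy.stats import multivariate_normal

m = np.array([0.0, 1.0])                # mean vector
M = np.array([[2.0, 0.5], [0.5, 1.0]])  # covariance matrix (positive definite)
x = np.array([0.5, 0.5])

n = len(m)
d = x - m
quad = d @ np.linalg.inv(M) @ d         # (x - m) M^{-1} (x - m)^t
p = np.exp(-0.5 * quad) / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(M)))

print(p)
print(multivariate_normal(mean=m, cov=M).pdf(x))  # same value
```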