
Modeling & Simulation

Lecture 3

Probability & Statistics
Review

Instructor:
Eng. Ghada Al-Mashaqbeh
The Hashemite University
Computer Engineering Department
Outline
Introduction.
Random variables and their
properties.
Some useful discrete and continuous
probability distributions.
Empirical distributions.



Probability and Statistics in
Simulation
Why do we need probability and statistics
in simulation?
Needed to understand how to model a
probabilistic system.
Needed to validate the simulation model
Needed to determine / choose the input
probability distributions
Needed to generate random samples / values
from these distributions (see the sketch below)
Needed to analyze the output data / results
Needed to design correct / efficient simulation
experiments
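As an aside (not from the original slides), here is a minimal sketch of how such random samples can be drawn in practice with NumPy; the exponential distribution and its mean of 2 are assumed purely for illustration.

```python
import numpy as np

# Minimal sketch: draw interarrival times from an exponential distribution
# with an assumed mean of 2 time units (parameter chosen for illustration only).
rng = np.random.default_rng(seed=42)            # seeded for reproducibility
samples = rng.exponential(scale=2.0, size=10_000)

print(samples[:5])                              # a few sampled values
print("sample mean:", samples.mean())           # should be close to 2
```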

[Diagram] Roles of probability and statistics in simulation: model a probabilistic system; validate the simulation model; choose the input probability distributions; generate random samples from the input distributions; perform statistical analysis of the simulation output data; design the simulation experiments.
Random variables and their
properties
An experiment is a process whose outcome is not known
with certainty.
The sample space (S) is the set of all possible outcomes of
an experiment.
Sample points are the outcomes themselves.
A random variable (X, Y, Z, etc.) is a function that
assigns a real number to each point in the sample
space S.
The values taken by these random variables are denoted
x, y, z, etc.
Random Variables I
Random Variable
A function that assigns a real number to each
point in a sample space
Example:
Let X be the value that results when a single die is
rolled
Possible values of X are 1, 2, 3, 4, 5, 6
Types of random variables:
Discrete Random Variable: the random
variable takes a countable number of values.
Continuous Random Variables: the random
variable takes an uncountable number of
values, e.g. all values between 0 and 1.
Distribution (Cumulative) Function
The cumulative distribution function (cdf) of X is
F(x) = P(X ≤ x) for −∞ < x < ∞
i.e., the probability associated with the event {X ≤ x}.
Properties:
1. 0 ≤ F(x) ≤ 1 for all x.
2. F(x) is nondecreasing [i.e., if x_1 < x_2, then F(x_1) ≤ F(x_2)].
3. lim_{x→∞} F(x) = 1 and lim_{x→−∞} F(x) = 0.
Discrete Random Variables I
X is a discrete random variable if the number of
possible values of X is finite.
Ex: Flip a coin an arbitrary number of times. Let X
be the number of times the coin comes up heads
Probability Distribution
For each possible value, x
i
, for discrete random
variable X, there is a probability of occurrence,
P(X = x
i
) = p(x
i
)
p(x
i
) is the probability mass function (pmf) of X,
and it obeys the following rules:
1)p(x
i
) >= 0 for all i

2) 1 =

i all
i
x p ) (
Discrete Random Variables II
Probability that X falls in an interval I = [a, b]:
P(X ∈ I) = Σ_{a ≤ x_i ≤ b} p(x_i)
Distribution function (cdf) of a discrete random variable:
F(x) = Σ_{x_i ≤ x} p(x_i) for −∞ < x < +∞
Examples
flipping a coin
S={H, T}
rolling a die
S={1,2,…,6}
flipping two coins
S={(H,H), (H,T), (T,H), (T,T)}
X: the number of heads that occurs
rolling a pair of dice
S={(1,1), (1,2), …, (6,6)}
X: the sum of the two dice

Find the pmf and the cdf of the previous two random variables!! (On Board).
Hint:
The cdf is continuous from the right.
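As an added computational aid (not part of the original slides), a minimal sketch that tabulates the pmf and cdf of the sum of two fair dice by enumerating all 36 outcomes:

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of rolling a pair of fair dice
outcomes = list(product(range(1, 7), repeat=2))

# pmf of X = sum of the two dice
pmf = {}
for d1, d2 in outcomes:
    s = d1 + d2
    pmf[s] = pmf.get(s, Fraction(0)) + Fraction(1, 36)

# cdf: F(x) = P(X <= x), a right-continuous step function
cdf, running = {}, Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

for x in sorted(pmf):
    print(x, pmf[x], cdf[x])
```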

Continuous Random Variables I
X is a continuous random variable if its range
space R_X is an interval or a collection of intervals.
The probability that X lies in the interval [a, b] is
given by:
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
f(x), the probability density function (pdf) of X, satisfies:
1. f(x) ≥ 0 for all x in R_X
2. ∫_{R_X} f(x) dx = 1
3. f(x) = 0 if x is not in R_X
Properties:
1. P(X = x_0) = 0, because ∫_{x_0}^{x_0} f(x) dx = 0
2. P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b)
Continuous Random Variables II
Example: The life of an inspection device is given by
X, a continuous random variable with pdf:
f(x) = (1/2) e^(−x/2) for x ≥ 0, and 0 otherwise
X has an exponential distribution with mean 2 years.
The probability that the device's life is between 2 and 3 years
is:
P(2 ≤ X ≤ 3) = ∫_2^3 (1/2) e^(−x/2) dx = 0.145
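A minimal numerical cross-check of this value (an added illustration using SciPy quadrature, not part of the slides):

```python
import math
from scipy.integrate import quad

# pdf of the device life: exponential with mean 2 (so rate 1/2)
def f(x):
    return 0.5 * math.exp(-x / 2) if x >= 0 else 0.0

prob, err = quad(f, 2, 3)        # P(2 <= X <= 3)
print(round(prob, 3))             # expected: about 0.145
```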
Continuous Random Variables III
For continuous random variables, the cdf is:
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt
With some technical assumptions we have:
f(x) = F'(x)
All probability questions about X can be
answered in terms of the cdf, e.g.:
P(a < X ≤ b) = F(b) − F(a) for all a < b
Example
Example: An inspection device has a cdf:
F(x) = ∫_0^x (1/2) e^(−t/2) dt = 1 − e^(−x/2) for x ≥ 0
The probability that the device lasts for less than 2
years:
P(0 ≤ X ≤ 2) = F(2) − F(0) = 1 − e^(−1) ≈ 0.632
The probability that it lasts between 2 and 3 years:
P(2 ≤ X ≤ 3) = F(3) − F(2) = (1 − e^(−3/2)) − (1 − e^(−1)) ≈ 0.145
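The same two probabilities can be checked directly from the closed-form cdf; a short sketch:

```python
import math

def F(x):
    """cdf of the device life: 1 - exp(-x/2) for x >= 0."""
    return 1 - math.exp(-x / 2) if x >= 0 else 0.0

print(round(F(2) - F(0), 3))   # P(0 <= X <= 2), about 0.632
print(round(F(3) - F(2), 3))   # P(2 <= X <= 3), about 0.145
```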
Joint Distributions I
Most of the time in simulation we are dealing
with multiple random variables at the same
time (not only one).
For example, in the single server queuing
system that we have dealt with in Chapter 1
we have the following random variables:
The input service time.
The input interarrival time.
The output queuing delay for customers.
But how do we deal with multiple random variables?
Joint Distributions II
Let X and Y each have a discrete distribution.
Then X and Y have a discrete joint
distribution if there exists a function p(x,y)
such that:
p(x,y) = P[X=x and Y=y]
Random variables X and Y are jointly
continuous if there exists a non-negative
function f(x,y) called the joint probability
density function of X and Y, such that for all
sets of real numbers A and B
P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy
Joint Distributions III
Two random variables are independent if
knowing the value of one variable tells us nothing
about the value of the other random variable.
X and Y are independent if the following is
satisfied:
For discrete random variables:
p(x, y) = p_X(x) · p_Y(y) for all x, y
where p_X(x) = Σ_{all y} p(x, y) and p_Y(y) = Σ_{all x} p(x, y)
are called the marginal probability mass functions of X and Y.
Joint Distributions IV
For continuous random variables:
f(x, y) = f_X(x) · f_Y(y) for all x, y
where f_X(x) = ∫_{−∞}^{+∞} f(x, y) dy and f_Y(y) = ∫_{−∞}^{+∞} f(x, y) dx
are called the marginal probability density functions of X and Y.
Examples
X and Y are discrete joint random variables, see
if they are independent:
p(x, y) = xy/27 for x = 1, 2 and y = 2, 3, 4; 0 otherwise
X and Y are continuous joint random variables,
see if they are independent:
f(x, y) = 24xy for x ≥ 0, y ≥ 0, and x + y ≤ 1; 0 otherwise
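For the discrete case, a brute-force check (an added illustration, not part of the slides) is to compute the marginal pmfs and compare their product with the joint pmf:

```python
from fractions import Fraction

xs, ys = [1, 2], [2, 3, 4]
p = {(x, y): Fraction(x * y, 27) for x in xs for y in ys}

# Marginal pmfs
pX = {x: sum(p[(x, y)] for y in ys) for x in xs}
pY = {y: sum(p[(x, y)] for x in xs) for y in ys}

# X and Y are independent iff p(x, y) = pX(x) * pY(y) for every pair
independent = all(p[(x, y)] == pX[x] * pY[y] for x in xs for y in ys)
print(independent)
```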
Reminder
How do you prove that a rule is true?
And how do you prove that a rule is wrong (false)?
Random Variables
Characteristics (Parameters) I
The expected value (mean) of X is denoted by E(X).
If X is discrete: E(X) = Σ_{all i} x_i p(x_i)
If X is continuous: E(X) = ∫_{−∞}^{+∞} x f(x) dx
A measure of the central tendency.
The variance of X is denoted by V(X), var(X), or σ².
Definition: V(X) = E[(X − E[X])²]
Also, V(X) = E(X²) − [E(X)]²
A measure of the spread or variation of the possible
values of X around the mean.
The standard deviation of X is denoted by σ.
Definition: the square root of V(X).
Expressed in the same units as the mean.
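A small worked illustration of these formulas (added here, not in the original slides): the mean and variance of a single fair die roll.

```python
from fractions import Fraction

# X = value of a single fair die roll: pmf p(x) = 1/6 for x = 1..6
values = range(1, 7)
p = Fraction(1, 6)

EX  = sum(x * p for x in values)          # E(X)   = 7/2
EX2 = sum(x * x * p for x in values)      # E(X^2) = 91/6
VX  = EX2 - EX ** 2                       # V(X) = E(X^2) - [E(X)]^2 = 35/12

print(EX, VX, float(VX) ** 0.5)           # mean, variance, standard deviation
```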
Small vs. Large Variance
[Figure] Density functions for continuous random variables with large and small variances (source: LK00, Fig. 4.6): a small σ² concentrates the density tightly around the mean, while a large σ² spreads it out.
Random Variables
Characteristics (Parameters) II
Alternative measures of central tendency:
Mode: the value of the random variable that
maximizes the probability distribution (or mass) function.
For some probability distributions this quantity is not
unique (e.g., the uniform distribution).
Median: the value of the random variable that is located
in the center. In terms of probability, it is the smallest
value x_i that has F(x_i) ≥ 0.5 (for continuous random
variables, F(x_i) = 0.5).
It can be preferable to the mean since it is insensitive
to extreme values (very high or very low).
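For example (an added note), for the exponential device life used earlier the median solves F(x) = 1 − e^(−x/2) = 0.5, giving x = 2 ln 2 ≈ 1.386, which is below the mean of 2:

```python
import math

mean = 2.0                           # mean device life from the earlier example
median = mean * math.log(2)          # solves 1 - exp(-x/2) = 0.5
print(mean, round(median, 3))        # the median (about 1.386) is below the mean
```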
Random Variables
Characteristics (Parameters) III
Self reading:
Revise the main properties of the mean
and the variance.
Example
Example: The mean life of the previous inspection device is:
E(X) = ∫_0^∞ (x/2) e^(−x/2) dx = [−x e^(−x/2)]_0^∞ + ∫_0^∞ e^(−x/2) dx = 2
To compute the variance of X, we first compute E(X²):
E(X²) = ∫_0^∞ (x²/2) e^(−x/2) dx = [−x² e^(−x/2)]_0^∞ + ∫_0^∞ 2x e^(−x/2) dx = 8
Hence, the variance and standard deviation of the device's life are:
V(X) = E(X²) − [E(X)]² = 8 − 2² = 4
σ = √V(X) = 2
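A quick numerical cross-check of these moments (an added illustration using SciPy quadrature):

```python
import math
from scipy.integrate import quad

pdf = lambda x: 0.5 * math.exp(-x / 2)                     # exponential pdf, mean 2

EX,  _ = quad(lambda x: x * pdf(x), 0, math.inf)           # E(X)   -> 2
EX2, _ = quad(lambda x: x * x * pdf(x), 0, math.inf)       # E(X^2) -> 8

VX = EX2 - EX ** 2
print(round(EX, 3), round(VX, 3), round(math.sqrt(VX), 3)) # 2.0 4.0 2.0
```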
Covariance I
What about dependent random variables?
Now we will consider how to measure the amount of
dependence between random variables.
The linear dependence between two random variables
is measured using the covariance.
The covariance between the random variables X and
Y, denoted by Cov(X, Y), is defined by
Cov(X, Y) = E{[X - E(X)][Y - E(Y)]}
= E(XY) - E(X)E(Y)
Note that Cov(X, X) = V(X).
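A minimal sketch (added) of estimating covariance from paired samples with NumPy; the linear toy model for Y is an assumption made only for this illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
x = rng.normal(size=10_000)
y = 2 * x + rng.normal(size=10_000)     # assumed toy model: Y depends linearly on X

cov_xy = np.cov(x, y)[0, 1]             # sample Cov(X, Y), positive here
cov_xx = np.cov(x, x)[0, 1]             # Cov(X, X) equals Var(X)

print(round(cov_xy, 3), round(cov_xx, 3), round(x.var(ddof=1), 3))
```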

Covariance II
Cov(X, Y) = 0: X and Y are uncorrelated
Cov(X, Y) > 0: X and Y are positively correlated
Cov(X, Y) < 0: X and Y are negatively correlated

-- Independent random variables are also uncorrelated
(but uncorrelated random variables are not necessarily independent).
-- But what does correlation mean? It means that values below
(or above) the mean of one random variable tend to occur together
with values below (or above) the mean of the other (for positive
correlation; the opposite pairing for negative correlation).
Correlation Factor
-- A dimensionless quantity to measure correlation is Cor(X, Y):
Cor(X, Y) = Cov(X, Y) / √(Var(X) · Var(Y))
-- Its value lies between −1 and +1, and it is a measure of linear dependence:
Cor(X, Y) = 0: X and Y are uncorrelated
Cor(X, Y) = +1: perfect positive linear correlation
Cor(X, Y) = −1: perfect negative linear correlation
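Continuing the same assumed toy model as above, the sample correlation factor can be estimated with NumPy:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
x = rng.normal(size=10_000)
y = 2 * x + rng.normal(size=10_000)   # assumed toy model with positive dependence

corr = np.corrcoef(x, y)[0, 1]        # sample Cor(X, Y), dimensionless, in [-1, 1]
print(round(corr, 3))                 # close to +0.89 for this model
```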
Example
Compute the covariance and the correlation factor for the
jointly continuous random variables X and Y with:
f(x, y) = 24xy for x ≥ 0, y ≥ 0, and x + y ≤ 1; 0 otherwise
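One way to cross-check a hand computation of this exercise is numerical integration over the triangular support; a minimal sketch (added, using SciPy):

```python
import math
from scipy.integrate import dblquad

def pdf(x, y):
    # Joint pdf f(x, y) = 24xy on the triangle x >= 0, y >= 0, x + y <= 1
    return 24 * x * y

def moment(g):
    # E[g(X, Y)]: dblquad integrates func(y, x) with x in [0, 1], y in [0, 1 - x]
    val, _ = dblquad(lambda y, x: g(x, y) * pdf(x, y), 0, 1, 0, lambda x: 1 - x)
    return val

EX, EY = moment(lambda x, y: x), moment(lambda x, y: y)
EXY    = moment(lambda x, y: x * y)
VX     = moment(lambda x, y: x * x) - EX ** 2
VY     = moment(lambda x, y: y * y) - EY ** 2

cov = EXY - EX * EY
cor = cov / math.sqrt(VX * VY)
print(round(cov, 4), round(cor, 4))   # about -0.0267 and -0.6667
```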
Additional Notes
The lecture covers the following
sections from the textbook:
Chapter 4:
Sections 4.1 and 4.2
