
Continuous Distributions
The Uniform distribution from a to b
$$f(x) = \begin{cases} \dfrac{1}{b-a} & a \le x \le b \\ 0 & \text{otherwise} \end{cases}$$

[Figure: the uniform density, a rectangle of height 1/(b - a) over the interval [a, b].]
The Normal distribution (mean μ, standard deviation σ)

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$
The Exponential distribution
$$f(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$$
The Weibull distribution with parameters α and β:

$$F(x) = 1 - e^{-\alpha x^{\beta}}$$

Thus

$$f(x) = F'(x) = \alpha\beta\, x^{\beta-1} e^{-\alpha x^{\beta}} \quad \text{for } x \ge 0$$
[Figure: The Weibull density f(x) for (α = 0.5, β = 2), (α = 0.7, β = 2), and (α = 0.9, β = 2).]
The Gamma distribution
Let the continuous random variable X have
density function:
$$f(x) = \begin{cases} \dfrac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$$

Then X is said to have a Gamma distribution with parameters α and λ.
Expectation
Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of X, E(X), is defined to be:

$$E(X) = \sum_{x} x\, p(x) = \sum_{i} x_i\, p(x_i) \quad \text{if } X \text{ is discrete}$$

and, if X is continuous with probability density function f(x),

$$E(X) = \int_{-\infty}^{\infty} x f(x)\, dx$$
Interpretation of E(X)
1. The expected value of X, E(X), is the centre of
gravity of the probability distribution of X.
2. The expected value of X, E(X), is the long-run average value of X (shown later: the Law of Large Numbers).
Example:
The Uniform distribution
Suppose X has a uniform distribution from a to b.
Then:
$$f(x) = \begin{cases} \dfrac{1}{b-a} & a \le x \le b \\ 0 & x < a \text{ or } x > b \end{cases}$$

The expected value of X is:

$$E(X) = \int_{-\infty}^{\infty} x f(x)\, dx = \int_a^b x\, \frac{1}{b-a}\, dx = \frac{1}{b-a}\left[\frac{x^2}{2}\right]_a^b = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}$$
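This result can be spot-checked numerically. Below is a minimal Python sketch; the numpy calls and the particular values of a and b are illustrative assumptions, not part of the derivation:

```python
import numpy as np

# Arbitrary endpoints for illustration
a, b = 2.0, 10.0
rng = np.random.default_rng(0)
x = rng.uniform(a, b, size=1_000_000)  # draws from Uniform(a, b)

print(x.mean())     # long-run average, approximately 6.0
print((a + b) / 2)  # exact expected value: 6.0
```

The agreement between the sample mean and (a + b)/2 also illustrates interpretation 2 above: E(X) as the long-run average.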
Example:
The Normal distribution
Suppose X has a Normal distribution with parameters
μ and σ.

Then:

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$

The expected value of X is:

$$E(X) = \int_{-\infty}^{\infty} x f(x)\, dx = \int_{-\infty}^{\infty} x\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx$$

Make the substitution:

$$z = \frac{x-\mu}{\sigma}, \quad dz = \frac{1}{\sigma}\, dx, \quad x = \mu + \sigma z$$

Hence

$$E(X) = \int_{-\infty}^{\infty} (\mu + \sigma z)\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\, dz = \mu \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\, dz + \sigma \int_{-\infty}^{\infty} \frac{z}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\, dz$$

Now

$$\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\, dz = 1 \quad \text{and} \quad \int_{-\infty}^{\infty} z\, e^{-\frac{z^2}{2}}\, dz = 0$$

Thus $E(X) = \mu$.
Example:
The Gamma distribution
Suppose X has a Gamma distribution with parameters α and λ.

Then:

$$f(x) = \begin{cases} \dfrac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$$

Note:

$$\int_{-\infty}^{\infty} f(x)\, dx = \int_0^{\infty} \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}\, dx = 1 \quad \text{if } \alpha > 0, \lambda > 0$$

This is a very useful formula when working with the Gamma distribution.
The expected value of X is:

$$E(X) = \int_{-\infty}^{\infty} x f(x)\, dx = \int_0^{\infty} x\, \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}\, dx = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha} e^{-\lambda x}\, dx$$

$$= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha+1)}{\lambda^{\alpha+1}} \int_0^{\infty} \frac{\lambda^{\alpha+1}}{\Gamma(\alpha+1)}\, x^{(\alpha+1)-1} e^{-\lambda x}\, dx$$

The remaining integral is now equal to 1 (by the formula above), so, using Γ(α + 1) = αΓ(α),

$$E(X) = \frac{\Gamma(\alpha+1)}{\lambda\, \Gamma(\alpha)} = \frac{\alpha\, \Gamma(\alpha)}{\lambda\, \Gamma(\alpha)} = \frac{\alpha}{\lambda}$$

Thus if X has a Gamma(α, λ) distribution then the expected value of X is:

$$E(X) = \frac{\alpha}{\lambda}$$
Special cases: if X has a Gamma(α, λ) distribution, then:

1. Exponential(λ) distribution (α = 1, λ arbitrary):

$$E(X) = \frac{1}{\lambda}$$

2. Chi-square(ν) distribution (α = ν/2, λ = 1/2):

$$E(X) = \frac{\nu/2}{1/2} = \nu$$
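The formula E(X) = α/λ is easy to verify by direct numerical integration. A sketch, assuming scipy is available; the parameter values are arbitrary illustrative choices:

```python
from math import exp, gamma
from scipy.integrate import quad

alpha, lam = 3.0, 2.0  # arbitrary illustrative parameters

def f(x):
    # Gamma(alpha, lam) density for x >= 0
    return lam**alpha / gamma(alpha) * x**(alpha - 1) * exp(-lam * x)

mean, _ = quad(lambda x: x * f(x), 0.0, float("inf"))
print(mean, alpha / lam)  # both approximately 1.5
```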
[Figure: Gamma densities; E(X) = α/λ.]
[Figure: Exponential densities; E(X) = 1/λ.]
[Figure: Chi-square (χ²) densities; E(X) = ν.]
Expectation of functions of
Random Variables
Definition
Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of g(X), E[g(X)], is defined to be:

$$E[g(X)] = \sum_{x} g(x)\, p(x) = \sum_{i} g(x_i)\, p(x_i) \quad \text{if } X \text{ is discrete}$$

and, if X is continuous with probability density function f(x),

$$E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\, dx$$
Example:
The Uniform distribution
Suppose X has a uniform distribution from 0 to b.
Then:
$$f(x) = \begin{cases} \dfrac{1}{b} & 0 \le x \le b \\ 0 & x < 0 \text{ or } x > b \end{cases}$$

Find the expected value of A = X². If X is the length of a side of a square (chosen at random from 0 to b) then A is the area of the square.

$$E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\, dx = \int_0^b x^2\, \frac{1}{b}\, dx = \frac{1}{b}\left[\frac{x^3}{3}\right]_0^b = \frac{b^3 - 0^3}{3b} = \frac{b^2}{3}$$

= 1/3 the maximum area of the square.
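A short Monte Carlo sketch of this example; numpy and the value b = 4 are illustrative assumptions:

```python
import numpy as np

b = 4.0  # arbitrary maximum side length
rng = np.random.default_rng(1)
x = rng.uniform(0.0, b, size=1_000_000)  # random side lengths

print((x**2).mean())  # Monte Carlo estimate of E(X^2) = E(A)
print(b**2 / 3)       # exact value: one third of the maximum area b^2
```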
Example:
The Geometric distribution
Suppose X (discrete) has a geometric distribution with
parameter p.
Then:
$$p(x) = p(1-p)^{x-1} \quad \text{for } x = 1, 2, 3, \dots$$

Find the expected value of X and the expected value of X².

$$E(X) = \sum_{x=1}^{\infty} x\, p(x) = p + 2p(1-p) + 3p(1-p)^2 + 4p(1-p)^3 + \dots$$

$$E(X^2) = \sum_{x=1}^{\infty} x^2\, p(x) = p + 2^2 p(1-p) + 3^2 p(1-p)^2 + 4^2 p(1-p)^3 + \dots$$
Recall: the sum of a geometric series

$$a + ar + ar^2 + ar^3 + \dots = \frac{a}{1-r}$$

or, with a = 1,

$$1 + r + r^2 + r^3 + \dots = \frac{1}{1-r}$$

Differentiating both sides with respect to r we get:

$$1 + 2r + 3r^2 + 4r^3 + \dots = (-1)(1-r)^{-2}(-1) = \frac{1}{(1-r)^2}$$
This formula could also be developed by noting:

$$1 + 2r + 3r^2 + 4r^3 + \dots = (1 + r + r^2 + r^3 + \dots) + (r + r^2 + r^3 + \dots) + (r^2 + r^3 + \dots) + (r^3 + \dots) + \dots$$

$$= \frac{1}{1-r} + \frac{r}{1-r} + \frac{r^2}{1-r} + \frac{r^3}{1-r} + \dots$$

$$= \frac{1}{1-r}\left(1 + r + r^2 + r^3 + \dots\right) = \frac{1}{1-r} \cdot \frac{1}{1-r} = \frac{1}{(1-r)^2}$$
This formula can be used to calculate E(X):

$$E(X) = \sum_{x=1}^{\infty} x\, p(x) = p + 2p(1-p) + 3p(1-p)^2 + 4p(1-p)^3 + \dots$$

$$= p\left[1 + 2r + 3r^2 + 4r^3 + \dots\right] \quad \text{where } r = 1-p$$

$$= p\left(\frac{1}{1-r}\right)^2 = p\left(\frac{1}{p}\right)^2 = \frac{1}{p}$$
To compute the expected value of X²:

$$E(X^2) = \sum_{x=1}^{\infty} x^2\, p(x) = p + 2^2 p(1-p) + 3^2 p(1-p)^2 + 4^2 p(1-p)^3 + \dots$$

$$= p\left[1 + 2^2 r + 3^2 r^2 + 4^2 r^3 + \dots\right] \quad \text{where } r = 1-p$$

we need to find a formula for

$$1 + 2^2 r + 3^2 r^2 + 4^2 r^3 + \dots = \sum_{x=1}^{\infty} x^2 r^{x-1}$$
Note:

$$S(r) = 1 + r + r^2 + r^3 + \dots = \sum_{x=0}^{\infty} r^x = \frac{1}{1-r}$$

Differentiating with respect to r we get:

$$S'(r) = 1 + 2r + 3r^2 + 4r^3 + \dots = \sum_{x=1}^{\infty} x\, r^{x-1} = \frac{1}{(1-r)^2}$$
Differentiating again with respect to r we get:

$$S''(r) = (2)(1) + (3)(2)r + (4)(3)r^2 + (5)(4)r^3 + \dots = \sum_{x=2}^{\infty} x(x-1)\, r^{x-2} = \frac{2}{(1-r)^3}$$

that is,

$$\sum_{x=2}^{\infty} x^2 r^{x-2} - \sum_{x=2}^{\infty} x\, r^{x-2} = \frac{2}{(1-r)^3}$$
Thus

$$\sum_{x=2}^{\infty} x^2 r^{x-2} = \frac{2}{(1-r)^3} + \sum_{x=2}^{\infty} x\, r^{x-2}$$

or, written out,

$$2^2 + 3^2 r + 4^2 r^2 + 5^2 r^3 + \dots = \frac{2}{(1-r)^3} + \left(2 + 3r + 4r^2 + 5r^3 + \dots\right)$$
Multiplying both sides by r and adding 1 to each side gives:

$$1 + 2^2 r + 3^2 r^2 + 4^2 r^3 + \dots = \frac{2r}{(1-r)^3} + \left(1 + 2r + 3r^2 + 4r^3 + \dots\right)$$

$$= \frac{2r}{(1-r)^3} + \frac{1}{(1-r)^2} = \frac{2r + (1-r)}{(1-r)^3} = \frac{1+r}{(1-r)^3} = \frac{2-p}{p^3} \quad \text{if } r = 1-p$$

Thus

$$E(X^2) = p\left[1 + 2^2 r + 3^2 r^2 + 4^2 r^3 + \dots\right] = p\, \frac{2-p}{p^3} = \frac{2-p}{p^2}$$
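Both geometric-distribution results can be checked by truncating the defining series. A Python sketch, with p = 0.3 an arbitrary illustrative value:

```python
p = 0.3
xs = range(1, 10_000)  # the tail beyond this is negligible for p = 0.3

EX = sum(x * p * (1 - p)**(x - 1) for x in xs)
EX2 = sum(x**2 * p * (1 - p)**(x - 1) for x in xs)

print(EX, 1 / p)            # both approximately 3.3333
print(EX2, (2 - p) / p**2)  # both approximately 18.8889
```

Rule 3 below turns these two quantities into var(X) = E(X²) - [E(X)]².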
Moments of Random Variables
Definition
Let X be a random variable (discrete or continuous); then the k-th moment of X is defined to be:

$$\mu_k = E\left(X^k\right) = \begin{cases} \displaystyle\sum_{x} x^k\, p(x) & \text{if } X \text{ is discrete} \\ \displaystyle\int_{-\infty}^{\infty} x^k f(x)\, dx & \text{if } X \text{ is continuous} \end{cases}$$

The first moment of X, μ = μ₁ = E(X), is the centre of gravity of the distribution of X. The higher moments give different information regarding the distribution of X.
Definition

Let X be a random variable (discrete or continuous); then the k-th central moment of X is defined to be:

$$\mu_k^0 = E\left[(X-\mu)^k\right] = \begin{cases} \displaystyle\sum_{x} (x-\mu)^k\, p(x) & \text{if } X \text{ is discrete} \\ \displaystyle\int_{-\infty}^{\infty} (x-\mu)^k f(x)\, dx & \text{if } X \text{ is continuous} \end{cases}$$

where μ = μ₁ = E(X) = the first moment of X.

The central moments describe how the probability distribution is distributed about the centre of gravity, μ.
The first central moment:

$$\mu_1^0 = E[X - \mu] = 0$$

The second central moment:

$$\mu_2^0 = E\left[(X-\mu)^2\right]$$

depends on the spread of the probability distribution of X about μ. It is called the variance of X and is denoted by the symbol var(X):

$$\text{var}(X) = \mu_2^0 = E\left[(X-\mu)^2\right] = \sigma^2$$

and

$$\sigma = \sqrt{\mu_2^0} = \sqrt{E\left[(X-\mu)^2\right]}$$

is called the standard deviation of X.
The third central moment:

$$\mu_3^0 = E\left[(X-\mu)^3\right]$$

contains information about the skewness of a distribution.

Measure of skewness:

$$\gamma_1 = \frac{\mu_3^0}{\left(\mu_2^0\right)^{3/2}} = \frac{\mu_3^0}{\sigma^3}$$
[Figure: positively skewed distribution (μ₃⁰ > 0, γ₁ > 0).]
[Figure: negatively skewed distribution (μ₃⁰ < 0, γ₁ < 0).]
[Figure: symmetric distribution (μ₃⁰ = 0, γ₁ = 0).]
The fourth central moment:

$$\mu_4^0 = E\left[(X-\mu)^4\right]$$

also contains information about the shape of a distribution. The property of shape that is measured by the fourth central moment is called kurtosis.

The measure of kurtosis:

$$\gamma_2 = \frac{\mu_4^0}{\left(\mu_2^0\right)^2} - 3 = \frac{\mu_4^0}{\sigma^4} - 3$$
[Figure: mesokurtic distribution (γ₂ = 0, μ₄⁰ moderate in size).]
[Figure: platykurtic distribution (γ₂ < 0, μ₄⁰ small in size).]
[Figure: leptokurtic distribution (γ₂ > 0, μ₄⁰ large in size).]
Example: The uniform distribution from 0 to 1

$$f(x) = \begin{cases} 1 & 0 \le x \le 1 \\ 0 & x < 0 \text{ or } x > 1 \end{cases}$$

Finding the moments:

$$\mu_k = \int_{-\infty}^{\infty} x^k f(x)\, dx = \int_0^1 x^k\, dx = \left[\frac{x^{k+1}}{k+1}\right]_0^1 = \frac{1}{k+1}$$
Finding the central moments:

$$\mu_k^0 = \int_{-\infty}^{\infty} \left(x - \tfrac{1}{2}\right)^k f(x)\, dx = \int_0^1 \left(x - \tfrac{1}{2}\right)^k dx$$

Making the substitution w = x - 1/2:

$$\mu_k^0 = \int_{-1/2}^{1/2} w^k\, dw = \left[\frac{w^{k+1}}{k+1}\right]_{-1/2}^{1/2} = \frac{\left(\tfrac{1}{2}\right)^{k+1} - \left(-\tfrac{1}{2}\right)^{k+1}}{k+1} = \begin{cases} \dfrac{1}{2^k (k+1)} & \text{if } k \text{ is even} \\ 0 & \text{if } k \text{ is odd} \end{cases}$$
Thus

$$\mu_1^0 = 0, \quad \mu_2^0 = \frac{1}{2^2 \cdot 3} = \frac{1}{12}, \quad \mu_3^0 = 0, \quad \mu_4^0 = \frac{1}{2^4 \cdot 5} = \frac{1}{80}$$

Hence

$$\text{var}(X) = \mu_2^0 = \frac{1}{12}$$

The standard deviation:

$$\sigma = \sqrt{\text{var}(X)} = \sqrt{\frac{1}{12}}$$

The measure of skewness:

$$\gamma_1 = \frac{\mu_3^0}{\sigma^3} = 0$$

The measure of kurtosis:

$$\gamma_2 = \frac{\mu_4^0}{\sigma^4} - 3 = \frac{1/80}{(1/12)^2} - 3 = 1.8 - 3 = -1.2$$
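These values can be reproduced by direct quadrature; a sketch assuming scipy is available:

```python
from scipy.integrate import quad

mu = 0.5  # E(X) for Uniform(0, 1)

def central_moment(k):
    # mu_k^0 = integral of (x - mu)^k over [0, 1]
    return quad(lambda x: (x - mu)**k, 0.0, 1.0)[0]

var = central_moment(2)                  # 1/12
gamma1 = central_moment(3) / var**1.5    # 0: the distribution is symmetric
gamma2 = central_moment(4) / var**2 - 3  # (1/80)/(1/144) - 3 = -1.2
print(var, gamma1, gamma2)
```

The negative kurtosis measure reflects the flat, short-tailed (platykurtic) shape of the uniform density.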
Rules for expectation

Recall:

$$E[g(X)] = \begin{cases} \displaystyle\sum_{x} g(x)\, p(x) & \text{if } X \text{ is discrete} \\ \displaystyle\int_{-\infty}^{\infty} g(x) f(x)\, dx & \text{if } X \text{ is continuous} \end{cases}$$

Rule 1: E[c] = c, where c is a constant.

Proof: if g(X) = c, then

$$E[g(X)] = E[c] = \int_{-\infty}^{\infty} c f(x)\, dx = c \int_{-\infty}^{\infty} f(x)\, dx = c$$

The proof for discrete random variables is similar.
Rule 2: E[aX + b] = aE[X] + b, where a, b are constants.

Proof: if g(X) = aX + b, then

$$E[aX + b] = \int_{-\infty}^{\infty} (ax + b) f(x)\, dx = a \int_{-\infty}^{\infty} x f(x)\, dx + b \int_{-\infty}^{\infty} f(x)\, dx = aE(X) + b$$

The proof for discrete random variables is similar.
Rule 3:

$$\text{var}(X) = \mu_2^0 = E\left[(X-\mu)^2\right] = E\left(X^2\right) - \mu^2 = \mu_2 - \mu_1^2$$

Proof:

$$\text{var}(X) = E\left[(X-\mu)^2\right] = \int_{-\infty}^{\infty} (x-\mu)^2 f(x)\, dx = \int_{-\infty}^{\infty} \left(x^2 - 2\mu x + \mu^2\right) f(x)\, dx$$

$$= \int_{-\infty}^{\infty} x^2 f(x)\, dx - 2\mu \int_{-\infty}^{\infty} x f(x)\, dx + \mu^2 \int_{-\infty}^{\infty} f(x)\, dx$$

$$= E\left(X^2\right) - 2\mu^2 + \mu^2 = E\left(X^2\right) - \mu^2 = \mu_2 - \mu_1^2$$

The proof for discrete random variables is similar.
Rule 4: var(aX + b) = a² var(X).

Proof:

$$\mu_{aX+b} = E[aX + b] = aE[X] + b = a\mu + b$$

so

$$\text{var}(aX + b) = E\left[\left(aX + b - \mu_{aX+b}\right)^2\right] = E\left[\left(aX + b - (a\mu + b)\right)^2\right]$$

$$= E\left[a^2 (X - \mu)^2\right] = a^2 E\left[(X - \mu)^2\right] = a^2\, \text{var}(X)$$
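Rules 2 and 4 are easy to see empirically. A simulation sketch in which the exponential sample and the constants a, b are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=1_000_000)  # any distribution works here
a, b = 3.0, -1.0

print(np.mean(a * x + b), a * np.mean(x) + b)  # rule 2: the two agree
print(np.var(a * x + b), a**2 * np.var(x))     # rule 4: the two agree
```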
Moment generating functions

Recall:

$$E[g(X)] = \begin{cases} \displaystyle\sum_{x} g(x)\, p(x) & \text{if } X \text{ is discrete} \\ \displaystyle\int_{-\infty}^{\infty} g(x) f(x)\, dx & \text{if } X \text{ is continuous} \end{cases}$$

Definition

Let X denote a random variable. Then the moment generating function of X, m_X(t), is defined by:

$$m_X(t) = E\left[e^{tX}\right] = \begin{cases} \displaystyle\sum_{x} e^{tx}\, p(x) & \text{if } X \text{ is discrete} \\ \displaystyle\int_{-\infty}^{\infty} e^{tx} f(x)\, dx & \text{if } X \text{ is continuous} \end{cases}$$
Examples

1. The Binomial distribution (parameters p, n):

$$p(x) = \binom{n}{x} p^x (1-p)^{n-x} \quad x = 0, 1, 2, \dots, n$$

The moment generating function of X, m_X(t), is:

$$m_X(t) = E\left[e^{tX}\right] = \sum_{x} e^{tx}\, p(x) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1-p)^{n-x}$$

$$= \sum_{x=0}^{n} \binom{n}{x} \left(pe^t\right)^x (1-p)^{n-x} = \left(pe^t + 1 - p\right)^n$$

using the binomial theorem $\sum_{x=0}^{n} \binom{n}{x} a^x b^{n-x} = (a+b)^n$ with $a = pe^t$ and $b = 1-p$.
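A Monte Carlo check of this closed form; n, p, and t below are arbitrary illustrative values:

```python
import numpy as np

n, p, t = 10, 0.4, 0.3  # illustrative values
rng = np.random.default_rng(3)
x = rng.binomial(n, p, size=1_000_000)

print(np.mean(np.exp(t * x)))      # sample estimate of E[e^{tX}]
print((p * np.exp(t) + 1 - p)**n)  # the closed form above
```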
2. The Poisson distribution (parameter λ):

$$p(x) = \frac{\lambda^x e^{-\lambda}}{x!} \quad x = 0, 1, 2, \dots$$

The moment generating function of X, m_X(t), is:

$$m_X(t) = E\left[e^{tX}\right] = \sum_{x} e^{tx}\, p(x) = \sum_{x=0}^{\infty} e^{tx}\, \frac{\lambda^x e^{-\lambda}}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{\left(\lambda e^t\right)^x}{x!}$$

$$= e^{-\lambda}\, e^{\lambda e^t} = e^{\lambda\left(e^t - 1\right)} \quad \text{using } \sum_{x=0}^{\infty} \frac{u^x}{x!} = e^u \text{ with } u = \lambda e^t$$
3. The Exponential distribution (parameter λ):

$$f(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$$

The moment generating function of X, m_X(t), is:

$$m_X(t) = E\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \int_0^{\infty} e^{tx}\, \lambda e^{-\lambda x}\, dx$$

$$= \lambda \int_0^{\infty} e^{-(\lambda - t)x}\, dx = \lambda \left[\frac{e^{-(\lambda - t)x}}{-(\lambda - t)}\right]_0^{\infty} = \begin{cases} \dfrac{\lambda}{\lambda - t} & t < \lambda \\ \text{undefined} & t \ge \lambda \end{cases}$$
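A numerical check (note the t < λ restriction; numpy parameterizes the exponential by its mean, scale = 1/λ, and the values below are arbitrary):

```python
import numpy as np

lam, t = 2.0, 0.5  # must keep t < lam, otherwise E[e^{tX}] diverges
rng = np.random.default_rng(4)
x = rng.exponential(scale=1 / lam, size=1_000_000)

print(np.mean(np.exp(t * x)))  # sample estimate of the MGF at t
print(lam / (lam - t))         # closed form: 4/3
```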
4. The Standard Normal distribution (μ = 0, σ = 1):

$$f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x^2}{2}}$$

The moment generating function of X, m_X(t), is:

$$m_X(t) = E\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{tx} e^{-\frac{x^2}{2}}\, dx = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x^2 - 2tx}{2}}\, dx$$

Completing the square:

$$m_X(t) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x^2 - 2tx + t^2}{2}}\, e^{\frac{t^2}{2}}\, dx = e^{\frac{t^2}{2}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{(x-t)^2}{2}}\, dx$$

We will now use the fact that

$$\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\, a}\, e^{-\frac{(x-b)^2}{2a^2}}\, dx = 1 \quad \text{for all } a > 0 \text{ and all } b$$

so the remaining integral is 1 and

$$m_X(t) = e^{\frac{t^2}{2}}$$
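A quick simulation check of this closed form, with an arbitrary t:

```python
import numpy as np

t = 0.7  # arbitrary evaluation point
rng = np.random.default_rng(5)
z = rng.standard_normal(1_000_000)

print(np.mean(np.exp(t * z)))  # sample estimate of E[e^{tZ}]
print(np.exp(t**2 / 2))        # closed form, approximately 1.2776
```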
5. The Gamma distribution (parameters α, λ):

$$f(x) = \begin{cases} \dfrac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$$

The moment generating function of X, m_X(t), is:

$$m_X(t) = E\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \int_0^{\infty} e^{tx}\, \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}\, dx = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha-1} e^{-(\lambda - t)x}\, dx$$

We use the fact that

$$\int_0^{\infty} \frac{b^a}{\Gamma(a)}\, x^{a-1} e^{-bx}\, dx = 1 \quad \text{for all } a > 0, b > 0$$

Hence, with a = α and b = λ - t (for t < λ), the rearranged integral is equal to 1 and

$$m_X(t) = \frac{\lambda^{\alpha}}{(\lambda - t)^{\alpha}} \int_0^{\infty} \frac{(\lambda - t)^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-(\lambda - t)x}\, dx = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}$$
Properties of Moment Generating Functions

1. m_X(0) = 1.

$$m_X(t) = E\left[e^{tX}\right], \text{ hence } m_X(0) = E\left[e^{0 \cdot X}\right] = E[1] = 1$$

Note: the moment generating functions of the following distributions satisfy the property m_X(0) = 1:

i) Binomial distribution: $m_X(t) = \left(pe^t + 1 - p\right)^n$

ii) Poisson distribution: $m_X(t) = e^{\lambda\left(e^t - 1\right)}$

iii) Exponential distribution: $m_X(t) = \dfrac{\lambda}{\lambda - t}$

iv) Standard Normal distribution: $m_X(t) = e^{t^2/2}$

v) Gamma distribution: $m_X(t) = \left(\dfrac{\lambda}{\lambda - t}\right)^{\alpha}$
2.

$$m_X(t) = 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \mu_3 \frac{t^3}{3!} + \dots + \mu_k \frac{t^k}{k!} + \dots$$

We use the expansion of the exponential function:

$$e^u = 1 + u + \frac{u^2}{2!} + \frac{u^3}{3!} + \dots + \frac{u^k}{k!} + \dots$$

Then

$$m_X(t) = E\left[e^{tX}\right] = E\left[1 + tX + \frac{t^2}{2!} X^2 + \frac{t^3}{3!} X^3 + \dots + \frac{t^k}{k!} X^k + \dots\right]$$

$$= 1 + tE(X) + \frac{t^2}{2!} E\left(X^2\right) + \frac{t^3}{3!} E\left(X^3\right) + \dots + \frac{t^k}{k!} E\left(X^k\right) + \dots$$

$$= 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \mu_3 \frac{t^3}{3!} + \dots + \mu_k \frac{t^k}{k!} + \dots$$
3.

$$m_X^{(k)}(0) = \left.\frac{d^k}{dt^k}\, m_X(t)\right|_{t=0} = \mu_k$$

Now

$$m_X(t) = 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \mu_3 \frac{t^3}{3!} + \dots + \mu_k \frac{t^k}{k!} + \dots$$

so

$$m_X'(t) = \mu_1 + \mu_2 t + \mu_3 \frac{t^2}{2!} + \dots + \mu_k \frac{t^{k-1}}{(k-1)!} + \dots \quad \text{and } m_X'(0) = \mu_1$$

$$m_X''(t) = \mu_2 + \mu_3 t + \dots + \mu_k \frac{t^{k-2}}{(k-2)!} + \dots \quad \text{and } m_X''(0) = \mu_2$$

Continuing, we find $m_X^{(k)}(0) = \mu_k$.
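Property 3 can also be exploited numerically: differentiating an MGF at t = 0 by finite differences recovers the moments. A sketch using the exponential MGF λ/(λ - t) derived above; the step size h and the value of λ are arbitrary choices:

```python
lam = 2.0
m = lambda t: lam / (lam - t)  # exponential MGF, valid for t < lam

h = 1e-4
mu1 = (m(h) - m(-h)) / (2 * h)          # central difference for m'(0)
mu2 = (m(h) - 2 * m(0) + m(-h)) / h**2  # central difference for m''(0)

print(mu1)  # approximately 1/lam = 0.5
print(mu2)  # approximately 2/lam**2 = 0.5
```

These match the exponential moments μ_k = k!/λ^k worked out analytically below.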
Property 3 is very useful in determining the moments of a random variable X.

Examples

i) Binomial distribution: $m_X(t) = \left(pe^t + 1 - p\right)^n$

$$m_X'(t) = n\left(pe^t + 1 - p\right)^{n-1} pe^t$$

$$\mu_1 = m_X'(0) = n\left(p + 1 - p\right)^{n-1} p = np$$

$$m_X''(t) = np\left[(n-1)\left(pe^t + 1 - p\right)^{n-2} pe^t\, e^t + \left(pe^t + 1 - p\right)^{n-1} e^t\right]$$

$$= npe^t \left(pe^t + 1 - p\right)^{n-2}\left[(n-1)pe^t + \left(pe^t + 1 - p\right)\right]$$

$$\mu_2 = m_X''(0) = np\left[(n-1)p + 1\right] = np\left[np + (1-p)\right] = np\left[np + q\right] = n^2p^2 + npq$$

(where q = 1 - p).
ii) Poisson distribution: $m_X(t) = e^{\lambda\left(e^t - 1\right)}$

$$m_X'(t) = \lambda e^t\, e^{\lambda\left(e^t - 1\right)} = \lambda\, e^{\lambda\left(e^t - 1\right) + t}$$

$$m_X''(t) = \lambda\, e^{\lambda\left(e^t - 1\right) + t}\left(\lambda e^t + 1\right) = \lambda^2 e^{\lambda\left(e^t - 1\right) + 2t} + \lambda\, e^{\lambda\left(e^t - 1\right) + t}$$

$$m_X'''(t) = \lambda^2 e^{\lambda\left(e^t - 1\right) + 2t}\left(\lambda e^t + 2\right) + \lambda\, e^{\lambda\left(e^t - 1\right) + t}\left(\lambda e^t + 1\right) = \lambda^3 e^{\lambda\left(e^t - 1\right) + 3t} + 3\lambda^2 e^{\lambda\left(e^t - 1\right) + 2t} + \lambda\, e^{\lambda\left(e^t - 1\right) + t}$$

To find the moments we set t = 0:

$$\mu_1 = m_X'(0) = \lambda$$

$$\mu_2 = m_X''(0) = \lambda^2 + \lambda$$

$$\mu_3 = m_X'''(0) = \lambda^3 + 3\lambda^2 + \lambda$$
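A simulation spot-check of these three Poisson moments, with an arbitrary λ:

```python
import numpy as np

lam = 2.0
rng = np.random.default_rng(6)
x = rng.poisson(lam, size=1_000_000).astype(float)

print(np.mean(x), lam)                           # mu_1
print(np.mean(x**2), lam**2 + lam)               # mu_2
print(np.mean(x**3), lam**3 + 3 * lam**2 + lam)  # mu_3
```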
iii) Exponential distribution: $m_X(t) = \dfrac{\lambda}{\lambda - t}$

$$m_X'(t) = \frac{d}{dt}\left[\lambda (\lambda - t)^{-1}\right] = \lambda (-1)(\lambda - t)^{-2}(-1) = \lambda (\lambda - t)^{-2}$$

$$m_X''(t) = (2)\lambda (\lambda - t)^{-3}$$

$$m_X'''(t) = (2)(3)\lambda (\lambda - t)^{-4}$$

$$m_X^{(4)}(t) = (2)(3)(4)\lambda (\lambda - t)^{-5} = 4!\, \lambda (\lambda - t)^{-5}$$

Thus

$$m_X^{(k)}(t) = k!\, \lambda (\lambda - t)^{-(k+1)}$$
Setting t = 0:

$$\mu_1 = m_X'(0) = \lambda\, \lambda^{-2} = \frac{1}{\lambda}$$

$$\mu_2 = m_X''(0) = 2\lambda\, \lambda^{-3} = \frac{2}{\lambda^2}$$

and in general

$$\mu_k = m_X^{(k)}(0) = k!\, \lambda\, \lambda^{-(k+1)} = \frac{k!}{\lambda^k}$$
The moments for the exponential distribution can be calculated in an alternative way. This is done by expanding m_X(t) in powers of t and equating the coefficients of t^k to the coefficients in:

$$m_X(t) = 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \mu_3 \frac{t^3}{3!} + \dots + \mu_k \frac{t^k}{k!} + \dots$$

Expanding, with u = t/λ:

$$m_X(t) = \frac{\lambda}{\lambda - t} = \frac{1}{1 - t/\lambda} = \frac{1}{1 - u} = 1 + u + u^2 + u^3 + \dots = 1 + \frac{t}{\lambda} + \frac{t^2}{\lambda^2} + \frac{t^3}{\lambda^3} + \dots$$

Equating the coefficients of t^k we get:

$$\frac{\mu_k}{k!} = \frac{1}{\lambda^k} \quad \text{or} \quad \mu_k = \frac{k!}{\lambda^k}$$
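Either route gives μ_k = k!/λ^k, which can also be confirmed by integrating x^k against the exponential density directly; a sketch assuming scipy is available, with arbitrary k and λ:

```python
from math import exp, factorial
from scipy.integrate import quad

lam, k = 1.5, 4  # arbitrary illustrative values
mu_k, _ = quad(lambda x: x**k * lam * exp(-lam * x), 0.0, float("inf"))

print(mu_k, factorial(k) / lam**k)  # both approximately 4.7407
```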
The moments for the standard normal distribution: $m_X(t) = e^{t^2/2}$

We use the expansion of e^u:

$$e^u = \sum_{k=0}^{\infty} \frac{u^k}{k!} = 1 + u + \frac{u^2}{2!} + \frac{u^3}{3!} + \dots + \frac{u^k}{k!} + \dots$$

$$m_X(t) = e^{\frac{t^2}{2}} = 1 + \left(\frac{t^2}{2}\right) + \frac{1}{2!}\left(\frac{t^2}{2}\right)^2 + \frac{1}{3!}\left(\frac{t^2}{2}\right)^3 + \dots + \frac{1}{k!}\left(\frac{t^2}{2}\right)^k + \dots$$

$$= 1 + \frac{1}{2}\, t^2 + \frac{1}{2^2\, 2!}\, t^4 + \frac{1}{2^3\, 3!}\, t^6 + \dots + \frac{1}{2^k\, k!}\, t^{2k} + \dots$$
We now equate the coefficients of t^k in:

$$m_X(t) = 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \dots + \mu_k \frac{t^k}{k!} + \dots$$

If k is odd: μ_k = 0.

For even 2k:

$$\frac{\mu_{2k}}{(2k)!} = \frac{1}{2^k\, k!} \quad \text{or} \quad \mu_{2k} = \frac{(2k)!}{2^k\, k!}$$

Thus

$$\mu_1 = 0, \quad \mu_2 = 1, \quad \mu_3 = 0, \quad \mu_4 = \frac{4!}{2^2\, 2!} = 3$$
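The even moments 1, 3, 15, … can be checked by simulation; a final Python sketch:

```python
from math import factorial
import numpy as np

rng = np.random.default_rng(7)
z = rng.standard_normal(2_000_000)

for k in (1, 2, 3):
    exact = factorial(2 * k) / (2**k * factorial(k))  # 1, 3, 15
    print(np.mean(z**(2 * k)), exact)
```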