
Functions of Random Variables

Methods for determining the distribution of


functions of Random Variables
1. Distribution function method
2. Moment generating function method
3. Transformation method

Distribution function method


Let X, Y, Z, \dots have joint density f(x, y, z, \dots), and let W = h(X, Y, Z, \dots).

First step
Find the distribution function of W:
G(w) = P[W \le w] = P[h(X, Y, Z, \dots) \le w]
Second step
Find the density function of W:
g(w) = G'(w).

Example 1
Let X have a normal distribution with mean 0 and variance 1 (the standard normal distribution):
f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}
Let W = X^2.
Find the distribution of W.
First step
Find the distribution function of W:
G(w) = P[W \le w] = P[X^2 \le w]
     = P[-\sqrt{w} \le X \le \sqrt{w}] = F(\sqrt{w}) - F(-\sqrt{w})   if w \ge 0
where
F(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt, \qquad F'(x) = f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}.

Second step
Find the density function of W:
g(w) = G'(w) = \frac{d}{dw}\Big[F(\sqrt{w}) - F(-\sqrt{w})\Big]
     = f(\sqrt{w})\,\tfrac{1}{2}w^{-1/2} + f(-\sqrt{w})\,\tfrac{1}{2}w^{-1/2}
     = \tfrac{1}{2}w^{-1/2}\left[\frac{1}{\sqrt{2\pi}}\, e^{-w/2} + \frac{1}{\sqrt{2\pi}}\, e^{-w/2}\right]
     = \frac{1}{\sqrt{2\pi}}\, w^{-1/2} e^{-w/2}   if w \ge 0.

Thus if X has a standard normal distribution, then W = X^2 has density
g(w) = \frac{1}{\sqrt{2\pi}}\, w^{-1/2} e^{-w/2}   if w \ge 0.
This is the Gamma distribution with \alpha = \tfrac{1}{2} and \lambda = \tfrac{1}{2}.
It is also the \chi^2 distribution with \nu = 1 degree of freedom.
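This conclusion can be checked by simulation. The following is a quick sketch (not part of the original notes) using only the Python standard library; the seed and sample size are arbitrary choices. The chi-square(1) distribution has mean 1 and variance 2.

```python
import random
import statistics

# If X is standard normal, W = X^2 should have the chi-square(1)
# moments E[W] = 1 and Var[W] = 2.
random.seed(0)
w = [random.gauss(0.0, 1.0) ** 2 for _ in range(200_000)]

mean_w = statistics.fmean(w)
var_w = statistics.pvariance(w)
```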

Example 2
Suppose that X and Y are independent random variables, each having an exponential distribution with parameter \lambda (mean 1/\lambda):
f_1(x) = \lambda e^{-\lambda x}   for x \ge 0
f_2(y) = \lambda e^{-\lambda y}   for y \ge 0
f(x, y) = f_1(x)\, f_2(y) = \lambda^2 e^{-\lambda(x+y)}   for x \ge 0, y \ge 0

Let W = X + Y.
Find the distribution of W.

First step
Find the distribution function of W = X + Y:
G(w) = P[W \le w] = P[X + Y \le w]
     = \int_0^w \int_0^{w-x} f_1(x)\, f_2(y)\, dy\, dx
     = \int_0^w \int_0^{w-x} \lambda^2 e^{-\lambda(x+y)}\, dy\, dx
     = \int_0^w \lambda e^{-\lambda x} \left[\int_0^{w-x} \lambda e^{-\lambda y}\, dy\right] dx
     = \int_0^w \lambda e^{-\lambda x} \left(1 - e^{-\lambda(w-x)}\right) dx
     = \int_0^w \left(\lambda e^{-\lambda x} - \lambda e^{-\lambda w}\right) dx
     = \left[-e^{-\lambda x} - \lambda x\, e^{-\lambda w}\right]_{x=0}^{x=w}
     = 1 - e^{-\lambda w} - \lambda w\, e^{-\lambda w}   for w \ge 0.

Second step
Find the density function of W:
g(w) = G'(w) = \frac{d}{dw}\left[1 - e^{-\lambda w} - \lambda w\, e^{-\lambda w}\right]
     = \lambda e^{-\lambda w} - \lambda e^{-\lambda w} + \lambda^2 w\, e^{-\lambda w}
     = \lambda^2 w\, e^{-\lambda w}   for w \ge 0.

Hence if X and Y are independent random variables, each having an exponential distribution with parameter \lambda, then W = X + Y has density
g(w) = \lambda^2 w\, e^{-\lambda w}   for w \ge 0.
This can be recognized as the Gamma distribution with parameters \alpha = 2 and \lambda.
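A quick simulation sketch (not from the original notes) can confirm this. The Gamma(\alpha = 2, \lambda) distribution has mean 2/\lambda and variance 2/\lambda^2; the value \lambda = 1.5 below is an arbitrary choice.

```python
import random
import statistics

# W = X + Y for independent Exponential(lam) draws should match the
# Gamma(alpha=2, lam) moments: mean 2/lam and variance 2/lam**2.
random.seed(1)
lam = 1.5
w = [random.expovariate(lam) + random.expovariate(lam) for _ in range(200_000)]

mean_w = statistics.fmean(w)
var_w = statistics.pvariance(w)
```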

Example: Student's t distribution

Let Z and U be two independent random variables with:
1. Z having a standard normal distribution, and
2. U having a \chi^2 distribution with \nu degrees of freedom.
Find the distribution of
T = \frac{Z}{\sqrt{U/\nu}}

The density of Z is:
f(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}
The density of U is:
h(u) = \frac{1}{2^{\nu/2}\,\Gamma(\nu/2)}\, u^{\nu/2 - 1} e^{-u/2}   for u > 0.
Therefore the joint density of Z and U is:
f(z, u) = f(z)\, h(u) = \frac{1}{\sqrt{2\pi}\; 2^{\nu/2}\,\Gamma(\nu/2)}\, u^{\nu/2 - 1} e^{-(z^2 + u)/2}

The distribution function of T is:
G(t) = P[T \le t] = P\!\left[\frac{Z}{\sqrt{U/\nu}} \le t\right] = P\!\left[Z \le t\sqrt{U/\nu}\right]
Therefore:
G(t) = \int_0^\infty \int_{-\infty}^{t\sqrt{u/\nu}} \frac{1}{\sqrt{2\pi}\; 2^{\nu/2}\,\Gamma(\nu/2)}\, u^{\nu/2 - 1} e^{-(z^2 + u)/2}\, dz\, du
[Figure: illustration of the region of integration z \le t\sqrt{u/\nu} in the (z, u) plane, for t > 0 and t < 0.]

Now:
G(t) = \int_0^\infty \left[\int_{-\infty}^{t\sqrt{u/\nu}} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz\right] \frac{1}{2^{\nu/2}\,\Gamma(\nu/2)}\, u^{\nu/2 - 1} e^{-u/2}\, du
and:
g(t) = \frac{d}{dt}\, G(t) = \int_0^\infty \frac{d}{dt}\left[\int_{-\infty}^{t\sqrt{u/\nu}} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz\right] \frac{1}{2^{\nu/2}\,\Gamma(\nu/2)}\, u^{\nu/2 - 1} e^{-u/2}\, du

Using differentiation under the integral sign,
\frac{d}{dt} \int_a^b F(x, t)\, dx = \int_a^b \frac{\partial}{\partial t} F(x, t)\, dx,
and the fundamental theorem of calculus,
if F(x) = \int_{-\infty}^{x} f(t)\, dt then F'(x) = f(x),
we get
\frac{d}{dt} \int_{-\infty}^{t\sqrt{u/\nu}} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = \frac{1}{\sqrt{2\pi}}\, e^{-t^2 u/(2\nu)}\, \sqrt{\frac{u}{\nu}}
so that
g(t) = \int_0^\infty \frac{1}{\sqrt{2\pi}\; 2^{\nu/2}\,\Gamma(\nu/2)} \sqrt{\frac{u}{\nu}}\; u^{\nu/2 - 1}\, e^{-\frac{u}{2}\left(1 + \frac{t^2}{\nu}\right)}\, du

Hence, using the gamma integral
\int_0^\infty x^{a-1} e^{-bx}\, dx = \frac{\Gamma(a)}{b^a}
with a = \frac{\nu + 1}{2} and b = \frac{1}{2}\left(1 + \frac{t^2}{\nu}\right), we get
g(t) = \frac{1}{\sqrt{2\pi\nu}\; 2^{\nu/2}\,\Gamma(\nu/2)} \cdot \frac{\Gamma\!\left(\frac{\nu+1}{2}\right)}{\left[\frac{1}{2}\left(1 + \frac{t^2}{\nu}\right)\right]^{\frac{\nu+1}{2}}}
     = \frac{\Gamma\!\left(\frac{\nu+1}{2}\right)}{\sqrt{\pi\nu}\;\Gamma\!\left(\frac{\nu}{2}\right)} \left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}}

or
g(t) = K \left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}}   (Student's t distribution)
where
K = \frac{\Gamma\!\left(\frac{\nu+1}{2}\right)}{\sqrt{\pi\nu}\;\Gamma\!\left(\frac{\nu}{2}\right)}

"Student" = W. S. Gosset

Gosset worked for the Guinness brewery, which did not allow its employees to publish, so he published under the pseudonym "Student".
[Figure: the t distribution compared with the standard normal distribution.]
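The density derived above is easy to evaluate numerically. A minimal sketch (not from the original notes): for \nu = 1 the t density reduces to the Cauchy density 1/(\pi(1 + t^2)), and for large \nu it approaches the standard normal density.

```python
import math

def t_density(t, nu):
    # g(t) = K * (1 + t^2/nu)^(-(nu+1)/2),
    # with K = Gamma((nu+1)/2) / (sqrt(pi*nu) * Gamma(nu/2))
    k = math.gamma((nu + 1) / 2) / (math.sqrt(math.pi * nu) * math.gamma(nu / 2))
    return k * (1 + t * t / nu) ** (-(nu + 1) / 2)

def std_normal_density(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

cauchy_at_0 = t_density(0.0, 1)     # exact value 1/pi
near_normal = t_density(0.0, 200)   # close to 1/sqrt(2*pi)
```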

Distribution of the Max and Min Statistics

Let x_1, x_2, \dots, x_n denote a sample of size n from the density f(x).
Let M = \max(x_i); determine the distribution of M.
Repeat the computation for m = \min(x_i).
Assume that the density is the uniform density from 0 to \theta.

Hence
f(x) = \begin{cases} 1/\theta & 0 \le x \le \theta \\ 0 & \text{elsewhere} \end{cases}
and the distribution function is
F(x) = P[X \le x] = \begin{cases} 0 & x \le 0 \\ x/\theta & 0 \le x \le \theta \\ 1 & x \ge \theta \end{cases}

Finding the distribution function of M:
G(t) = P[M \le t] = P[\max x_i \le t]
     = P[x_1 \le t, \dots, x_n \le t]
     = P[x_1 \le t] \cdots P[x_n \le t] = F(t)^n
     = \begin{cases} 0 & t \le 0 \\ (t/\theta)^n & 0 \le t \le \theta \\ 1 & t \ge \theta \end{cases}
Differentiating, we find the density function of M:
g(t) = G'(t) = \begin{cases} n t^{n-1}/\theta^n & 0 \le t \le \theta \\ 0 & \text{otherwise} \end{cases}
[Figure: f(x) and the density g(t) of the maximum.]

Finding the distribution function of m:
G(t) = P[m \le t] = 1 - P[\min x_i > t]
     = 1 - P[x_1 > t, \dots, x_n > t]
     = 1 - P[x_1 > t] \cdots P[x_n > t] = 1 - \left(1 - F(t)\right)^n
     = \begin{cases} 0 & t \le 0 \\ 1 - (1 - t/\theta)^n & 0 \le t \le \theta \\ 1 & t \ge \theta \end{cases}
Differentiating, we find the density function of m:
g(t) = G'(t) = \begin{cases} \frac{n}{\theta}\left(1 - t/\theta\right)^{n-1} & 0 \le t \le \theta \\ 0 & \text{otherwise} \end{cases}
[Figure: f(x) and the density g(t) of the minimum.]
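These densities imply E[M] = n\theta/(n+1) and E[m] = \theta/(n+1), which a short simulation sketch (not from the original notes) can check; \theta = 2 and n = 5 are arbitrary choices.

```python
import random
import statistics

# For n draws from Uniform(0, theta): E[max] = n*theta/(n+1), E[min] = theta/(n+1).
random.seed(2)
theta, n = 2.0, 5
samples = [[random.uniform(0.0, theta) for _ in range(n)] for _ in range(100_000)]

mean_max = statistics.fmean(max(s) for s in samples)
mean_min = statistics.fmean(min(s) for s in samples)
```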

The probability integral transformation


This transformation allows one to convert
observations that come from a uniform
distribution from 0 to 1 to observations that
come from an arbitrary distribution.
Let U denote an observation having a uniform distribution from 0 to 1:
g(u) = \begin{cases} 1 & 0 \le u \le 1 \\ 0 & \text{elsewhere} \end{cases}
Let f(x) denote an arbitrary density function and F(x) its corresponding cumulative distribution function.
Let X = F^{-1}(U). Find the distribution of X.
G(x) = P[X \le x] = P\!\left[F^{-1}(U) \le x\right] = P[U \le F(x)] = F(x)
Hence
g(x) = G'(x) = F'(x) = f(x)
Thus if U has a uniform distribution from 0 to 1, then X = F^{-1}(U) has density f(x).
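This is the basis of inverse-CDF sampling. A minimal sketch (not from the original notes): for the exponential distribution, F(x) = 1 - e^{-\lambda x}, so F^{-1}(u) = -\ln(1 - u)/\lambda turns uniform draws into exponential draws; \lambda = 2 is an arbitrary choice.

```python
import math
import random
import statistics

# Probability integral transformation: X = F^{-1}(U) for Exponential(lam),
# i.e. X = -ln(1 - U) / lam, should have mean 1/lam.
random.seed(3)
lam = 2.0
x = [-math.log(1.0 - random.random()) / lam for _ in range(200_000)]

mean_x = statistics.fmean(x)
```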

The Transformation Method


Theorem
Let X denote a random variable with probability density function f(x), and let U = h(X).
Assume that h(x) is either strictly increasing or strictly decreasing. Then the probability density of U is:
g(u) = f\!\left(h^{-1}(u)\right) \left|\frac{d\, h^{-1}(u)}{du}\right| = f(x) \left|\frac{dx}{du}\right|

Proof
Use the distribution function method.
Step 1: find the distribution function G(u).
Step 2: differentiate G(u) to find the probability density function g(u).

G(u) = P[U \le u] = P[h(X) \le u]
     = \begin{cases} P[X \le h^{-1}(u)] = F\!\left(h^{-1}(u)\right) & h \text{ strictly increasing} \\ P[X \ge h^{-1}(u)] = 1 - F\!\left(h^{-1}(u)\right) & h \text{ strictly decreasing} \end{cases}
hence
g(u) = G'(u) = \begin{cases} f\!\left(h^{-1}(u)\right) \frac{d\, h^{-1}(u)}{du} & h \text{ strictly increasing} \\ -f\!\left(h^{-1}(u)\right) \frac{d\, h^{-1}(u)}{du} & h \text{ strictly decreasing} \end{cases}
or, in either case,
g(u) = f\!\left(h^{-1}(u)\right) \left|\frac{d\, h^{-1}(u)}{du}\right| = f(x) \left|\frac{dx}{du}\right|

Example
Suppose that X has a normal distribution with mean \mu and variance \sigma^2.
Find the distribution of U = h(X) = e^X.
Solution:
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x - \mu)^2}{2\sigma^2}}
h^{-1}(u) = \ln u \quad and \quad \frac{d\, h^{-1}(u)}{du} = \frac{d \ln u}{du} = \frac{1}{u}
hence
g(u) = f\!\left(h^{-1}(u)\right) \left|\frac{d\, h^{-1}(u)}{du}\right| = \frac{1}{\sqrt{2\pi}\,\sigma}\, \frac{1}{u}\, e^{-\frac{(\ln u - \mu)^2}{2\sigma^2}}   for u > 0.
This distribution is called the log-normal distribution.
[Figure: the log-normal distribution.]
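A simulation sketch (not from the original notes): a known consequence of this density is E[U] = e^{\mu + \sigma^2/2}, which we can check by exponentiating normal draws; the parameter values are arbitrary.

```python
import math
import random
import statistics

# U = e^X for X ~ N(mu, sigma^2) is log-normal with E[U] = exp(mu + sigma^2/2).
random.seed(4)
mu, sigma = 0.5, 0.4
u = [math.exp(random.gauss(mu, sigma)) for _ in range(200_000)]

mean_u = statistics.fmean(u)
expected_mean = math.exp(mu + sigma ** 2 / 2)
```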

The Transformation Method (many variables)

Theorem
Let x_1, x_2, \dots, x_n denote random variables with joint probability density function f(x_1, x_2, \dots, x_n).
Let
u_1 = h_1(x_1, x_2, \dots, x_n)
u_2 = h_2(x_1, x_2, \dots, x_n)
\vdots
u_n = h_n(x_1, x_2, \dots, x_n)
define an invertible transformation from the x's to the u's.
Then the joint probability density function of u_1, u_2, \dots, u_n is given by:
g(u_1, \dots, u_n) = f(x_1, \dots, x_n) \left|\frac{\partial(x_1, \dots, x_n)}{\partial(u_1, \dots, u_n)}\right| = f(x_1, \dots, x_n)\, |J|
where J is the Jacobian of the transformation:
J = \frac{\partial(x_1, \dots, x_n)}{\partial(u_1, \dots, u_n)} = \det \begin{pmatrix} \frac{\partial x_1}{\partial u_1} & \cdots & \frac{\partial x_1}{\partial u_n} \\ \vdots & & \vdots \\ \frac{\partial x_n}{\partial u_1} & \cdots & \frac{\partial x_n}{\partial u_n} \end{pmatrix}

Example
Suppose that x_1, x_2 are independent with density functions f_1(x_1) and f_2(x_2).
Find the distribution of
u_1 = x_1 + x_2
u_2 = x_1 - x_2
Solving for x_1 and x_2, we get the inverse transformation:
x_1 = \frac{u_1 + u_2}{2}, \qquad x_2 = \frac{u_1 - u_2}{2}
The Jacobian of the transformation is
J = \frac{\partial(x_1, x_2)}{\partial(u_1, u_2)} = \det \begin{pmatrix} \frac{\partial x_1}{\partial u_1} & \frac{\partial x_1}{\partial u_2} \\ \frac{\partial x_2}{\partial u_1} & \frac{\partial x_2}{\partial u_2} \end{pmatrix} = \det \begin{pmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{2} & -\tfrac{1}{2} \end{pmatrix} = \left(\tfrac{1}{2}\right)\!\left(-\tfrac{1}{2}\right) - \left(\tfrac{1}{2}\right)\!\left(\tfrac{1}{2}\right) = -\tfrac{1}{2}

The joint density of x_1, x_2 is
f(x_1, x_2) = f_1(x_1)\, f_2(x_2)
Hence the joint density of u_1 and u_2 is:
g(u_1, u_2) = f(x_1, x_2)\, |J| = f_1\!\left(\frac{u_1 + u_2}{2}\right) f_2\!\left(\frac{u_1 - u_2}{2}\right) \frac{1}{2}
From this we can determine the distribution of u_1 = x_1 + x_2:
g_1(u_1) = \int_{-\infty}^{\infty} g(u_1, u_2)\, du_2 = \int_{-\infty}^{\infty} f_1\!\left(\frac{u_1 + u_2}{2}\right) f_2\!\left(\frac{u_1 - u_2}{2}\right) \frac{1}{2}\, du_2

Put v = \frac{u_1 + u_2}{2}; then \frac{u_1 - u_2}{2} = u_1 - v and \frac{dv}{du_2} = \frac{1}{2}.
Hence
g_1(u_1) = \int_{-\infty}^{\infty} f_1\!\left(\frac{u_1 + u_2}{2}\right) f_2\!\left(\frac{u_1 - u_2}{2}\right) \frac{1}{2}\, du_2 = \int_{-\infty}^{\infty} f_1(v)\, f_2(u_1 - v)\, dv
This is called the convolution of the two densities f_1 and f_2.
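A discrete sketch of the convolution formula (not from the original notes): convolving two Uniform(0, 1) densities by a Riemann sum reproduces the triangular density on (0, 2), whose exact values are 1 at u = 1 and 0.5 at u = 0.5.

```python
# Riemann-sum approximation of g1(u) = integral of f1(v) * f2(u - v) dv
# for f1 = f2 = the Uniform(0, 1) density.
def f_uniform(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def g1(u, dv=0.0005):
    steps = int(2.0 / dv)
    return sum(f_uniform(i * dv) * f_uniform(u - i * dv) for i in range(steps)) * dv

peak = g1(1.0)   # apex of the triangular density, exact value 1
side = g1(0.5)   # exact value 0.5
```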

Example: The ex-Gaussian distribution


Let X and Y be two independent random variables such that:
1. X has an exponential distribution with parameter \lambda.
2. Y has a normal (Gaussian) distribution with mean \mu and standard deviation \sigma.
Find the distribution of U = X + Y.
This distribution is used in psychology as a model for the response time to perform a task.

Now
f_1(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}
and
f_2(y) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(y - \mu)^2}{2\sigma^2}}
The density of U = X + Y is:
g(u) = \int_{-\infty}^{\infty} f_1(v)\, f_2(u - v)\, dv = \int_0^\infty \lambda e^{-\lambda v}\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(u - v - \mu)^2}{2\sigma^2}}\, dv

or, combining the exponents and completing the square in v:
g(u) = \frac{\lambda}{\sqrt{2\pi}\,\sigma} \int_0^\infty e^{-\lambda v - \frac{(u - v - \mu)^2}{2\sigma^2}}\, dv
Since
\lambda v + \frac{(u - v - \mu)^2}{2\sigma^2} = \frac{\left(v - (u - \mu - \lambda\sigma^2)\right)^2}{2\sigma^2} + \lambda(u - \mu) - \frac{\lambda^2\sigma^2}{2},
we have
g(u) = \lambda\, e^{-\lambda(u - \mu) + \frac{\lambda^2\sigma^2}{2}} \int_0^\infty \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{\left(v - (u - \mu - \lambda\sigma^2)\right)^2}{2\sigma^2}}\, dv
     = \lambda\, e^{-\lambda(u - \mu) + \frac{\lambda^2\sigma^2}{2}}\; P[V \ge 0]

where V has a normal distribution with mean \mu_V = u - \mu - \lambda\sigma^2 and variance \sigma^2.
Hence
g(u) = \lambda\, e^{-\lambda(u - \mu) + \frac{\lambda^2\sigma^2}{2}}\; \Phi\!\left(\frac{u - \mu - \lambda\sigma^2}{\sigma}\right)
where \Phi(z) is the cdf of the standard normal distribution.

The ex-Gaussian distribution
[Figure: the ex-Gaussian density g(u).]
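A simulation sketch (not from the original notes): the moments of U follow directly from U = X + Y, namely E[U] = \mu + 1/\lambda and Var[U] = \sigma^2 + 1/\lambda^2; the parameter values below are arbitrary.

```python
import random
import statistics

# Ex-Gaussian: U = X + Y with X ~ Exponential(lam), Y ~ N(mu, sigma^2);
# E[U] = mu + 1/lam and Var[U] = sigma^2 + 1/lam^2.
random.seed(5)
lam, mu, sigma = 1.0, 0.3, 0.2
u = [random.expovariate(lam) + random.gauss(mu, sigma) for _ in range(200_000)]

mean_u = statistics.fmean(u)
var_u = statistics.pvariance(u)
```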

Use of moment generating functions

Definition
Let X denote a random variable with probability density function f(x) if continuous (probability mass function p(x) if discrete).
Then the moment generating function of X is
m_X(t) = E\!\left[e^{tX}\right] = \begin{cases} \int_{-\infty}^{\infty} e^{tx} f(x)\, dx & \text{if } X \text{ is continuous} \\ \sum_x e^{tx} p(x) & \text{if } X \text{ is discrete} \end{cases}

The distribution of a random variable X is described by


either
1. The density function f(x) if X continuous (probability mass function p(x) if
X discrete), or
2. The cumulative distribution function F(x), or
3. The moment generating function mX(t)

Properties
1. m_X(0) = 1
2. m_X^{(k)}(0) = \frac{d^k m_X(t)}{dt^k}\Big|_{t=0} = \mu_k = E[X^k]
3. \mu_k = E[X^k] = \begin{cases} \int_{-\infty}^{\infty} x^k f(x)\, dx & X \text{ continuous} \\ \sum_x x^k p(x) & X \text{ discrete} \end{cases}
so that
m_X(t) = 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \mu_3 \frac{t^3}{3!} + \cdots + \mu_k \frac{t^k}{k!} + \cdots

4. Let X be a random variable with moment generating function m_X(t), and let Y = bX + a. Then
m_Y(t) = m_{bX+a}(t) = E\!\left(e^{(bX+a)t}\right) = e^{at}\, E\!\left(e^{X(bt)}\right) = e^{at}\, m_X(bt)
5. Let X and Y be two independent random variables with moment generating functions m_X(t) and m_Y(t). Then
m_{X+Y}(t) = E\!\left(e^{(X+Y)t}\right) = E\!\left(e^{Xt} e^{Yt}\right) = E\!\left(e^{Xt}\right) E\!\left(e^{Yt}\right) = m_X(t)\, m_Y(t)

6. Let X and Y be two random variables with moment generating functions m_X(t) and m_Y(t) and distribution functions F_X(x) and F_Y(y) respectively.
If m_X(t) = m_Y(t), then F_X(x) = F_Y(x).
This ensures that the distribution of a random variable can be identified by its moment generating function.

[Tables: moment generating functions of the common continuous and discrete distributions.]

Moment generating function of the gamma distribution

m_X(t) = E\!\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx
where
f(x) = \begin{cases} \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha - 1} e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}
Thus
m_X(t) = \int_0^\infty e^{tx}\, \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha - 1} e^{-\lambda x}\, dx = \frac{\lambda^\alpha}{\Gamma(\alpha)} \int_0^\infty x^{\alpha - 1} e^{-(\lambda - t)x}\, dx
Using
\int_0^\infty \frac{b^a}{\Gamma(a)}\, x^{a-1} e^{-bx}\, dx = 1, \quad or \quad \int_0^\infty x^{a-1} e^{-bx}\, dx = \frac{\Gamma(a)}{b^a},
we get, for t < \lambda,
m_X(t) = \frac{\lambda^\alpha}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha)}{(\lambda - t)^\alpha} = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}

Moment generating function of the standard normal distribution

m_X(t) = E\!\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx
where
f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}
Thus
m_X(t) = \int_{-\infty}^{\infty} e^{tx}\, \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\, dx = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x^2 - 2tx}{2}}\, dx
We will use the fact that
\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\, b}\, e^{-\frac{(x - a)^2}{2b^2}}\, dx = 1.
Completing the square, x^2 - 2tx = (x - t)^2 - t^2, so
m_X(t) = e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{(x - t)^2}{2}}\, dx = e^{t^2/2}

Note: using the series
e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \frac{x^4}{4!} + \cdots,
we have
m_X(t) = e^{t^2/2} = 1 + \frac{t^2}{2} + \frac{(t^2/2)^2}{2!} + \frac{(t^2/2)^3}{3!} + \cdots = 1 + \frac{t^2}{2} + \frac{t^4}{2^2\, 2!} + \frac{t^6}{2^3\, 3!} + \cdots + \frac{t^{2m}}{2^m\, m!} + \cdots
Also
m_X(t) = 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \mu_3 \frac{t^3}{3!} + \cdots, \quad where\ \mu_k = E[X^k] = \int_{-\infty}^{\infty} x^k f(x)\, dx\ is\ the\ k\text{th}\ moment.
Equating coefficients of t^k, we get
\mu_k = 0 if k is odd, and for k = 2m,
\frac{\mu_{2m}}{(2m)!} = \frac{1}{2^m\, m!}, \quad i.e.\ \mu_{2m} = \frac{(2m)!}{2^m\, m!}.
Hence \mu_1 = 0,\ \mu_2 = 1,\ \mu_3 = 0,\ \mu_4 = 3.
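The even-moment formula is easy to check numerically. A minimal sketch (not from the original notes), reproducing \mu_2 = 1, \mu_4 = 3, and the next value \mu_6 = 15:

```python
import math

# Even moments of the standard normal: mu_{2m} = (2m)! / (2^m * m!).
def even_moment(m):
    return math.factorial(2 * m) // (2 ** m * math.factorial(m))

moments = [even_moment(m) for m in range(1, 4)]
print(moments)  # [1, 3, 15]  (mu_2, mu_4, mu_6)
```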

Using moment generating functions to find the distribution of functions of Random Variables

Example
Suppose that X has a normal distribution with mean \mu and standard deviation \sigma.
Find the distribution of Y = aX + b.
Solution:
m_X(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}
m_{aX+b}(t) = e^{bt}\, m_X(at) = e^{bt}\, e^{\mu(at) + \frac{\sigma^2 (at)^2}{2}} = e^{(a\mu + b)t + \frac{a^2\sigma^2 t^2}{2}}
= the moment generating function of the normal distribution with mean a\mu + b and variance a^2\sigma^2.
Thus Y = aX + b has a normal distribution with mean a\mu + b and variance a^2\sigma^2.

Special case: the z transformation
Z = \frac{X - \mu}{\sigma} = \frac{1}{\sigma}X - \frac{\mu}{\sigma} = aX + b
\mu_Z = a\mu + b = \frac{1}{\sigma}\mu - \frac{\mu}{\sigma} = 0
\sigma_Z^2 = a^2\sigma^2 = \left(\frac{1}{\sigma}\right)^2 \sigma^2 = 1
Thus Z has a standard normal distribution.

Example
Suppose that X and Y are independent, each having a normal distribution with means \mu_X and \mu_Y and standard deviations \sigma_X and \sigma_Y.
Find the distribution of S = X + Y.
Solution:
m_X(t) = e^{\mu_X t + \frac{\sigma_X^2 t^2}{2}}, \qquad m_Y(t) = e^{\mu_Y t + \frac{\sigma_Y^2 t^2}{2}}
Now
m_{X+Y}(t) = m_X(t)\, m_Y(t) = e^{\mu_X t + \frac{\sigma_X^2 t^2}{2}}\, e^{\mu_Y t + \frac{\sigma_Y^2 t^2}{2}} = e^{(\mu_X + \mu_Y)t + \frac{(\sigma_X^2 + \sigma_Y^2)t^2}{2}}
= the moment generating function of the normal distribution with mean \mu_X + \mu_Y and variance \sigma_X^2 + \sigma_Y^2.
Thus S = X + Y has a normal distribution with mean \mu_X + \mu_Y and variance \sigma_X^2 + \sigma_Y^2.

Example
Suppose that X and Y are independent, each having a normal distribution with means \mu_X and \mu_Y and standard deviations \sigma_X and \sigma_Y.
Find the distribution of L = aX + bY.
Solution:
m_X(t) = e^{\mu_X t + \frac{\sigma_X^2 t^2}{2}}, \qquad m_Y(t) = e^{\mu_Y t + \frac{\sigma_Y^2 t^2}{2}}
Now
m_{aX+bY}(t) = m_{aX}(t)\, m_{bY}(t) = m_X(at)\, m_Y(bt) = e^{\mu_X(at) + \frac{\sigma_X^2 (at)^2}{2}}\, e^{\mu_Y(bt) + \frac{\sigma_Y^2 (bt)^2}{2}} = e^{(a\mu_X + b\mu_Y)t + \frac{(a^2\sigma_X^2 + b^2\sigma_Y^2)t^2}{2}}
= the moment generating function of the normal distribution with mean a\mu_X + b\mu_Y and variance a^2\sigma_X^2 + b^2\sigma_Y^2.
Thus L = aX + bY has a normal distribution with mean a\mu_X + b\mu_Y and variance a^2\sigma_X^2 + b^2\sigma_Y^2.

Special case: a = +1 and b = -1.
Thus D = X - Y has a normal distribution with mean \mu_X - \mu_Y and variance
(+1)^2\sigma_X^2 + (-1)^2\sigma_Y^2 = \sigma_X^2 + \sigma_Y^2
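A simulation sketch of this result (not from the original notes); all parameter values are arbitrary choices.

```python
import math
import random
import statistics

# L = a*X + b*Y for independent normals should be normal with
# mean a*muX + b*muY and variance a^2*sdX^2 + b^2*sdY^2.
random.seed(6)
a, b = 3.0, -2.0
mu_x, sd_x, mu_y, sd_y = 1.0, 2.0, -0.5, 1.5
l = [a * random.gauss(mu_x, sd_x) + b * random.gauss(mu_y, sd_y)
     for _ in range(200_000)]

mean_l = statistics.fmean(l)
sd_l = statistics.pstdev(l)
expected_sd = math.sqrt(a ** 2 * sd_x ** 2 + b ** 2 * sd_y ** 2)
```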

Example (extension to n independent RVs)

Suppose that X_1, X_2, \dots, X_n are independent, each having a normal distribution with mean \mu_i and standard deviation \sigma_i (for i = 1, 2, \dots, n).
Find the distribution of L = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n.

Solution:
m_{X_i}(t) = e^{\mu_i t + \frac{\sigma_i^2 t^2}{2}} \quad (for i = 1, 2, \dots, n)
Now
m_{a_1 X_1 + \cdots + a_n X_n}(t) = m_{a_1 X_1}(t) \cdots m_{a_n X_n}(t) = m_{X_1}(a_1 t) \cdots m_{X_n}(a_n t)
= e^{\mu_1(a_1 t) + \frac{\sigma_1^2 (a_1 t)^2}{2}} \cdots e^{\mu_n(a_n t) + \frac{\sigma_n^2 (a_n t)^2}{2}}
= e^{(a_1\mu_1 + \cdots + a_n\mu_n)t + \frac{(a_1^2\sigma_1^2 + \cdots + a_n^2\sigma_n^2)t^2}{2}}
= the moment generating function of the normal distribution with mean a_1\mu_1 + \cdots + a_n\mu_n and variance a_1^2\sigma_1^2 + \cdots + a_n^2\sigma_n^2.
Thus L = a_1 X_1 + \cdots + a_n X_n has a normal distribution with mean a_1\mu_1 + \cdots + a_n\mu_n and variance a_1^2\sigma_1^2 + \cdots + a_n^2\sigma_n^2.

Special case:
a_1 = a_2 = \cdots = a_n = \frac{1}{n}, \quad \mu_1 = \mu_2 = \cdots = \mu_n = \mu, \quad \sigma_1^2 = \sigma_2^2 = \cdots = \sigma_n^2 = \sigma^2
In this case x_1, x_2, \dots, x_n is a sample from a normal distribution with mean \mu and standard deviation \sigma, and
L = \frac{1}{n}\left(x_1 + x_2 + \cdots + x_n\right) = \bar{x}, the sample mean.
Thus
\bar{x} = a_1 x_1 + \cdots + a_n x_n = \frac{1}{n}x_1 + \cdots + \frac{1}{n}x_n
has a normal distribution with mean
\mu_{\bar{x}} = a_1\mu_1 + \cdots + a_n\mu_n = \frac{1}{n}\mu + \cdots + \frac{1}{n}\mu = \mu
and variance
\sigma_{\bar{x}}^2 = a_1^2\sigma_1^2 + \cdots + a_n^2\sigma_n^2 = \left(\frac{1}{n}\right)^2\sigma^2 + \cdots + \left(\frac{1}{n}\right)^2\sigma^2 = \frac{\sigma^2}{n}

Summary
If x_1, x_2, \dots, x_n is a sample from a normal distribution with mean \mu and standard deviation \sigma, then \bar{x}, the sample mean, has a normal distribution with mean \mu_{\bar{x}} = \mu, variance \sigma_{\bar{x}}^2 = \frac{\sigma^2}{n}, and standard deviation \sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}.
[Figure: the population distribution and the sampling distribution of \bar{x}.]
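A simulation sketch of the sampling distribution of the mean (not from the original notes); \mu = 10, \sigma = 3, n = 25 are arbitrary, giving \sigma_{\bar{x}} = 3/5 = 0.6.

```python
import random
import statistics

# The mean of n normal draws should itself be normal with mean mu
# and standard deviation sigma / sqrt(n).
random.seed(7)
mu, sigma, n = 10.0, 3.0, 25
means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(50_000)]

mean_of_means = statistics.fmean(means)
sd_of_means = statistics.pstdev(means)  # should approach 0.6
```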

The Law of Large Numbers

Suppose x_1, x_2, \dots, x_n is a sample (independent, identically distributed — i.i.d.) from a distribution with mean \mu.
Let \bar{x} denote the sample mean. Then
P\!\left[\left|\bar{x} - \mu\right| \le \varepsilon\right] \to 1 as n \to \infty, for all \varepsilon > 0.
Proof: previously we used Tchebychev's theorem; that argument assumes the variance \sigma^2 is finite.

Proof (using moment generating functions)

We will use the following fact:
Let m_1(t), m_2(t), \dots denote a sequence of moment generating functions corresponding to the sequence of distribution functions F_1(x), F_2(x), \dots. Let m(t) be a moment generating function corresponding to the distribution function F(x). If
\lim_{i \to \infty} m_i(t) = m(t) for all t in an interval about 0,
then
\lim_{i \to \infty} F_i(x) = F(x) for all x.

Let x_1, x_2, \dots denote a sequence of independent random variables coming from a distribution with moment generating function m(t) and distribution function F(x).
Let S_n = x_1 + x_2 + \cdots + x_n; then
m_{S_n}(t) = m_{x_1 + x_2 + \cdots + x_n}(t) = m_{x_1}(t)\, m_{x_2}(t) \cdots m_{x_n}(t) = \left[m(t)\right]^n
Now \bar{x} = \frac{x_1 + x_2 + \cdots + x_n}{n} = \frac{S_n}{n}, so
m_{\bar{x}}(t) = m_{\frac{1}{n}S_n}(t) = m_{S_n}\!\left(\frac{t}{n}\right) = \left[m\!\left(\frac{t}{n}\right)\right]^n

Now
\ln m_{\bar{x}}(t) = n \ln m\!\left(\frac{t}{n}\right) = t\, \frac{\ln m(u)}{u}, \quad where\ u = \frac{t}{n}.
Thus
\lim_{n \to \infty} \ln m_{\bar{x}}(t) = t \lim_{u \to 0} \frac{\ln m(u)}{u} = t \lim_{u \to 0} \frac{m'(u)/m(u)}{1} = t\, \frac{m'(0)}{m(0)} = \mu t
using L'Hopital's rule, since m(0) = 1 and m'(0) = \mu.
Thus \lim_{n \to \infty} m_{\bar{x}}(t) = m(t) = e^{\mu t}.
m(t) = e^{\mu t} is the moment generating function of a random variable that takes on the value \mu with probability 1, i.e.
p(x) = \begin{cases} 1 & x = \mu \\ 0 & x \ne \mu \end{cases}
with distribution function
F(x) = \begin{cases} 0 & x < \mu \\ 1 & x \ge \mu \end{cases}
and \lim_{n \to \infty} F_{\bar{x}}(x) = F(x) for all x at which F is continuous.
Now
P\!\left[\left|\bar{x} - \mu\right| \le \varepsilon\right] = P\!\left[\mu - \varepsilon \le \bar{x} \le \mu + \varepsilon\right] = F_{\bar{x}}(\mu + \varepsilon) - F_{\bar{x}}(\mu - \varepsilon) \to F(\mu + \varepsilon) - F(\mu - \varepsilon) = 1 - 0 = 1 as n \to \infty
since F(\mu + \varepsilon) = 1 and F(\mu - \varepsilon) = 0 for \varepsilon > 0.
Q.E.D.
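A simulation sketch of the law of large numbers (not from the original notes), using a skewed exponential population: the fraction of sample means falling within \varepsilon of \mu = 1/\lambda grows toward 1 as n increases. All numeric choices below are arbitrary.

```python
import random
import statistics

# Fraction of sample means within eps of the true mean 1/lam,
# for small and large sample sizes n.
random.seed(8)
lam, eps = 1.0, 0.1

def coverage(n, reps=2_000):
    hits = 0
    for _ in range(reps):
        xbar = statistics.fmean(random.expovariate(lam) for _ in range(n))
        hits += abs(xbar - 1.0 / lam) < eps
    return hits / reps

small_n = coverage(10)
large_n = coverage(1_000)
```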

The Central Limit theorem

If x_1, x_2, \dots, x_n is a sample from a distribution with mean \mu and standard deviation \sigma, then if n is large, \bar{x}, the sample mean, has approximately a normal distribution with mean \mu_{\bar{x}} = \mu, variance \sigma_{\bar{x}}^2 = \frac{\sigma^2}{n}, and standard deviation \sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}.

Proof (using moment generating functions)

We will again use the fact that if a sequence of moment generating functions m_1(t), m_2(t), \dots, corresponding to distribution functions F_1(x), F_2(x), \dots, satisfies
\lim_{i \to \infty} m_i(t) = m(t) for all t in an interval about 0,
where m(t) corresponds to the distribution function F(x), then
\lim_{i \to \infty} F_i(x) = F(x) for all x.
Let x_1, x_2, \dots denote a sequence of independent random variables from a distribution with moment generating function m(t) and distribution function F(x).
Let S_n = x_1 + x_2 + \cdots + x_n; then
m_{S_n}(t) = m_{x_1}(t)\, m_{x_2}(t) \cdots m_{x_n}(t) = \left[m(t)\right]^n
and since \bar{x} = \frac{S_n}{n},
m_{\bar{x}}(t) = m_{S_n}\!\left(\frac{t}{n}\right) = \left[m\!\left(\frac{t}{n}\right)\right]^n

Let
z = \frac{\bar{x} - \mu}{\sigma/\sqrt{n}} = \frac{\sqrt{n}\,(\bar{x} - \mu)}{\sigma}
Then
m_z(t) = e^{-\frac{\sqrt{n}\,\mu t}{\sigma}}\, m_{\bar{x}}\!\left(\frac{\sqrt{n}\, t}{\sigma}\right) = e^{-\frac{\sqrt{n}\,\mu t}{\sigma}} \left[m\!\left(\frac{t}{\sqrt{n}\,\sigma}\right)\right]^n
and
\ln m_z(t) = -\frac{\sqrt{n}\,\mu t}{\sigma} + n \ln m\!\left(\frac{t}{\sqrt{n}\,\sigma}\right)
Let u = \frac{t}{\sqrt{n}\,\sigma}, so that \sqrt{n} = \frac{t}{u\sigma} and n = \frac{t^2}{u^2\sigma^2}. Then
\ln m_z(t) = -\frac{\mu t^2}{u\sigma^2} + \frac{t^2}{u^2\sigma^2} \ln m(u) = \frac{t^2}{\sigma^2}\, \frac{\ln m(u) - \mu u}{u^2}

Now
\lim_{n \to \infty} \ln m_z(t) = \frac{t^2}{\sigma^2} \lim_{u \to 0} \frac{\ln m(u) - \mu u}{u^2}
= \frac{t^2}{\sigma^2} \lim_{u \to 0} \frac{\frac{m'(u)}{m(u)} - \mu}{2u} \quad (using L'Hopital's rule)
= \frac{t^2}{\sigma^2} \lim_{u \to 0} \frac{m'(u) - \mu\, m(u)}{2u\, m(u)}
= \frac{t^2}{\sigma^2} \lim_{u \to 0} \frac{m''(u) - \mu\, m'(u)}{2m(u) + 2u\, m'(u)} \quad (using L'Hopital's rule again)
= \frac{t^2}{\sigma^2}\, \frac{m''(0) - \mu\, m'(0)}{2}
= \frac{t^2}{\sigma^2}\, \frac{E[x_i^2] - \mu^2}{2} = \frac{t^2}{\sigma^2} \cdot \frac{\sigma^2}{2} = \frac{t^2}{2}

thus
\lim_{n \to \infty} \ln m_z(t) = \frac{t^2}{2} \quad and \quad \lim_{n \to \infty} m_z(t) = e^{t^2/2}
Now m(t) = e^{t^2/2} is the moment generating function of the standard normal distribution.
Thus the limiting distribution of z is the standard normal distribution, i.e.
\lim_{n \to \infty} F_z(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-u^2/2}\, du
Q.E.D.

The Central Limit theorem illustrated
[Figure: sampling distributions of \bar{x} for increasing n, approaching the normal curve.]


The Central Limit theorem illustrated

If x_1, x_2 are independent observations from the uniform distribution from 0 to 1, find the distribution of \bar{x}, the sample mean.
Let
S = x_1 + x_2 \quad and \quad \bar{x} = \frac{S}{2} = \frac{x_1 + x_2}{2}
Now
G(s) = P[S \le s] = P[x_1 + x_2 \le s] = \begin{cases} 0 & s \le 0 \\ \frac{s^2}{2} & 0 \le s \le 1 \\ 1 - \frac{(2 - s)^2}{2} & 1 \le s \le 2 \\ 1 & s \ge 2 \end{cases}
g(s) = G'(s) = \begin{cases} s & 0 \le s \le 1 \\ 2 - s & 1 \le s \le 2 \\ 0 & \text{otherwise} \end{cases}
Now \bar{x} = \frac{S}{2} = aS with a = \frac{1}{2}, so S = 2\bar{x}. The density of \bar{x} is:
h(\bar{x}) = g(S)\left|\frac{dS}{d\bar{x}}\right| = g(2\bar{x}) \cdot 2 = \begin{cases} 4\bar{x} & 0 \le \bar{x} \le \tfrac{1}{2} \\ 4(1 - \bar{x}) & \tfrac{1}{2} \le \bar{x} \le 1 \\ 0 & \text{otherwise} \end{cases}
[Figure: densities of \bar{x} for samples from the uniform distribution, n = 1, 2, 3.]
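A histogram sketch of this illustration (not from the original notes): already at n = 2 the flat uniform shape becomes the triangular density peaked near 0.5. Bin count, repetition count, and seed are arbitrary.

```python
import random

# Histogram counts for the mean of n Uniform(0, 1) draws.
random.seed(9)

def mean_counts(n, bins=10, reps=100_000):
    counts = [0] * bins
    for _ in range(reps):
        xbar = sum(random.random() for _ in range(n)) / n
        counts[min(int(xbar * bins), bins - 1)] += 1
    return counts

flat = mean_counts(1)     # roughly equal counts in every bin
peaked = mean_counts(2)   # counts pile up in the middle bins
```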

Distributions of functions of Random Variables
Gamma distribution, \chi^2 distribution, Exponential distribution

Theorem
Let X and Y denote independent random variables, each having a gamma distribution with parameters (\lambda, \alpha_1) and (\lambda, \alpha_2) respectively. Then W = X + Y has a gamma distribution with parameters (\lambda, \alpha_1 + \alpha_2).
Proof:
m_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1} \quad and \quad m_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_2}
Therefore
m_{X+Y}(t) = m_X(t)\, m_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1 + \alpha_2}
Recognizing that this is the moment generating function of the gamma distribution with parameters (\lambda, \alpha_1 + \alpha_2), we conclude that W = X + Y has a gamma distribution with parameters (\lambda, \alpha_1 + \alpha_2).
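The MGF identity at the heart of this proof can be checked numerically. A sketch (not from the original notes): for two independent Exponential(\lambda) variables (gamma with \alpha = 1), the sample average of e^{tW} should match (\lambda/(\lambda - t))^2; the values \lambda = 2 and t = 0.5 are arbitrary, subject to t < \lambda.

```python
import math
import random
import statistics

# Estimate E[e^{t(X+Y)}] by simulation and compare with (lam/(lam - t))^2.
random.seed(10)
lam, t = 2.0, 0.5  # need t < lam for the MGF to exist
w = [random.expovariate(lam) + random.expovariate(lam) for _ in range(200_000)]

mgf_estimate = statistics.fmean(math.exp(t * wi) for wi in w)
mgf_exact = (lam / (lam - t)) ** 2
```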

Theorem (extension to n RVs)

Let x_1, x_2, \dots, x_n denote n independent random variables, each having a gamma distribution with parameters (\lambda, \alpha_i), i = 1, 2, \dots, n.
Then W = x_1 + x_2 + \cdots + x_n has a gamma distribution with parameters (\lambda, \alpha_1 + \alpha_2 + \cdots + \alpha_n).
Proof:
m_{x_i}(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_i} \quad (i = 1, 2, \dots, n)
Therefore
m_{x_1 + x_2 + \cdots + x_n}(t) = m_{x_1}(t)\, m_{x_2}(t) \cdots m_{x_n}(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1 + \alpha_2 + \cdots + \alpha_n}
Recognizing that this is the moment generating function of the gamma distribution with parameters (\lambda, \alpha_1 + \cdots + \alpha_n), we conclude that W = x_1 + x_2 + \cdots + x_n has a gamma distribution with those parameters.

Theorem
Suppose that X is a random variable having a gamma distribution with parameters (\lambda, \alpha).
Then W = aX has a gamma distribution with parameters (\lambda/a, \alpha).
Proof:
m_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}, \quad so \quad m_{aX}(t) = m_X(at) = \left(\frac{\lambda}{\lambda - at}\right)^{\alpha} = \left(\frac{\lambda/a}{\lambda/a - t}\right)^{\alpha}

Special Cases
1. Let X and Y be independent random variables having an exponential distribution with parameter \lambda; then X + Y has a gamma distribution with \alpha = 2 and \lambda.
2. Let x_1, x_2, \dots, x_n be independent random variables having an exponential distribution with parameter \lambda; then S = x_1 + x_2 + \cdots + x_n has a gamma distribution with \alpha = n and \lambda.
3. Let x_1, x_2, \dots, x_n be independent random variables having an exponential distribution with parameter \lambda; then
\bar{x} = \frac{x_1 + \cdots + x_n}{n}
has a gamma distribution with \alpha = n and parameter n\lambda.
[Figure: the distribution of \bar{x} when the population is exponential — another illustration of the central limit theorem.]

Special Cases — continued

4. Let X and Y be independent random variables having \chi^2 distributions with \nu_1 and \nu_2 degrees of freedom respectively; then X + Y has a \chi^2 distribution with \nu_1 + \nu_2 degrees of freedom.
5. Let x_1, x_2, \dots, x_n be independent random variables having \chi^2 distributions with \nu_1, \nu_2, \dots, \nu_n degrees of freedom respectively; then x_1 + x_2 + \cdots + x_n has a \chi^2 distribution with \nu_1 + \cdots + \nu_n degrees of freedom.
Both of these properties follow from the fact that a \chi^2 random variable with \nu degrees of freedom is a gamma random variable with \lambda = \tfrac{1}{2} and \alpha = \nu/2.

Recall
If z has a standard normal distribution, then z^2 has a \chi^2 distribution with 1 degree of freedom.
Thus if z_1, z_2, \dots, z_\nu are independent random variables, each having a standard normal distribution, then
U = z_1^2 + z_2^2 + \cdots + z_\nu^2
has a \chi^2 distribution with \nu degrees of freedom.
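A simulation sketch of this fact (not from the original notes): the \chi^2(\nu) distribution has mean \nu and variance 2\nu; \nu = 4 is an arbitrary choice.

```python
import random
import statistics

# Sum of nu squared standard normals should have chi-square(nu) moments:
# mean nu and variance 2*nu.
random.seed(11)
nu = 4
u = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(nu)) for _ in range(100_000)]

mean_u = statistics.fmean(u)
var_u = statistics.pvariance(u)
```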

Theorem
Suppose that U_1 and U_2 are independent random variables and that U = U_1 + U_2. Suppose that U_1 and U have \chi^2 distributions with \nu_1 and \nu degrees of freedom respectively (\nu_1 < \nu).
Then U_2 has a \chi^2 distribution with \nu_2 = \nu - \nu_1 degrees of freedom.
Proof:
m_{U_1}(t) = \left(\frac{1}{1 - 2t}\right)^{\nu_1/2} \quad and \quad m_U(t) = \left(\frac{1}{1 - 2t}\right)^{\nu/2}
Also m_U(t) = m_{U_1}(t)\, m_{U_2}(t). Hence
m_{U_2}(t) = \frac{m_U(t)}{m_{U_1}(t)} = \frac{(1 - 2t)^{-\nu/2}}{(1 - 2t)^{-\nu_1/2}} = \left(\frac{1}{1 - 2t}\right)^{\frac{\nu - \nu_1}{2}}
Q.E.D.

Tables for the Standard Normal distribution
