
18.440: Lecture 23
Sums of independent random variables
Scott Sheffield
MIT

Summing two random variables

Say we have independent random variables X and Y and we know their density functions $f_X$ and $f_Y$.

Now let's try to find $F_{X+Y}(a) = P\{X + Y \le a\}$.

This is the integral over $\{(x,y) : x + y \le a\}$ of $f(x,y) = f_X(x) f_Y(y)$. Thus,
$$P\{X + Y \le a\} = \int_{-\infty}^{\infty} \int_{-\infty}^{a-y} f_X(x) f_Y(y)\, dx\, dy = \int_{-\infty}^{\infty} F_X(a-y) f_Y(y)\, dy.$$

Differentiating both sides gives
$$f_{X+Y}(a) = \frac{d}{da} \int_{-\infty}^{\infty} F_X(a-y) f_Y(y)\, dy = \int_{-\infty}^{\infty} f_X(a-y) f_Y(y)\, dy.$$

The latter formula makes some intuitive sense: we're integrating over the set of $(x,y)$ pairs that add up to $a$.
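The convolution formula can be checked numerically. A minimal sketch (not from the lecture; the function names and the choice of test densities are ours), using two standard normal densities, whose sum should be normal with variance 2:

```python
import math

# Numerical sketch of the convolution formula
# f_{X+Y}(a) = integral of f_X(a - y) f_Y(y) dy,
# using two standard normal densities as a test case (our choice).

def normal_pdf(x, sigma=1.0):
    """Density of a mean-zero normal with standard deviation sigma."""
    return math.exp(-x * x / (2 * sigma * sigma)) / (math.sqrt(2 * math.pi) * sigma)

def convolve_densities(f_x, f_y, a, lo=-10.0, hi=10.0, n=20000):
    """Approximate the integral of f_x(a - y) f_y(y) dy with a midpoint Riemann sum."""
    dy = (hi - lo) / n
    total = 0.0
    for i in range(n):
        y = lo + (i + 0.5) * dy
        total += f_x(a - y) * f_y(y)
    return total * dy

# The sum of two independent standard normals is normal with variance 2,
# so the convolution at a = 0 should match normal_pdf(0, sigma=sqrt(2)).
approx = convolve_densities(normal_pdf, normal_pdf, 0.0)
exact = normal_pdf(0.0, sigma=math.sqrt(2.0))
print(abs(approx - exact) < 1e-5)
```

The integration limits and grid size are arbitrary; the normal tails beyond $\pm 10$ are negligible at this precision.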
Independent identically distributed (i.i.d.)

The abbreviation i.i.d. means independent identically distributed.

Ranks among the most beautiful, powerful and profoundly significant abbreviations in all of probability theory.

Worth remembering.
Summing i.i.d. uniform random variables

Suppose that X and Y are i.i.d. and uniform on [0,1], so $f_X = f_Y = 1$ on [0,1].

What is the probability density function of X + Y?

$f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a-y) f_Y(y)\, dy = \int_0^1 f_X(a-y)\, dy$, which is the length of $[0,1] \cap [a-1, a]$.

That's $a$ when $a \in [0,1]$, $2 - a$ when $a \in [1,2]$, and 0 otherwise.
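A Monte Carlo sanity check of this triangle density is easy to write. This is a minimal sketch (not from the lecture; the sample size and test points are arbitrary choices), comparing the empirical CDF of X + Y with the CDF obtained by integrating the triangle density $f(a) = a$ on [0,1] and $2 - a$ on [1,2]:

```python
import random

# Monte Carlo check that X + Y for i.i.d. uniform [0,1] variables
# has the triangle density f(a) = a on [0,1], 2 - a on [1,2].

random.seed(0)
n = 200_000
samples = [random.random() + random.random() for _ in range(n)]

def triangle_cdf(a):
    """CDF of the triangle density: integrate a on [0,1], 2 - a on [1,2]."""
    if a <= 0:
        return 0.0
    if a <= 1:
        return a * a / 2
    if a <= 2:
        return 1 - (2 - a) ** 2 / 2
    return 1.0

# Compare the empirical CDF with the analytic one at a few points.
for a in (0.5, 1.0, 1.5):
    empirical = sum(s <= a for s in samples) / n
    print(a, abs(empirical - triangle_cdf(a)) < 0.01)
```

With 200,000 samples the Monte Carlo error is on the order of 0.001, well inside the 0.01 tolerance.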

Review: summing i.i.d. geometric random variables

A geometric random variable X with parameter p has $P\{X = k\} = (1-p)^{k-1} p$ for $k \ge 1$.

Sum Z of n independent copies of X?

We can interpret Z as the time slot where the nth head occurs in an i.i.d. sequence of p-coin tosses.

So Z is negative binomial (n, p). So
$$P\{Z = k\} = \binom{k-1}{n-1} p^{n-1} (1-p)^{k-n} p.$$
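The coin-toss interpretation can be checked by simulation. A minimal sketch (not from the lecture; the parameter values and sampler are our choices), summing n geometric samples and comparing the empirical distribution with the negative binomial formula:

```python
import random
from math import comb

# Simulate Z as a sum of n i.i.d. geometric(p) variables and compare
# P{Z = k} with the negative binomial pmf C(k-1, n-1) p^(n-1) (1-p)^(k-n) p.

random.seed(1)
n, p, trials = 3, 0.4, 100_000

def geometric(p):
    """Number of p-coin tosses up to and including the first head."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

counts = {}
for _ in range(trials):
    z = sum(geometric(p) for _ in range(n))
    counts[z] = counts.get(z, 0) + 1

def neg_binom_pmf(k, n, p):
    return comb(k - 1, n - 1) * p ** n * (1 - p) ** (k - n)

for k in (3, 5, 8):
    print(k, abs(counts.get(k, 0) / trials - neg_binom_pmf(k, n, p)) < 0.01)
```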

Summing i.i.d. exponential random variables

SupposeX
1
, . . .X
n
arei.i.d. exponentialrandomvariables with
parameter . So f
X
i
(x) =e
x
on [0,) for all 1i n.

What is the law of Z =


n
i =1
X
i
?

We claimed in an earlier lecture that this was a gamma


distribution with parameters (,n).
e
y
(y)
n1

So f
Z
(y) =
(n)
.

We argued this point by taking limits of negative binomial


distributions. Can we check it directly?

By induction, would suce to show that a gamma (,1) plus


an independent gamma (,n) is a gamma (,n+1).
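Before doing that calculation, a quick simulation sketch (not from the lecture; parameters are arbitrary) can check the claim against the gamma moments: a gamma $(\lambda, n)$ variable has mean $n/\lambda$ and variance $n/\lambda^2$.

```python
import random
import math

# Sample Z = X_1 + ... + X_n for i.i.d. exponential(lam) variables
# (via inverse transform) and check the gamma(lam, n) mean and variance.

random.seed(2)
lam, n, trials = 2.0, 5, 100_000
zs = [sum(-math.log(random.random()) / lam for _ in range(n)) for _ in range(trials)]

mean = sum(zs) / trials
var = sum((z - mean) ** 2 for z in zs) / trials
print(abs(mean - n / lam) < 0.02)       # gamma mean is n/lam
print(abs(var - n / lam ** 2) < 0.05)   # gamma variance is n/lam^2
```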


Summing independent gamma random variables

Say X is gamma $(\lambda, s)$, Y is gamma $(\lambda, t)$, and X and Y are independent.

Intuitively, X is the amount of time till we see s events, and Y is the amount of subsequent time till we see t more events.

So $f_X(x) = \dfrac{\lambda e^{-\lambda x} (\lambda x)^{s-1}}{\Gamma(s)}$ and $f_Y(y) = \dfrac{\lambda e^{-\lambda y} (\lambda y)^{t-1}}{\Gamma(t)}$.

Now $f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a-y) f_Y(y)\, dy$.

Up to an a-independent multiplicative constant, this is
$$\int_0^a e^{-\lambda(a-y)} (a-y)^{s-1} e^{-\lambda y} y^{t-1}\, dy = e^{-\lambda a} \int_0^a (a-y)^{s-1} y^{t-1}\, dy.$$

Letting $x = y/a$, this becomes
$$e^{-\lambda a} a^{s+t-1} \int_0^1 (1-x)^{s-1} x^{t-1}\, dx.$$

This is (up to a multiplicative constant) $e^{-\lambda a} a^{s+t-1}$. The constant must be such that the integral from 0 to $\infty$ is 1. Conclude that X + Y is gamma $(\lambda, s+t)$.
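The identity can also be verified numerically: convolving the gamma $(\lambda, s)$ and gamma $(\lambda, t)$ densities should reproduce the gamma $(\lambda, s+t)$ density pointwise. A minimal sketch (not from the lecture; the parameter values are arbitrary):

```python
import math

# Numerical check that convolving gamma(lam, s) with gamma(lam, t)
# gives the gamma(lam, s + t) density.

def gamma_pdf(x, lam, s):
    """Density lam * exp(-lam x) * (lam x)^(s-1) / Gamma(s) on [0, inf)."""
    if x <= 0:
        return 0.0
    return lam * math.exp(-lam * x) * (lam * x) ** (s - 1) / math.gamma(s)

def convolve_at(a, lam, s, t, n=20000):
    """Midpoint Riemann sum for the integral over [0, a] of f_X(a - y) f_Y(y) dy."""
    dy = a / n
    total = 0.0
    for i in range(n):
        y = (i + 0.5) * dy
        total += gamma_pdf(a - y, lam, s) * gamma_pdf(y, lam, t)
    return total * dy

lam, s, t = 1.5, 2.0, 3.0
for a in (1.0, 2.0, 4.0):
    print(a, abs(convolve_at(a, lam, s, t) - gamma_pdf(a, lam, s + t)) < 1e-4)
```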


Summing two normal variables

X is normal with mean zero, variance $\sigma_1^2$; Y is normal with mean zero, variance $\sigma_2^2$.

$f_X(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma_1} e^{-\frac{x^2}{2\sigma_1^2}}$ and $f_Y(y) = \dfrac{1}{\sqrt{2\pi}\,\sigma_2} e^{-\frac{y^2}{2\sigma_2^2}}$.

We just need to compute $f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a-y) f_Y(y)\, dy$.

We could compute this directly.

Or we could try to argue with a picture that this is an a-independent constant times $e^{-\frac{a^2}{2(\sigma_1^2 + \sigma_2^2)}}$.

Or use the fact that if $A_i \in \{-1, 1\}$ are i.i.d. coin tosses then $\sqrt{\sigma^2/N} \sum_{i=1}^N A_i$ is approximately normal with variance $\sigma^2$ when N is large.

Generally: if independent random variables $X_j$ are normal $(\mu_j, \sigma_j^2)$ then $\sum_{j=1}^n X_j$ is normal $(\sum_{j=1}^n \mu_j, \sum_{j=1}^n \sigma_j^2)$.

Variances and expectations are both additive.
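The additivity of means and variances is easy to see in simulation. A minimal sketch (not from the lecture; the means and standard deviations are arbitrary choices):

```python
import random

# Simulate X_1 + X_2 for independent normals with means mu_j and
# standard deviations s_j, and check that means and variances add.

random.seed(3)
mu1, s1, mu2, s2 = 1.0, 2.0, -0.5, 1.5   # s_j are standard deviations
trials = 200_000
zs = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(trials)]

mean = sum(zs) / trials
var = sum((z - mean) ** 2 for z in zs) / trials
print(abs(mean - (mu1 + mu2)) < 0.03)        # means add
print(abs(var - (s1 ** 2 + s2 ** 2)) < 0.1)  # variances add
```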


Other sums

Sum of an independent binomial (m, p) and binomial (n, p)?

Yes, binomial (m + n, p). Can be seen from the coin toss interpretation.

Sum of independent Poisson $\lambda_1$ and Poisson $\lambda_2$?

Yes, Poisson $\lambda_1 + \lambda_2$. Can be seen from the Poisson point process interpretation.
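The Poisson case can also be checked by simulation. A minimal sketch (not from the lecture; the sampler and parameters are our choices), comparing the empirical distribution of a sum of two Poisson samples with the Poisson $\lambda_1 + \lambda_2$ pmf:

```python
import random
import math

# Simulate Poisson(lam1) + Poisson(lam2) and compare P{Z = k}
# with the Poisson(lam1 + lam2) pmf.

random.seed(4)

def poisson(lam):
    """Knuth-style Poisson sampler (fine for small lam)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while p > l:
        k += 1
        p *= random.random()
    return k - 1

lam1, lam2, trials = 1.5, 2.5, 100_000
counts = {}
for _ in range(trials):
    z = poisson(lam1) + poisson(lam2)
    counts[z] = counts.get(z, 0) + 1

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

for k in (2, 4, 6):
    print(k, abs(counts.get(k, 0) / trials - poisson_pmf(k, lam1 + lam2)) < 0.01)
```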
MIT OpenCourseWare
http://ocw.mit.edu
18.440 Probability and Random Variables
Spring 2011
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
