
Sums of independent random variables


This lecture discusses how to derive the distribution of the sum of two independent random variables. We explain first
how to derive the distribution function of the sum and then how to derive its probability mass function (if the summands
are discrete) or its probability density function (if the summands are continuous).

Distribution function of a sum


The following proposition characterizes the distribution function of the sum in terms of the distribution functions of the
two summands:
Proposition Let $X$ and $Y$ be two independent random variables and denote by $F_X(x)$ and $F_Y(y)$ their distribution functions. Let
$$Z = X + Y$$
and denote the distribution function of $Z$ by $F_Z(z)$. The following holds:
$$F_Z(z) = \operatorname{E}\left[ F_X(z - Y) \right]$$
or:
$$F_Z(z) = \operatorname{E}\left[ F_Y(z - X) \right]$$
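The first equality can be understood as follows (a brief sketch rather than a rigorous proof): since $X$ and $Y$ are independent, conditioning on $Y$ gives
$$F_Z(z) = \operatorname{P}(X + Y \le z) = \operatorname{E}\left[ \operatorname{P}(X \le z - Y \mid Y) \right] = \operatorname{E}\left[ F_X(z - Y) \right]$$
and the second equality follows by exchanging the roles of $X$ and $Y$.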

Example Let $X$ be a uniform random variable with support $R_X = [0,1]$ and probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
and let $Y$ be another uniform random variable, independent of $X$, with support $R_Y = [0,1]$ and probability density function
$$f_Y(y) = \begin{cases} 1 & \text{if } y \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
The distribution function of $X$ is
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ x & \text{if } 0 \le x \le 1 \\ 1 & \text{if } x > 1 \end{cases}$$
The distribution function of $Z = X + Y$ is
$$F_Z(z) = \operatorname{E}\left[ F_X(z - Y) \right] = \int_0^1 F_X(z - y)\, dy$$
There are four cases to consider:

1. If $z < 0$, then $z - y < 0$ for every $y \in [0,1]$ and
$$F_Z(z) = \int_0^1 0\, dy = 0$$

2. If $0 \le z < 1$, then
$$F_Z(z) = \int_0^z (z - y)\, dy + \int_z^1 0\, dy = \frac{1}{2}z^2$$

3. If $1 \le z < 2$, then
$$F_Z(z) = \int_0^{z-1} 1\, dy + \int_{z-1}^1 (z - y)\, dy = (z - 1) + \frac{1}{2} - \frac{1}{2}(z-1)^2 = 2z - 1 - \frac{1}{2}z^2$$

4. If $z \ge 2$, then $z - y \ge 1$ for every $y \in [0,1]$ and
$$F_Z(z) = \int_0^1 1\, dy = 1$$

Combining these four possible cases, we obtain:
$$F_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ \frac{1}{2}z^2 & \text{if } 0 \le z < 1 \\ 2z - 1 - \frac{1}{2}z^2 & \text{if } 1 \le z < 2 \\ 1 & \text{if } z \ge 2 \end{cases}$$
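The identity $F_Z(z) = \operatorname{E}\left[ F_X(z - Y) \right]$ and the closed form above can also be compared numerically. The following is a minimal Python sketch, assuming (as in the example) two independent uniform summands on $[0,1]$; the sample size and evaluation points are arbitrary choices for illustration.

import random

def F_X(x):
    # Distribution function of a uniform random variable on [0, 1]
    if x < 0:
        return 0.0
    if x <= 1:
        return x
    return 1.0

def F_Z(z):
    # Closed-form distribution function of Z = X + Y derived above
    if z < 0:
        return 0.0
    if z < 1:
        return 0.5 * z ** 2
    if z < 2:
        return 2 * z - 1 - 0.5 * z ** 2
    return 1.0

random.seed(0)
y_draws = [random.random() for _ in range(200_000)]  # samples of Y ~ U[0, 1]

for z in (0.25, 0.75, 1.25, 1.75):
    # F_Z(z) = E[F_X(z - Y)], approximated by a sample average
    monte_carlo = sum(F_X(z - y) for y in y_draws) / len(y_draws)
    print(z, round(monte_carlo, 3), round(F_Z(z), 3))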

Probability mass function of a sum


When the two summands are discrete random variables, the probability mass function of their sum can be derived as
follows:
Proposition Let $X$ and $Y$ be two independent discrete random variables and denote by $p_X(x)$ and $p_Y(y)$ their respective probability mass functions and by $R_X$ and $R_Y$ their supports. Let
$$Z = X + Y$$
and denote the probability mass function of $Z$ by $p_Z(z)$. The following holds:
$$p_Z(z) = \sum_{y \in R_Y} p_X(z - y)\, p_Y(y)$$
or:
$$p_Z(z) = \sum_{x \in R_X} p_Y(z - x)\, p_X(x)$$

The two summations above are called convolutions (of two probability mass functions).
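The convolution of two probability mass functions is straightforward to compute directly. The following is a minimal Python sketch; the two input pmfs (a fair coin and a biased coin) are made-up inputs chosen only for illustration.

def convolve_pmfs(p_X, p_Y):
    # Probability mass function of Z = X + Y, i.e. the convolution
    # p_Z(z) = sum over y of p_X(z - y) * p_Y(y), with the sum restricted
    # to the (finite) supports of the two pmfs
    p_Z = {}
    for x, p_x in p_X.items():
        for y, p_y in p_Y.items():
            p_Z[x + y] = p_Z.get(x + y, 0.0) + p_x * p_y
    return p_Z

# Hypothetical pmfs, for illustration only
p_X = {0: 0.5, 1: 0.5}    # fair coin
p_Y = {0: 0.25, 1: 0.75}  # biased coin
print(convolve_pmfs(p_X, p_Y))  # {0: 0.125, 1: 0.5, 2: 0.375}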
Example Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and probability mass function
$$p_X(x) = \begin{cases} 1/2 & \text{if } x = 0 \\ 1/2 & \text{if } x = 1 \\ 0 & \text{otherwise} \end{cases}$$
and let $Y$ be another discrete random variable, independent of $X$, with support $R_Y = \{0, 1\}$ and probability mass function
$$p_Y(y) = \begin{cases} 1/2 & \text{if } y = 0 \\ 1/2 & \text{if } y = 1 \\ 0 & \text{otherwise} \end{cases}$$
Define
$$Z = X + Y$$
Its support is:
$$R_Z = \{0, 1, 2\}$$
The probability mass function of $Z$, evaluated at $z = 0$, is:
$$p_Z(0) = \sum_{y \in R_Y} p_X(0 - y)\, p_Y(y) = p_X(0)\, p_Y(0) + p_X(-1)\, p_Y(1) = \frac{1}{2} \cdot \frac{1}{2} + 0 \cdot \frac{1}{2} = \frac{1}{4}$$
Evaluated at $z = 1$, it is:
$$p_Z(1) = p_X(1)\, p_Y(0) + p_X(0)\, p_Y(1) = \frac{1}{2} \cdot \frac{1}{2} + \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{2}$$
Evaluated at $z = 2$, it is:
$$p_Z(2) = p_X(2)\, p_Y(0) + p_X(1)\, p_Y(1) = 0 \cdot \frac{1}{2} + \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$$
Therefore, the probability mass function of $Z$ is:
$$p_Z(z) = \begin{cases} 1/4 & \text{if } z = 0 \\ 1/2 & \text{if } z = 1 \\ 1/4 & \text{if } z = 2 \\ 0 & \text{otherwise} \end{cases}$$

Probability density function of a sum


When the two summands are absolutely continuous random variables, the probability density function of their sum can
be derived as follows:
Proposition Let $X$ and $Y$ be two independent absolutely continuous random variables and denote by $f_X(x)$ and $f_Y(y)$ their respective probability density functions. Let
$$Z = X + Y$$
and denote the probability density function of $Z$ by $f_Z(z)$. The following holds:
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy$$
or:
$$f_Z(z) = \int_{-\infty}^{\infty} f_Y(z - x)\, f_X(x)\, dx$$

The two integrals above are called convolutions (of two probability density functions).
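When the densities have no convenient closed form, the convolution integral can be approximated on a grid. The following is a minimal NumPy sketch; the two standard normal densities are made-up inputs chosen only because the sum of two independent standard normals is normal with variance 2, which gives a known value to compare against.

import numpy as np

dx = 0.01
grid = np.arange(-10.0, 10.0, dx)  # grid on which both densities are sampled

def normal_pdf(t, sigma=1.0):
    return np.exp(-t ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

f_X = normal_pdf(grid)  # density of X, standard normal
f_Y = normal_pdf(grid)  # density of Y, standard normal

# Riemann-sum approximation of f_Z(z) = integral of f_X(z - y) f_Y(y) dy
f_Z = np.convolve(f_X, f_Y) * dx
z_grid = 2 * grid[0] + dx * np.arange(len(f_Z))  # values of z = x + y on the grid

# The sum of two independent standard normals is N(0, 2); compare at z = 0
print(f_Z[np.argmin(np.abs(z_grid))], normal_pdf(0.0, sigma=np.sqrt(2)))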
Example Let $X$ be an exponential random variable with support $R_X = [0, \infty)$ and probability density function
$$f_X(x) = \begin{cases} \lambda \exp(-\lambda x) & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
and let $Y$ be another exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and probability density function
$$f_Y(y) = \begin{cases} \lambda \exp(-\lambda y) & \text{if } y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
Define:
$$Z = X + Y$$
The support of $Z$ is:
$$R_Z = [0, \infty)$$
When $z \in R_Z$, the probability density function of $Z$ is:
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy = \int_0^z \lambda \exp\left(-\lambda (z - y)\right) \lambda \exp(-\lambda y)\, dy = \lambda^2 \exp(-\lambda z) \int_0^z dy = \lambda^2 z \exp(-\lambda z)$$
Therefore, the probability density function of $Z$ is:
$$f_Z(z) = \begin{cases} \lambda^2 z \exp(-\lambda z) & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
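The closed form above can be checked numerically. The following is a minimal Python sketch that evaluates the convolution integral with a simple Riemann sum; the rate $\lambda = 2$ and the evaluation points are arbitrary choices for illustration.

import math

lam = 2.0  # arbitrary rate parameter, for illustration only

def f(t):
    # Density of an exponential random variable with rate lam
    return lam * math.exp(-lam * t) if t >= 0 else 0.0

def f_Z_numeric(z, n=10_000):
    # Riemann-sum approximation of the integral from 0 to z of f(z - y) f(y) dy
    dy = z / n
    return dy * sum(f(z - (i + 0.5) * dy) * f((i + 0.5) * dy) for i in range(n))

for z in (0.5, 1.0, 3.0):
    print(z, round(f_Z_numeric(z), 6), round(lam ** 2 * z * math.exp(-lam * z), 6))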

More details
Sum of n independent random variables
We have discussed above how to derive the distribution of the sum of two independent random variables. How do we derive the distribution of the sum of more than two mutually independent random variables? Suppose $X_1$, ..., $X_n$ are $n$ mutually independent random variables and let $Z$ be their sum:
$$Z = \sum_{i=1}^n X_i$$
The distribution of $Z$ can be derived recursively, using the results for sums of two random variables given above:

1. first, define
$$Y_2 = X_1 + X_2$$
and compute the distribution of $Y_2$;

2. then, define
$$Y_3 = Y_2 + X_3$$
and compute the distribution of $Y_3$;

3. and so on, until the distribution of $Z$ can be computed from
$$Z = Y_{n-1} + X_n$$
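As a minimal computational sketch of this recursion in the discrete case, the pairwise convolution of probability mass functions can simply be folded over the list of summands; the three four-sided dice below are made-up inputs, used only for illustration.

from functools import reduce

def convolve_pmfs(p_X, p_Y):
    # Pairwise convolution of two pmfs given as {value: probability} dictionaries
    p_Z = {}
    for x, p_x in p_X.items():
        for y, p_y in p_Y.items():
            p_Z[x + y] = p_Z.get(x + y, 0.0) + p_x * p_y
    return p_Z

def pmf_of_sum(pmfs):
    # Distribution of X_1 + ... + X_n, obtained recursively:
    # Y_2 = X_1 + X_2, Y_3 = Y_2 + X_3, ..., Z = Y_{n-1} + X_n
    return reduce(convolve_pmfs, pmfs)

die = {face: 0.25 for face in (1, 2, 3, 4)}  # a fair four-sided die
print(pmf_of_sum([die, die, die]))           # pmf of the sum of three such dice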

Solved exercises
Below you can find some exercises with explained solutions:
1. Exercise set 1
