
Lecture 3

Gaussian Probability Distribution


Introduction
l The Gaussian probability distribution is perhaps the most widely used distribution in all of science.
u also called the “bell-shaped curve” or normal distribution
l Unlike the binomial and Poisson distributions, the Gaussian is a continuous distribution:
P(y) = [1/(σ√(2π))] exp[−(y − μ)²/(2σ²)]

μ = mean of distribution (also the location of the mode and median)
σ² = variance of distribution
y is a continuous variable (−∞ ≤ y ≤ ∞)
l Probability (P) of y being in the range [a, b] is given by an integral:
P(a < y < b) = [1/(σ√(2π))] ∫_a^b exp[−(y − μ)²/(2σ²)] dy
[Portrait: Karl Friedrich Gauss, 1777–1855]
u The integral for arbitrary a and b cannot be evaluated analytically
+ The value of the integral has to be looked up in a table (e.g. Appendices A and B of Taylor).
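Instead of a table, the integral can be evaluated with the error function; a minimal Python sketch using the standard library:

```python
from math import erf, sqrt

def gauss_prob(a, b, mu=0.0, sigma=1.0):
    """P(a < y < b) for a Gaussian pdf with mean mu and std dev sigma,
    using Phi(z) = 0.5*(1 + erf(z/sqrt(2)))."""
    za = (a - mu) / (sigma * sqrt(2))
    zb = (b - mu) / (sigma * sqrt(2))
    return 0.5 * (erf(zb) - erf(za))

# probability of being within one standard deviation of the mean
print(round(gauss_prob(-1, 1), 4))  # ≈ 0.6827
```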


[Figure: plot of the Gaussian pdf, p(x) = [1/(σ√(2π))] exp[−(x − μ)²/(2σ²)], vs x]
K.K. Gan L3: Gaussian Probability Distribution 1
l The total area under the curve is normalized to one.
+ the probability integral:

P(−∞ < y < ∞) = [1/(σ√(2π))] ∫_−∞^∞ exp[−(y − μ)²/(2σ²)] dy = 1
l We often talk about a measurement being a certain number of standard deviations (σ) away
from the mean (μ) of the Gaussian.
+ We can associate a probability for a measurement to fall more than nσ
from the mean just by calculating the area outside of this region.
nσ      Prob. of exceeding ±nσ
0.67    0.5
1       0.32
2       0.05
3       0.003
4       0.00006

It is very unlikely (< 0.3%) that a measurement taken at random from a
Gaussian pdf will be more than ±3σ from the true mean of the distribution.
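The entries in this table can be reproduced with the complementary error function (a short Python sketch; erfc(n/√2) is the two-sided area beyond ±nσ):

```python
from math import erfc, sqrt

def tail_prob(n):
    """P(|y - mu| > n*sigma) for a Gaussian pdf: the area outside +-n sigma."""
    return erfc(n / sqrt(2))

for n in (0.67, 1, 2, 3, 4):
    print(f"{n:5}  {tail_prob(n):.2g}")
```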
Relationship between the Gaussian and Binomial distributions
l The Gaussian distribution can be derived from the binomial (or Poisson) assuming:
u p is finite
u N is very large
u we have a continuous variable rather than a discrete variable
l An example illustrating the small difference between the two distributions under the above conditions:
u Consider tossing a coin 10,000 times.
p(heads) = 0.5
N = 10,000
K.K. Gan L3: Gaussian Probability Distribution 2
n For a binomial distribution:
mean number of heads = μ = Np = 5000
standard deviation σ = [Np(1 − p)]^1/2 = 50
+ The probability to be within ±1σ for this binomial distribution is:

P = Σ_{m=5000−50}^{5000+50} [10⁴! / ((10⁴ − m)! m!)] (0.5)^m (0.5)^(10⁴−m) = 0.69
n For a Gaussian distribution:
P(μ − σ < y < μ + σ) = [1/(σ√(2π))] ∫_{μ−σ}^{μ+σ} exp[−(y − μ)²/(2σ²)] dy ≈ 0.68

+ Both distributions give about the same probability!
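The two numbers can be checked directly (a Python sketch; the binomial sum is done with exact integer arithmetic to avoid the underflow of 0.5^10000 in floating point):

```python
from math import comb, erf, sqrt

N, p = 10_000, 0.5
mu = N * p                      # 5000
sigma = sqrt(N * p * (1 - p))   # 50

# exact binomial probability of 4950 <= m <= 5050 (within +-1 sigma)
total = sum(comb(N, m) for m in range(4950, 5051))
p_binom = total / 2**N

# Gaussian probability of mu - sigma < y < mu + sigma
p_gauss = erf(1 / sqrt(2))

print(round(p_binom, 2), round(p_gauss, 2))  # ≈ 0.69 and 0.68
```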
Central Limit Theorem
l The Gaussian distribution is important because of the Central Limit Theorem.
l A crude statement of the Central Limit Theorem:
u Things that are the result of the addition of lots of small effects tend to become Gaussian.
l A more exact statement:
u Let Y1, Y2, ... Yn be an infinite sequence of independent random variables,
each with the same probability distribution. (Actually, the Y’s can be from different pdf’s!)
u Suppose that the mean (μ) and variance (σ²) of this distribution are both finite.
+ For any numbers a and b:

lim_{n→∞} P[a < (Y1 + Y2 + ... + Yn − nμ)/(σ√n) < b] = [1/√(2π)] ∫_a^b exp(−y²/2) dy
+ C.L.T. tells us that under a wide range of circumstances the probability distribution
that describes the sum of random variables tends towards a Gaussian distribution
as the number of terms in the sum → ∞.
+ Alternatively, in terms of the sample mean Ȳ = (Y1 + Y2 + ... + Yn)/n:

lim_{n→∞} P[a < (Ȳ − μ)/(σ/√n) < b] = lim_{n→∞} P[a < (Ȳ − μ)/σ_m < b] = [1/√(2π)] ∫_a^b exp(−y²/2) dy

n σ_m is sometimes called “the error in the mean” (more on that later).

l For the CLT to be valid:
u μ and σ of the pdf must be finite.
u No one term in the sum should dominate the sum.
l A random variable is not the same as a random number.
u Devore: Probability and Statistics for Engineering and the Sciences:
+ A random variable is any rule that associates a number with each outcome in S
n S is the set of possible outcomes.

l Recall: if y is described by a Gaussian pdf with μ = 0 and σ = 1, then
the probability that a < y < b is given by:

P(a < y < b) = [1/√(2π)] ∫_a^b exp(−y²/2) dy
l The CLT is true even if the Y’s are from different pdf’s as long as
the means and variances are defined for each pdf!
u See Appendix of Barlow for a proof of the Central Limit Theorem.
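A numerical illustration of this generalization (a sketch; the choice of a uniform and an exponential pdf here is purely illustrative — the standardized sum is divided by the total mean and the square root of the total variance):

```python
import random
from math import sqrt

random.seed(1)

def standardized_sum(n_terms=50):
    """Sum of draws from two different pdfs, standardized by the
    total mean and the square root of the total variance."""
    total, mu_sum, var_sum = 0.0, 0.0, 0.0
    for i in range(n_terms):
        if i % 2 == 0:
            total += random.random()          # uniform [0,1]: mu = 1/2, var = 1/12
            mu_sum, var_sum = mu_sum + 0.5, var_sum + 1 / 12
        else:
            total += random.expovariate(1.0)  # exponential: mu = 1, var = 1
            mu_sum, var_sum = mu_sum + 1.0, var_sum + 1.0
    return (total - mu_sum) / sqrt(var_sum)

samples = [standardized_sum() for _ in range(5000)]
mean = sum(samples) / len(samples)
std = sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))
print(round(mean, 2), round(std, 2))  # close to 0 and 1
```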



l Example: A watch makes an error of at most ±1/2 minute per day.
After one year, what’s the probability that the watch is accurate to within ±25 minutes?
u Assume that the daily errors are uniform in [-1/2, 1/2].
n For each day, the average error is zero and the standard deviation is 1/√12 minutes.
n The error over the course of a year is just the addition of the daily error.
n Since the daily errors come from a uniform distribution with a well-defined mean and variance:
+ the Central Limit Theorem is applicable:
lim_{n→∞} P[a < (Y1 + Y2 + ... + Yn − nμ)/(σ√n) < b] = [1/√(2π)] ∫_a^b exp(−y²/2) dy
+ The upper limit corresponds to +25 minutes:

b = (Y1 + Y2 + ... + Yn − nμ)/(σ√n) = (25 − 365 × 0)/[(1/√12)√365] = 4.5

+ The lower limit corresponds to −25 minutes:

a = (Y1 + Y2 + ... + Yn − nμ)/(σ√n) = (−25 − 365 × 0)/[(1/√12)√365] = −4.5
+ The probability to be within ±25 minutes:

P = [1/√(2π)] ∫_{−4.5}^{4.5} exp(−y²/2) dy ≈ 0.999993 = 1 − 7×10⁻⁶

+ only about a seven-in-a-million chance that the watch will be off by more than 25 minutes in a year!
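The watch calculation can be verified numerically (a sketch; b is computed here without rounding to 4.5):

```python
from math import erf, sqrt

# daily error uniform on [-1/2, 1/2] minutes: mu = 0, sigma = 1/sqrt(12)
sigma_day = 1 / sqrt(12)
n_days = 365

# standardized limit for +-25 minutes of accumulated error
b = 25 / (sigma_day * sqrt(n_days))
print(round(b, 1))  # ≈ 4.5

# P(-b < y < b) for a standard Gaussian
p = 0.5 * (erf(b / sqrt(2)) - erf(-b / sqrt(2)))
print(p)  # ≈ 0.99999...
```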



l Example: Generate a Gaussian distribution using random numbers.
u A random number generator gives numbers distributed uniformly in the interval [0, 1]
n μ = 1/2 and σ² = 1/12
u Procedure:
n Take 12 numbers (ri) from your computer’s random number generator
n Add them together
n Subtract 6
+ Get a number that looks as if it is from a Gaussian pdf!
P[a < (Y1 + Y2 + ... + Yn − nμ)/(σ√n) < b]

= P[a < (Σ_{i=1}^{12} r_i − 12 · 1/2)/[(1/√12)√12] < b]

= P[−6 < Σ_{i=1}^{12} r_i − 6 < 6]

= [1/√(2π)] ∫_{−6}^{6} exp(−y²/2) dy

[Figure: histograms of A) 5000 random numbers, B) 5000 pairs (r1 + r2) of random numbers,
C) 5000 triplets (r1 + r2 + r3) of random numbers, D) 5000 12-plets (r1 + r2 + ... + r12) of
random numbers, and E) 5000 12-plets (r1 + r2 + ... + r12 − 6) of random numbers compared
with a Gaussian with μ = 0 and σ = 1; the last histogram spans −6 to +6.]

Thus the sum of 12 uniform random numbers minus 6 is distributed as if it came
from a Gaussian pdf with μ = 0 and σ = 1.
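The procedure above can be sketched in Python (assuming the standard library’s uniform generator):

```python
import random
from math import sqrt

random.seed(0)

def gauss12():
    """Approximate standard Gaussian: sum of 12 uniform [0,1) numbers minus 6."""
    return sum(random.random() for _ in range(12)) - 6.0

samples = [gauss12() for _ in range(5000)]
mean = sum(samples) / len(samples)
std = sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))
print(round(mean, 2), round(std, 2))  # close to 0 and 1
```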



l Example: The daily income of a "card shark" has a uniform distribution in the interval [-$40,$50].
What is the probability that s/he wins more than $500 in 60 days?
u Let’s use the CLT to estimate this probability:
lim_{n→∞} P[a < (Y1 + Y2 + ... + Yn − nμ)/(σ√n) < b] = [1/√(2π)] ∫_a^b exp(−y²/2) dy
u The probability distribution of daily income is uniform, p(y) = 1.
+ it needs to be normalized when computing the average daily winnings (μ) and the standard deviation (σ).
μ = [∫_{−40}^{50} y p(y) dy] / [∫_{−40}^{50} p(y) dy] = (1/2)[50² − (−40)²] / [50 − (−40)] = 5

σ² = [∫_{−40}^{50} y² p(y) dy] / [∫_{−40}^{50} p(y) dy] − μ² = (1/3)[50³ − (−40)³] / [50 − (−40)] − 25 = 675
u The lower limit of the winnings is $500:

a = (Y1 + Y2 + ... + Yn − nμ)/(σ√n) = (500 − 60 × 5)/(√675 · √60) = 200/201 ≈ 1

u The upper limit is the maximum that the shark could win ($50/day for 60 days):

b = (Y1 + Y2 + ... + Yn − nμ)/(σ√n) = (3000 − 60 × 5)/(√675 · √60) = 2700/201 ≈ 13.4

P = [1/√(2π)] ∫_{1}^{13.4} exp(−y²/2) dy ≈ [1/√(2π)] ∫_{1}^{∞} exp(−y²/2) dy = 0.16
+ 16% chance to win > $500 in 60 days
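The whole estimate can be reproduced numerically (a sketch using the standard-library error function):

```python
from math import erf, sqrt

# daily income uniform on [-40, 50] dollars
lo, hi = -40.0, 50.0
mu = (lo + hi) / 2        # mean daily winnings: 5
var = (hi - lo) ** 2 / 12  # variance of a uniform pdf: 675
n_days = 60

# standardized limits: winnings above $500, up to the $50/day maximum
a = (500 - n_days * mu) / sqrt(var * n_days)
b = (n_days * hi - n_days * mu) / sqrt(var * n_days)

# P(a < y < b) for a standard Gaussian
p = 0.5 * (erf(b / sqrt(2)) - erf(a / sqrt(2)))
print(round(a, 2), round(b, 1), round(p, 2))  # a ≈ 1, b ≈ 13.4, p ≈ 0.16
```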