
2 Probability

• Basic Definitions: Events, Sample Space, and Probabilities
• Basic Rules for Probability
• Conditional Probability
• Independence of Events
• Combinatorial Concepts
• The Law of Total Probability and Bayes' Theorem
• Random Variables
Probability

• A measure of uncertainty
• A measure of the strength of belief in the occurrence of an uncertain event
• A measure of the degree of chance or likelihood of occurrence of an uncertain event
• Measured by a number between 0 and 1 (or between 0% and 100%)
Types of Probability

• Objective or Classical Probability
  – based on equally-likely events
  – based on long-run relative frequency of events
  – not based on personal beliefs
  – is the same for all observers (objective)
  – examples: toss a coin, throw a die, pick a card
• Subjective Probability
  – based on personal beliefs, experiences, prejudices, intuition – personal judgment
  – different for all observers (subjective)
  – examples: Super Bowl, elections, new product introduction, snowfall
Basic Definitions

• Set – a collection of elements or objects of interest
• Empty set (denoted by ∅)
  – a set containing no elements
• Universal set (denoted by S)
  – a set containing all possible elements
• Complement (Not). The complement of A is Ā
  – a set containing all elements of S not in A
• Intersection (And), A ∩ B
  – a set containing all elements in both A and B
• Union (Or), A ∪ B
  – a set containing all elements in A or B or both
Basic Definitions (cont)

• Mutually exclusive or disjoint sets
  – sets having no elements in common, no intersection; their intersection is the empty set
• Collectively exhaustive
  – a list of outcomes that includes all possible outcomes
• Partition
  – a collection of mutually exclusive sets which together include all possible elements; their union is the universal set
Sets: Diagrams

[Venn diagrams: a set A and its complement Ā; the intersection A ∩ B; the union A ∪ B; mutually exclusive sets A and B; and a partition A1, A2, A3, A4, A5 of the universal set]
Experiment

• Process that leads to one of several possible outcomes*, e.g.:
  – Coin toss: Heads, Tails
  – Throw die: 1, 2, 3, 4, 5, 6
  – Pick a card: AH, KH, QH, ...
• Each trial of an experiment has a single observed outcome.
• The precise outcome of a random experiment is unknown before a trial.

* Also called a basic outcome, elementary event, or simple event

Events

• Sample Space or Event Set
  – Set of all possible outcomes (universal set) for a given experiment
  – E.g.: throw a die: S = {1,2,3,4,5,6}
• Event
  – Collection of outcomes having a common characteristic
  – E.g.: even number: A = {2,4,6}
  – Event A occurs if an outcome in the set A occurs
• Probability of an event
  – Sum of the probabilities of the outcomes of which it consists
  – P(A) = P(2) + P(4) + P(6)
Equally-likely Probabilities

• For example: throw a die
  – Six possible outcomes (1,2,3,4,5,6)
  – If each is equally likely, the probability of each is 1/6 ≈ .1667 = 16.67%
  – Probability of each equally-likely outcome is 1 over the number of possible outcomes:

    P(e) = 1/n(S)

• Event A (even number)
  – P(A) = P(2) + P(4) + P(6) = 1/6 + 1/6 + 1/6 = 1/2
  – P(A) = Σ P(e) for e in A
         = n(A)/n(S) = 3/6 = 1/2
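The equally-likely rule above can be checked with a short sketch (the names S and A are just this example's; fractions keep the arithmetic exact):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}            # sample space for one throw of a die
p_outcome = Fraction(1, len(S))   # P(e) = 1/n(S) for each equally-likely outcome

A = {e for e in S if e % 2 == 0}  # event A: an even number
p_A = sum(p_outcome for _ in A)   # P(A) = sum of P(e) over e in A

print(p_A)  # 1/2
```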
Pick a Card: Sample Space

The 52-card deck has four suits (Hearts, Diamonds, Clubs, Spades), each with the thirteen ranks A, K, Q, J, 10, ..., 2.

• Event 'Ace':
  P(Ace) = n(Ace)/n(S) = 4/52 = 1/13

• Event 'Heart':
  P(Heart) = n(Heart)/n(S) = 13/52 = 1/4

• Union of events 'Heart' and 'Ace':
  P(Heart ∪ Ace) = n(Heart ∪ Ace)/n(S) = 16/52 = 4/13

• The intersection of the events 'Heart' and 'Ace' comprises a single point: the ace of hearts.
  P(Heart ∩ Ace) = n(Heart ∩ Ace)/n(S) = 1/52
Basic Rules for Probability (1)

• Range of values:
  0 ≤ P(A) ≤ 1

• Complements – probability of not A:
  P(Ā) = 1 − P(A)

• Intersection – probability of both A and B:
  P(A ∩ B) = n(A ∩ B)/n(S)

• Mutually exclusive events (A and C):
  P(A ∩ C) = 0
Basic Rules for Probability (2)
•• Union --Probability
Union Probabilityof
ofAAor
or BBor
orboth
both

P( A  B)  n( A  B)  P( A)  P( B)  P( A  B)
n( S )

Mutually
Mutuallyexclusive
exclusiveevents:
P( A  C)  0 so
events:
P( A  C)  P( A)  P(C)

•• ConditionalProbability
Conditional Probability -- Probability
Probabilityof
ofAAgiven
givenBB
P( A B)  P( A  B)
P( B)

P( A B)  P( A)
Independent
Independentevents:
events: P( B A)  P( B)
Conditional Probability

P(A|B) is the probability of A given B.

Rules of conditional probability:

  P(A|B) = P(A ∩ B)/P(B)   so   P(A ∩ B) = P(A|B) P(B)
                                         = P(B|A) P(A)

If events A and D are statistically independent:

  P(A|D) = P(A)
  P(D|A) = P(D)   so   P(A ∩ D) = P(A) P(D)
Example (Contingency Table)

Counts:
                     AT&T   IBM   Total
  Telecommunication    40    10      50
  Computers            20    30      50
  Total                60    40     100

Probabilities:
                     AT&T   IBM   Total
  Telecommunication   .40   .10     .50
  Computers           .20   .30     .50
  Total               .60   .40    1.00

Probability that a project is undertaken by IBM given it is a telecommunications project:

  P(IBM|T) = P(IBM ∩ T)/P(T) = .10/.50 = .2
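The contingency-table calculation can be sketched as follows (the nested-dict layout is just one convenient way to hold the counts):

```python
counts = {
    "Telecommunication": {"AT&T": 40, "IBM": 10},
    "Computers":         {"AT&T": 20, "IBM": 30},
}
total = sum(sum(row.values()) for row in counts.values())    # 100 projects

p_ibm_and_t = counts["Telecommunication"]["IBM"] / total     # P(IBM and T) = .10
p_t = sum(counts["Telecommunication"].values()) / total      # P(T) = .50

# Conditional probability P(IBM | T) = P(IBM and T) / P(T)
p_ibm_given_t = p_ibm_and_t / p_t
print(p_ibm_given_t)  # 0.2
```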
Independence of Events

Conditions for the statistical independence of events A and B:

  P(A|B) = P(A)
  P(B|A) = P(B)
and
  P(A ∩ B) = P(A) P(B)

For the card example:

  P(Ace|Heart) = P(Ace ∩ Heart)/P(Heart) = (1/52)/(13/52) = 1/13 = P(Ace)

  P(Heart|Ace) = P(Heart ∩ Ace)/P(Ace) = (1/52)/(4/52) = 1/4 = P(Heart)

  P(Ace ∩ Heart) = 1/52 = (4/52)(13/52) = P(Ace) P(Heart)
Combinatorial Concepts

Consider a pair of six-sided dice. There are six possible outcomes from throwing the first die (1,2,3,4,5,6) and six possible outcomes from throwing the second die (1,2,3,4,5,6). Altogether, there are 6*6 = 36 possible outcomes from throwing the two dice.

In general, if there are n events and event i can happen in Ni possible ways, then the number of ways in which the sequence of n events may occur is N1N2...Nn.

• Pick 5 cards from a deck of 52 – with replacement:
  52*52*52*52*52 = 52^5 = 380,204,032 different possible outcomes
• Pick 5 cards from a deck of 52 – without replacement:
  52*51*50*49*48 = 311,875,200 different possible outcomes
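The two counts above follow directly from the multiplication rule; a quick sketch:

```python
import math

with_replacement = 52 ** 5                      # each of the 5 draws has 52 choices
without_replacement = math.prod(range(48, 53))  # 52*51*50*49*48 ordered draws

print(with_replacement)     # 380204032
print(without_replacement)  # 311875200
```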
Factorial

How many ways can you order the 3 letters A, B, and C? There are 3 choices for the first letter, 2 for the second, and 1 for the last, so there are 3*2*1 = 6 possible ways to order the three letters A, B, and C.

How many ways are there to order the 6 letters A, B, C, D, E, and F? (6*5*4*3*2*1 = 720)

Factorial: For any positive integer n, we define n factorial as n(n-1)(n-2)...(1). We denote n factorial as n!. The number n! is the number of ways in which n objects can be ordered. By definition, 0! = 1.
Permutations

What if we chose only 3 out of the 6 letters A, B, C, D, E, and F? There are 6 ways to choose the first letter, 5 ways to choose the second letter, and 4 ways to choose the third letter (leaving 3 letters unchosen). That makes 6*5*4 = 120 possible orderings or permutations.

Permutations are the possible ordered selections of r objects out of a total of n objects. The number of permutations of n objects taken r at a time is denoted nPr:

  nPr = n!/(n − r)!

For example:

  6P3 = 6!/(6 − 3)! = 6!/3! = (6*5*4*3*2*1)/(3*2*1) = 6*5*4 = 120
Combinations

Suppose that when we picked 3 letters out of the 6 letters A, B, C, D, E, and F we chose BCD, or BDC, or CBD, or CDB, or DBC, or DCB. (These are the 6, or 3!, permutations or orderings of the 3 letters B, C, and D.) But these are orderings of the same combination of 3 letters. How many combinations of 6 different letters, taking 3 at a time, are there?

Combinations are the possible selections of r items from a group of n items regardless of the order of selection. The number of combinations is denoted (n choose r). An alternative notation is nCr. We define the number of combinations of r out of n elements as:

  nCr = n!/(r!(n − r)!)

For example:

  6C3 = 6!/(3!(6 − 3)!) = 6!/(3!3!) = (6*5*4*3*2*1)/((3*2*1)(3*2*1)) = (6*5*4)/(3*2*1) = 120/6 = 20
The Law of Total Probability and
Bayes' Theorem

The law of total probability:

  P(A) = P(A ∩ B) + P(A ∩ B̄)

In terms of conditional probabilities:

  P(A) = P(A ∩ B) + P(A ∩ B̄)
       = P(A|B) P(B) + P(A|B̄) P(B̄)

More generally (where the Bi make up a partition):

  P(A) = Σi P(A ∩ Bi)
       = Σi P(A|Bi) P(Bi)
Example - The Law of Total
Probability

Event U: Stock market will go up in the next year
Event W: Economy will do well in the next year

  P(U|W) = .75
  P(U|W̄) = .30
  P(W) = .80, so P(W̄) = 1 − .8 = .2

  P(U) = P(U ∩ W) + P(U ∩ W̄)
       = P(U|W) P(W) + P(U|W̄) P(W̄)
       = (.75)(.80) + (.30)(.20)
       = .60 + .06 = .66
Bayes' Theorem

• Bayes' theorem enables you, knowing just a little more than the probability of A given B, to find the probability of B given A.
• Based on the definition of conditional probability and the law of total probability.

  P(B|A) = P(A ∩ B)/P(A)
         = P(A ∩ B)/[P(A ∩ B) + P(A ∩ B̄)]              (applying the law of total probability to the denominator)
         = P(A|B) P(B)/[P(A|B) P(B) + P(A|B̄) P(B̄)]     (applying the definition of conditional probability throughout)
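As an illustration (not from the slides, but reusing the numbers of the total-probability example above), Bayes' theorem answers the reverse question: given that the market went up, how likely is it that the economy did well?

```python
p_up_given_well = 0.75      # P(U | W)
p_up_given_not_well = 0.30  # P(U | not W)
p_well = 0.80               # P(W)

numerator = p_up_given_well * p_well                          # P(U|W) P(W)
denominator = numerator + p_up_given_not_well * (1 - p_well)  # P(U), by total probability

p_well_given_up = numerator / denominator                     # P(W | U)
print(round(p_well_given_up, 4))  # 0.9091
```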
Random Variables

• A random variable assigns a real number to every outcome or event in an experiment.
• If the outcome is already numerical or quantitative, the outcome number itself can serve as the random variable.
• If the outcome is not a number, define the RV so that every outcome corresponds to a unique number.
• Discrete RV
• Continuous RV
Example

Consider the experiment of tossing two six-sided dice. There are 36 possible outcomes (1,1), (1,2), ..., (6,6). Let the random variable X represent the sum of the numbers on the two dice:

  x     P(x)*
  2     1/36
  3     2/36
  4     3/36
  5     4/36
  6     5/36
  7     6/36
  8     5/36
  9     4/36
  10    3/36
  11    2/36
  12    1/36

[Bar chart: Probability Distribution of Sum of Two Dice, p(x) against x = 2, ..., 12]

* Note that P(x) = (6 − |7 − x|)/36
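The distribution of the sum can be built by simple enumeration, which also confirms the closed form noted above:

```python
from fractions import Fraction
from collections import Counter

# Count how many of the 36 outcomes give each sum.
sums = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
dist = {x: Fraction(n, 36) for x, n in sorted(sums.items())}

assert dist[2] == Fraction(1, 36)
assert dist[7] == Fraction(6, 36)
assert sum(dist.values()) == 1

# The closed form from the slide: P(x) = (6 - |7 - x|)/36.
assert all(dist[x] == Fraction(6 - abs(7 - x), 36) for x in dist)
```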
Example

Given the probability distribution of the number of switches:

  x   P(x)
  0   0.1
  1   0.2
  2   0.3
  3   0.2
  4   0.1
  5   0.1
      1.0

[Bar chart: The Probability Distribution of the Number of Switches, P(x) against x = 0, ..., 5]

Probability of more than 2:
  P(X > 2) = P(3) + P(4) + P(5) = 0.2 + 0.1 + 0.1 = 0.4

Probability of at least 1:
  P(X ≥ 1) = 1 − P(0) = 1 − 0.1 = .9
Discrete and Continuous Random
Variables

A discrete random variable:
• has a countable number of possible values
• has discrete jumps between successive values
• has measurable probability associated with individual values
• counts

A continuous random variable:
• has an uncountably infinite number of possible values
• moves continuously from value to value
• has no measurable probability associated with each value
• measures (e.g.: height, weight, speed, value, duration, length)
Discrete Probability Distributions -
(Probability Mass Function)

The probability distribution of a discrete random variable X, sometimes called the probability mass function, must satisfy the following two conditions:

  1. P(x) ≥ 0 for all values of x.
  2. Σ P(x) = 1, summed over all x.

Corollary: 0 ≤ P(x) ≤ 1
Cumulative Distribution Function

The cumulative distribution function, F(x), of a discrete random variable X is:

  F(x) = P(X ≤ x) = Σ P(i), summed over all i ≤ x

  x   P(x)   F(x)
  0   0.1    0.1
  1   0.2    0.3
  2   0.3    0.6
  3   0.2    0.8
  4   0.1    0.9
  5   0.1    1.0
      1.0
Example

The probability that at most three switches will occur:

  x   P(x)   F(x)
  0   0.1    0.1
  1   0.2    0.3
  2   0.3    0.6
  3   0.2    0.8
  4   0.1    0.9
  5   0.1    1.0
      1.0

  P(X ≤ 3) = F(3) = 0.8

[Bar chart: The Probability That at Most Three Switches Will Occur, P(x) against x = 0, ..., 5]
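The cumulative column F(x) is just a running sum of P(x); a sketch for the switches example:

```python
from itertools import accumulate

xs = [0, 1, 2, 3, 4, 5]
ps = [0.1, 0.2, 0.3, 0.2, 0.1, 0.1]
F = dict(zip(xs, accumulate(ps)))  # F(x) = P(X <= x)

print(round(F[3], 1))  # 0.8 -> probability that at most three switches occur
```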
Expected Values of Discrete
Random Variables

The mean of a probability distribution is a measure of its centrality or location.

The mean is also known as the expected value (or expectation) of a random variable, because it is the value that is expected to occur, on average.

The expected value of a discrete random variable X is equal to the sum of each value of the random variable multiplied by its probability:

  μ = E(X) = Σ x P(x)

  x   P(x)   xP(x)
  0   0.1    0.0
  1   0.2    0.2
  2   0.3    0.6
  3   0.2    0.6
  4   0.1    0.4
  5   0.1    0.5
      1.0    2.3 = E(X)
Expected Value of a Function of a
Discrete Random Variable

The expected value of a function of a discrete random variable X is:

  E[g(X)] = Σ g(x) P(x)

Example: Monthly sales of a certain product are believed to follow the probability distribution given below. Suppose the company has a fixed monthly production cost of $8000 and that each item brings $2. Find the expected monthly profit from product sales. Here the profit function is g(x) = 2x − 8000.

  Number of items, x   P(x)   xP(x)   g(x)    g(x)P(x)
  5000                 0.2    1000     2000     400
  6000                 0.3    1800     4000    1200
  7000                 0.2    1400     6000    1200
  8000                 0.2    1600     8000    1600
  9000                 0.1     900    10000    1000
                       1.0    6700             5400

  E[g(X)] = Σ g(x) P(x) = 5400
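The expected-profit calculation above can be sketched directly, with g(x) = 2x − 8000 (revenue of $2 per item minus the $8000 fixed cost):

```python
P = {5000: 0.2, 6000: 0.3, 7000: 0.2, 8000: 0.2, 9000: 0.1}

expected_sales = sum(x * p for x, p in P.items())                # E(X)
expected_profit = sum((2 * x - 8000) * p for x, p in P.items())  # E[g(X)]

print(round(expected_sales))   # 6700
print(round(expected_profit))  # 5400
```

Note that for a linear g, E[g(X)] = 2·E(X) − 8000, which gives the same 5400.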
Variance and Standard Deviation
of a Random Variable

The variance of a random variable is the expected squared deviation from the mean:

  σ² = V(X) = E[(X − μ)²] = Σ (x − μ)² P(x)
            = E(X²) − [E(X)]² = Σ x² P(x) − [Σ x P(x)]²

The standard deviation of a random variable is the square root of its variance:

  σ = SD(X) = √V(X)
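A sketch for the switches distribution, computing the variance both ways (by definition and by the shortcut formula) to show they agree:

```python
import math

P = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.2, 4: 0.1, 5: 0.1}

mu = sum(x * p for x, p in P.items())                       # E(X) = 2.3
var_def = sum((x - mu) ** 2 * p for x, p in P.items())      # E[(X - mu)^2]
var_short = sum(x * x * p for x, p in P.items()) - mu ** 2  # E(X^2) - mu^2
sd = math.sqrt(var_def)

assert abs(var_def - var_short) < 1e-9  # the two formulas agree
print(round(var_def, 2))  # 2.01
```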
The Binomial Distribution

Bernoulli trials are a sequence of n identical trials satisfying the following conditions:

1. Each trial has two possible outcomes, called success and failure. The two outcomes are mutually exclusive and exhaustive.
2. The probability of success, denoted by p, remains constant from trial to trial. The probability of failure is denoted by q, where q = 1 − p.
3. The n trials are independent. That is, the outcome of any trial does not affect the outcomes of the other trials.

A random variable, X, that counts the number of successes in n Bernoulli trials, where p is the probability of success in any given trial, is said to follow the binomial probability distribution with parameters n (number of trials) and p (probability of success). We call X the binomial random variable.
Binomial Probabilities
(Continued)

In general:

1. The probability of a given sequence of x successes out of n trials, with probability of success p and probability of failure q, is equal to p^x q^(n−x).
2. The number of different sequences of n trials that result in exactly x successes is equal to the number of choices of x elements out of a total of n elements. This number is denoted:

  nCx = (n choose x) = n!/(x!(n − x)!)
The Binomial Probability
Distribution

The binomial probability distribution:

  P(x) = (n choose x) p^x q^(n−x) = [n!/(x!(n − x)!)] p^x q^(n−x)

where:
  p is the probability of success in a single trial,
  q = 1 − p,
  n is the number of trials, and
  x is the number of successes.

  Number of successes, x   Probability P(x)
  0                        [n!/(0!(n − 0)!)] p^0 q^(n−0)
  1                        [n!/(1!(n − 1)!)] p^1 q^(n−1)
  2                        [n!/(2!(n − 2)!)] p^2 q^(n−2)
  3                        [n!/(3!(n − 3)!)] p^3 q^(n−3)
  ...                      ...
  n                        [n!/(n!(n − n)!)] p^n q^(n−n)
                           1.00
Mean, Variance, and Standard
Deviation of the Binomial Distribution

Mean of a binomial distribution:

  μ = E(X) = np

Variance of a binomial distribution:

  σ² = V(X) = npq

Standard deviation of a binomial distribution:

  σ = SD(X) = √(npq)

For example, if H counts the number of heads in five tosses of a fair coin:

  μH = E(H) = (5)(.5) = 2.5
  σ²H = V(H) = (5)(.5)(.5) = 1.25
  σH = SD(H) = √1.25 ≈ 1.118
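The coin example can be sketched by building the binomial pmf from the formula and checking that its mean and variance equal np and npq:

```python
import math

def binom_pmf(x: int, n: int, p: float) -> float:
    """P(X = x) = C(n, x) p^x (1-p)^(n-x)."""
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 5, 0.5  # H = number of heads in 5 tosses of a fair coin
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

assert abs(sum(pmf) - 1.0) < 1e-12                 # probabilities sum to 1
mean = sum(x * px for x, px in enumerate(pmf))
var = sum((x - mean) ** 2 * px for x, px in enumerate(pmf))

assert abs(mean - n * p) < 1e-12                   # E(H) = np = 2.5
assert abs(var - n * p * (1 - p)) < 1e-12          # V(H) = npq = 1.25
```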
Continuous Random Variables

A continuous random variable is a random variable that can take on any value in an interval of numbers.

The probabilities associated with a continuous random variable X are determined by the probability density function of the random variable. The function, denoted f(x), has the following properties:

1. f(x) ≥ 0 for all x.
2. The probability that X will be between two numbers a and b is equal to the area under f(x) between a and b.
3. The total area under the curve of f(x) is equal to 1.00.

The cumulative distribution function of a continuous random variable:

  F(x) = P(X ≤ x) = area under f(x) between the smallest possible value of X (often −∞) and the point x.
Probability Density Function and
Cumulative Distribution Function

[Figure: the CDF F(x), with P(a ≤ X ≤ b) = F(b) − F(a) shown as the vertical gap between F(b) and F(a); and the density f(x), with P(a ≤ X ≤ b) shown as the shaded area between a and b]

  P(a ≤ X ≤ b) = area under f(x) between a and b = F(b) − F(a)
Normal distribution -
Introduction

As n increases, the binomial distribution approaches a normal distribution.

[Figure: binomial distributions with p = .5 for n = 6, n = 10, and n = 14, compared with the normal distribution with μ = 0, σ = 1]
The Normal Probability
Distribution

The normal probability density function:

  f(x) = (1/√(2πσ²)) e^(−(x−μ)²/(2σ²))   for −∞ < x < ∞

where e ≈ 2.7182818... and π ≈ 3.14159265...

[Figure: Normal Distribution, μ = 0, σ = 1]
The Normal Probability
Distribution

• The normal is a family of bell-shaped and symmetric distributions. Because the distribution is symmetric, one-half (.50 or 50%) lies on either side of the mean.
• Each is characterized by a different pair of mean, μ, and variance, σ². That is: X~N(μ,σ²).
• Each is asymptotic to the horizontal axis.
Normal Probability
Distributions

All of these are normal probability density functions, though each has a different mean and variance:

  W~N(40,1)   X~N(30,25)   Y~N(50,9)   Z~N(0,1)

[Figure: density curves for the four distributions above]

Consider:

  P(39 ≤ W ≤ 41)
  P(25 ≤ X ≤ 35)
  P(47 ≤ Y ≤ 53)
  P(−1 ≤ Z ≤ 1)

The probability in each case is an area under a normal probability density function.
The Standard Normal
Distribution

The standard normal random variable, Z, is the normal random variable with mean μ = 0 and standard deviation σ = 1: Z~N(0,1²).

[Figure: the standard normal density, centered at μ = 0 with σ = 1]
Finding Probabilities of the Standard
Normal Distribution: P(0 ≤ Z ≤ 1.56)

Look in the row labeled 1.5 and the column labeled .06 to find P(0 ≤ Z ≤ 1.56) = .4406.

Standard Normal Probabilities (table entries give P(0 ≤ Z ≤ z)):

  z    .00    .01    .02    .03    .04    .05    .06    .07    .08    .09
  0.0 0.0000 0.0040 0.0080 0.0120 0.0160 0.0199 0.0239 0.0279 0.0319 0.0359
  0.1 0.0398 0.0438 0.0478 0.0517 0.0557 0.0596 0.0636 0.0675 0.0714 0.0753
  0.2 0.0793 0.0832 0.0871 0.0910 0.0948 0.0987 0.1026 0.1064 0.1103 0.1141
  0.3 0.1179 0.1217 0.1255 0.1293 0.1331 0.1368 0.1406 0.1443 0.1480 0.1517
  0.4 0.1554 0.1591 0.1628 0.1664 0.1700 0.1736 0.1772 0.1808 0.1844 0.1879
  0.5 0.1915 0.1950 0.1985 0.2019 0.2054 0.2088 0.2123 0.2157 0.2190 0.2224
  0.6 0.2257 0.2291 0.2324 0.2357 0.2389 0.2422 0.2454 0.2486 0.2517 0.2549
  0.7 0.2580 0.2611 0.2642 0.2673 0.2704 0.2734 0.2764 0.2794 0.2823 0.2852
  0.8 0.2881 0.2910 0.2939 0.2967 0.2995 0.3023 0.3051 0.3078 0.3106 0.3133
  0.9 0.3159 0.3186 0.3212 0.3238 0.3264 0.3289 0.3315 0.3340 0.3365 0.3389
  1.0 0.3413 0.3438 0.3461 0.3485 0.3508 0.3531 0.3554 0.3577 0.3599 0.3621
  1.1 0.3643 0.3665 0.3686 0.3708 0.3729 0.3749 0.3770 0.3790 0.3810 0.3830
  1.2 0.3849 0.3869 0.3888 0.3907 0.3925 0.3944 0.3962 0.3980 0.3997 0.4015
  1.3 0.4032 0.4049 0.4066 0.4082 0.4099 0.4115 0.4131 0.4147 0.4162 0.4177
  1.4 0.4192 0.4207 0.4222 0.4236 0.4251 0.4265 0.4279 0.4292 0.4306 0.4319
  1.5 0.4332 0.4345 0.4357 0.4370 0.4382 0.4394 0.4406 0.4418 0.4429 0.4441
  1.6 0.4452 0.4463 0.4474 0.4484 0.4495 0.4505 0.4515 0.4525 0.4535 0.4545
  1.7 0.4554 0.4564 0.4573 0.4582 0.4591 0.4599 0.4608 0.4616 0.4625 0.4633
  1.8 0.4641 0.4649 0.4656 0.4664 0.4671 0.4678 0.4686 0.4693 0.4699 0.4706
  1.9 0.4713 0.4719 0.4726 0.4732 0.4738 0.4744 0.4750 0.4756 0.4761 0.4767
  2.0 0.4772 0.4778 0.4783 0.4788 0.4793 0.4798 0.4803 0.4808 0.4812 0.4817
  2.1 0.4821 0.4826 0.4830 0.4834 0.4838 0.4842 0.4846 0.4850 0.4854 0.4857
  2.2 0.4861 0.4864 0.4868 0.4871 0.4875 0.4878 0.4881 0.4884 0.4887 0.4890
  2.3 0.4893 0.4896 0.4898 0.4901 0.4904 0.4906 0.4909 0.4911 0.4913 0.4916
  2.4 0.4918 0.4920 0.4922 0.4925 0.4927 0.4929 0.4931 0.4932 0.4934 0.4936
  2.5 0.4938 0.4940 0.4941 0.4943 0.4945 0.4946 0.4948 0.4949 0.4951 0.4952
  2.6 0.4953 0.4955 0.4956 0.4957 0.4959 0.4960 0.4961 0.4962 0.4963 0.4964
  2.7 0.4965 0.4966 0.4967 0.4968 0.4969 0.4970 0.4971 0.4972 0.4973 0.4974
  2.8 0.4974 0.4975 0.4976 0.4977 0.4977 0.4978 0.4979 0.4979 0.4980 0.4981
  2.9 0.4981 0.4982 0.4982 0.4983 0.4984 0.4984 0.4985 0.4985 0.4986 0.4986
  3.0 0.4987 0.4987 0.4987 0.4988 0.4988 0.4989 0.4989 0.4989 0.4990 0.4990
Finding Probabilities of the Standard
Normal Distribution: P(Z < -2.47)

To find P(Z < −2.47):

  Find the table area for 2.47:
    P(0 < Z < 2.47) = .4932

  By symmetry, the area to the left of −2.47 equals the area to the right of 2.47:
    P(Z < −2.47) = .5 − P(0 < Z < 2.47)
                 = .5 − .4932 = 0.0068
Finding Probabilities of the Standard
Normal Distribution: P(1 ≤ Z ≤ 2)

To find P(1 ≤ Z ≤ 2):

1. Find the table area for 2.00:
   F(2) = P(Z ≤ 2.00) = .5 + .4772 = .9772
2. Find the table area for 1.00:
   F(1) = P(Z ≤ 1.00) = .5 + .3413 = .8413
3. P(1 ≤ Z ≤ 2.00) = P(Z ≤ 2.00) − P(Z ≤ 1.00)
                   = .9772 − .8413 = .1359
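The three table lookups above can be reproduced without the printed table, using the exact standard normal CDF Φ(z) = (1 + erf(z/√2))/2; a sketch:

```python
import math

def phi(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p_mid = phi(1.56) - phi(0)     # P(0 <= Z <= 1.56); the table gives .4406
p_left = phi(-2.47)            # P(Z < -2.47); the table method gives .0068
p_band = phi(2.0) - phi(1.0)   # P(1 <= Z <= 2); the table method gives .1359

print(round(p_mid, 4), round(p_left, 4), round(p_band, 4))
```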
The Transformation of Normal
Random Variables

The area within k standard deviations of the mean is the same for all normal random variables. So an area under any normal distribution is equivalent to an area under the standard normal. In this example, with X~N(50,10²): P(40 ≤ X ≤ 60) = P(−1 ≤ Z ≤ 1), since (40 − 50)/10 = −1 and (60 − 50)/10 = 1.

The transformation of X to Z:

  Z = (X − μx)/σx

  (1) Subtraction: (X − μx)
  (2) Division by σx

The inverse transformation of Z to X:

  X = μx + Zσx

[Figure: the density of X~N(50,10²) and the standard normal density, related by the transformation]
Using the Normal
Transformation

Example 4-1: X~N(160,30²)

  P(100 ≤ X ≤ 180)
    = P((100 − μ)/σ ≤ Z ≤ (180 − μ)/σ)
    = P((100 − 160)/30 ≤ Z ≤ (180 − 160)/30)
    = P(−2 ≤ Z ≤ .6667)
    = .4772 + .2475 = .7247

Example 4-2: X~N(127,22²)

  P(X < 150)
    = P(Z < (150 − μ)/σ)
    = P(Z < (150 − 127)/22)
    = P(Z < 1.045)
    = .5 + .3520 = .8520
The Transformation of Normal
Random Variables

The transformation of X to Z:

  Z = (X − μx)/σx

The inverse transformation of Z to X:

  X = μx + Zσx

The transformation of X to Z, where a and b are numbers:

  P(X < a) = P(Z < (a − μ)/σ)
  P(X > b) = P(Z > (b − μ)/σ)
  P(a < X < b) = P((a − μ)/σ < Z < (b − μ)/σ)
