
CHAPTER 3: RANDOM VARIABLES

3.1 Introduction

A random variable assigns a numerical value to each outcome of a random experiment.

Random Variable                                        Possible values

X = number of defectives in a random sample            X = 0, 1, 2
of two computer chips drawn from a shipment

X = the waiting time until the first customer          0 ≤ X ≤ 8
arrives at a checkout counter (the counter
will be open for 8 hours)

The probability distribution of a random variable X tells us what the possible values of X are and how probabilities are assigned to those values.

Types of Random Variables:

Discrete: the possible values are isolated points on the number line (e.g., the number of defectives in a sample).

Continuous: the possible values fill an interval (e.g., waiting time, temperature, pressure).

3.2 Discrete Random Variables

Probability Mass Function

The probability mass function of a discrete random variable X is the function p(x) = P(X = x). The probability mass function is also called the probability distribution.

The probability mass function (probability distribution) of a discrete random variable is specified by a table:

x        x1   x2   ...  xn
P(X=x)   p1   p2   ...  pn

Here the xi are the possible values of X and the pi are the corresponding probabilities (the pi sum to 1).

Example 1: Number of defectives

10% of the computer chips in a very large shipment are defective. Denote by X the number of defective chips in a random sample of two. Find the probability distribution of X. What is P(X ≥ 1)?

Solution:
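The solution can be checked numerically. A minimal sketch in Python, treating the very large shipment as giving two independent draws with defect probability 0.1 (so X is binomial with n = 2):

```python
# X = number of defectives in a sample of two chips,
# modeled as Binomial(n=2, p=0.1) since the shipment is very large.
p = 0.1
pmf = {0: (1 - p) ** 2,       # both chips good
       1: 2 * p * (1 - p),    # exactly one defective (two orderings)
       2: p ** 2}             # both defective
p_at_least_one = 1 - pmf[0]   # P(X >= 1) via the complement
print(pmf, p_at_least_one)
```

This reproduces the distribution p(0) = 0.81, p(1) = 0.18, p(2) = 0.01, so P(X ≥ 1) = 0.19.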

Graph of the probability mass function:

[Figure: vertical line segments of heights 0.81, 0.18, and 0.01 at x = 0, 1, and 2.]

The heights of the three vertical line segments represent the probabilities assigned to the three possible values of X. The total probability is equal to 1.

Cumulative Distribution Function

The cumulative distribution function F(x) of a discrete random variable X is

F(x) = P(X ≤ x) = sum of the probabilities pi for which xi ≤ x.

Between consecutive possible values, F(x) = p1 + p2 + ... + pi for xi ≤ x < xi+1, and F(x) = p1 + ... + pn = 1 for x ≥ xn.

Example 2: Obtain the cumulative distribution function
F(x) for the random variable X from the previous example.
Solution:

Mean and Variance

Example: X = number of calls taken by a switchboard within 1 minute

X         0     1     2     3     4
Fraction  0.31  0.39  0.18  0.11  0.01

Average number of calls =?
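The average is a probability-weighted sum of the values in the table. A small Python sketch:

```python
# E(X) = sum of x * p(x) over the table of call counts
xs = [0, 1, 2, 3, 4]
ps = [0.31, 0.39, 0.18, 0.11, 0.01]
assert abs(sum(ps) - 1.0) < 1e-9   # the fractions form a valid pmf
mean = sum(x * p for x, p in zip(xs, ps))
print(mean)   # about 1.12 calls per minute
```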

The mean or expected value of a discrete random variable X, denoted as E(X) or μ, is

E(X) = Σ xi pi (sum over all i).

The expected value E(X) is a measure of the average value taken by X.

The variance of X, denoted as Var(X) or σ², is

Var(X) = Σ (xi − μ)² pi (sum over all i).

The variance Var(X) measures the spread of X about its mean value. The standard deviation of X is σ = √(σ²) = √Var(X).

Equivalent Formula:

Var(X) = Σ xi² pi − μ² (sum over all i).

Example 3: Calculate the mean and standard deviation of X = the number of defectives in a random sample of two.
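Using the distribution found in Example 1 (p(0) = 0.81, p(1) = 0.18, p(2) = 0.01), the computation can be sketched in Python:

```python
import math

# pmf of X = number of defectives in a sample of two (from Example 1)
pmf = {0: 0.81, 1: 0.18, 2: 0.01}

mean = sum(x * p for x, p in pmf.items())                # E(X)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())   # Var(X) by definition
sd = math.sqrt(var)
print(mean, var, sd)
```

Both the definition and the equivalent formula Σ xi² pi − μ² give Var(X) = 0.18, so σ ≈ 0.42.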

Example 4: Suppose that someone offers you a bet on flipping two fair coins. If two heads are obtained, you win $10; otherwise you must pay $6. What are your expected winnings?
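Writing the winnings W as a discrete random variable (W = 10 with probability 1/4, since HH occurs one time in four, and W = −6 with probability 3/4), the expected value follows directly. A quick check in Python:

```python
# W = winnings: two fair coins give HH with probability 1/4
pmf = {10: 0.25, -6: 0.75}
expected = sum(w * p for w, p in pmf.items())
print(expected)   # negative: on average you lose $2 per play
```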

3.3 Continuous Random Variables

Probability Distributions for Continuous Variables

Example: X = weight of women in a population

[Histograms of relative frequencies: the area of the rectangle over the interval from 129 to 130 is the fraction of women with weight between 129 and 130, and the total area of all rectangles is 1; the area to the left of 130 is the fraction of women with weight smaller than 130. As the class intervals narrow, the distribution of X is approximated by a smooth curve, the density curve.]

The density curve describes the overall pattern of a distribution: the total area under the curve is 1, and the area between a and b is the proportion of observations between a and b.

The probability distribution of a continuous random variable is specified by a function f(x), called the density curve, such that

(a) f(x) ≥ 0,

(b) ∫ f(x) dx = 1, where the integral runs over the whole real line,

(c) P(a ≤ X ≤ b) = ∫ₐᵇ f(x) dx.

[Figure: density curve f(x), with the area under the curve between a and b shaded.]

P(a ≤ X ≤ b) = probability that the value of X lies between a and b = area under the density curve of X between a and b.

The area under the entire graph of f(x) is 1.

P(X=x) = 0 for any value x of X.

Example 5: Shapes of Density Curves

[Figures: the density curve of a right-skewed distribution; the density curve of a left-skewed distribution; the density curve of a uniform distribution on (0, 1), i.e., X uniformly distributed between 0 and 1.]

Example 6: Suppose that f(x) = kx² for 0 < x < 1 and f(x) = 0 otherwise.

(a) Determine the constant k so that f(x) is a probability density of X.

(b) Calculate P(0 < X < 0.5).

(c) Determine x such that P(X < x) = 0.5.
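Analytically, ∫₀¹ kx² dx = k/3 = 1 forces k = 3, so P(X < x) = x³ on (0, 1). The three parts can be verified numerically with a simple Riemann-sum integration (no special libraries; the midpoint rule below is just one convenient choice):

```python
def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# (a) normalize: k * integral of x^2 over (0, 1) must equal 1, so k = 3
k = 1 / integrate(lambda x: x ** 2, 0, 1)
f = lambda x: k * x ** 2            # the density f(x) = 3x^2

prob = integrate(f, 0, 0.5)         # (b) P(0 < X < 0.5) = 0.5^3 = 0.125
median = 0.5 ** (1 / 3)             # (c) solves x^3 = 0.5, about 0.794
print(k, prob, integrate(f, 0, median))
```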

The cumulative distribution function F(x) of a continuous random variable X is

F(x) = P(X ≤ x) = ∫ f(t) dt, integrated from −∞ to x,

for −∞ < x < ∞.

[Figure: density curve f(x); F(x) = area to the left of x.]

F(x) tends to 1 as x tends to ∞, and F(x) is an increasing function.

Conclusions:

1. The density function f(x) is the derivative of F(x):

f(x) = dF(x)/dx.

2. The probability

P(a ≤ X ≤ b) = F(b) − F(a).

[Figure: P(a ≤ X ≤ b) is the area under the density curve between a and b, i.e. the difference of the areas F(b) and F(a) to the left of b and of a.]

Example 7: Obtain the cumulative distribution function for the density function specified in the previous example. Obtain P(X < 0.5) and P(0.25 < X < 0.75) using the cumulative distribution function.

Solution:
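Using k = 3 from Example 6, integrating f(t) = 3t² from 0 to x gives F(x) = x³ on (0, 1) (with F = 0 to the left and F = 1 to the right). A short Python check of the requested probabilities:

```python
def F(x):
    """CDF for the density f(x) = 3x^2 on (0, 1): F(x) = x^3 there."""
    if x <= 0:
        return 0.0
    if x >= 1:
        return 1.0
    return x ** 3

p1 = F(0.5)              # P(X < 0.5)
p2 = F(0.75) - F(0.25)   # P(0.25 < X < 0.75) = F(b) - F(a)
print(p1, p2)
```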

Mean and Variance

The mean or expected value of X, denoted as E(X) or μ, is

E(X) = ∫ x f(x) dx.

The variance of X, denoted as Var(X) or σ², is

σ² = Var(X) = ∫ (x − μ)² f(x) dx.

The standard deviation of X is σ = √(σ²).

Equivalent Formula:

σ² = ∫ x² f(x) dx − μ².

(All integrals run over the whole real line.)

Example 8: A random variable X follows a uniform distribution on the interval (0, 1). Obtain the mean and the standard deviation of X.

Solution:
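For the uniform density f(x) = 1 on (0, 1), E(X) = ∫₀¹ x dx = 1/2 and E(X²) = ∫₀¹ x² dx = 1/3, so σ² = 1/3 − 1/4 = 1/12. A numerical sketch using a simple midpoint-rule integral:

```python
import math

def integrate(f, a, b, n=100_000):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 1.0                       # uniform density on (0, 1)
mean = integrate(lambda x: x * f(x), 0, 1)
var = integrate(lambda x: (x - mean) ** 2 * f(x), 0, 1)
sd = math.sqrt(var)
print(mean, var, sd)   # 0.5, about 1/12, about 0.2887
```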

Example 9: Consider f(x) = 3x² for 0 < x < 1 and f(x) = 0 otherwise. Obtain the mean and the variance of X.

Solution:
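Here E(X) = ∫₀¹ 3x³ dx = 3/4 and E(X²) = ∫₀¹ 3x⁴ dx = 3/5, so Var(X) = 3/5 − 9/16 = 3/80. A quick numerical check:

```python
def integrate(f, a, b, n=100_000):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 3 * x ** 2                          # density on (0, 1)
mean = integrate(lambda x: x * f(x), 0, 1)        # E(X) = 3/4
ex2 = integrate(lambda x: x ** 2 * f(x), 0, 1)    # E(X^2) = 3/5
var = ex2 - mean ** 2                             # equivalent formula
print(mean, var)   # about 0.75 and 0.0375 (= 3/80)
```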

Median and Percentiles

The median of X is the point x such that

F(x) = P(X ≤ x) = 0.5.

[Figure: the area under the density curve to the left of the median x is 0.50.]

If p is any number between 0 and 1, the 100pth percentile is the point x such that

F(x) = P(X ≤ x) = p.

[Figure: the area under the density curve f(x) to the left of the 100pth percentile x is p.]

The median is the 50th percentile. The lower quartile (Q1) is the 25th percentile and the upper quartile (Q3) is the 75th percentile. The interquartile range is IQR = Q3 − Q1.

Example 10: Obtain the median and the interquartile range for the random variable X with density function f(x) = 3x² for 0 < x < 1 and f(x) = 0 otherwise. What is the 10th percentile?

Solution:
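Since F(x) = x³ on (0, 1), the 100pth percentile solves x³ = p, i.e. x = p^(1/3); the median and quartiles follow at once:

```python
# percentile of X with CDF F(x) = x^3 on (0, 1): invert x^3 = p
def percentile(p):
    return p ** (1 / 3)

median = percentile(0.5)            # about 0.794
q1, q3 = percentile(0.25), percentile(0.75)
iqr = q3 - q1                       # interquartile range
p10 = percentile(0.10)              # 10th percentile, about 0.464
print(median, iqr, p10)
```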

3.5 Mean and Variance of a Linear Combination

Linear Function of a Random Variable

[Figure: multiplying all values of X by a constant a turns the distribution of X into the distribution of aX.]

If X is a random variable, and a and b are constants, then

E(aX + b) = a E(X) + b,
Var(aX + b) = a² Var(X).

Conclusion: Suppose the random variable X has mean μ and standard deviation σ. Then the random variable

Y = (X − μ)/σ

has mean 0 and standard deviation 1.

Two random variables are said to be independent if the value taken by one is not related to the value taken by the other random variable.

Sums of Random Variables

[Figure: distributions of X1 with mean μ1 and X2 with mean μ2 combine into the distribution of X1 + X2 with mean μ1 + μ2: add all values of X1 to all values of X2.]

If X1, X2 are random variables with means E(X1) and E(X2), then

E(X1 + X2) = E(X1) + E(X2).

If X1, X2 are independent random variables, then

Var(X1 + X2) = Var(X1) + Var(X2).
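Both identities can be verified by enumerating the joint distribution of two small independent variables; the pmfs below are made up purely for illustration:

```python
from itertools import product

pmf1 = {0: 0.5, 1: 0.5}           # a fair coin (hypothetical example)
pmf2 = {1: 0.2, 2: 0.3, 3: 0.5}   # another toy pmf (hypothetical example)

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

# pmf of the sum X1 + X2, multiplying probabilities under independence
pmf_sum = {}
for (x1, p1), (x2, p2) in product(pmf1.items(), pmf2.items()):
    pmf_sum[x1 + x2] = pmf_sum.get(x1 + x2, 0) + p1 * p2

assert abs(mean(pmf_sum) - (mean(pmf1) + mean(pmf2))) < 1e-9
assert abs(var(pmf_sum) - (var(pmf1) + var(pmf2))) < 1e-9
print(mean(pmf_sum), var(pmf_sum))
```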

Linear Combination of Random Variables

If X1, X2, ..., Xn are random variables with means E(X1), E(X2), ..., E(Xn), then

E(a1X1 + a2X2 + ... + anXn + b) = a1E(X1) + a2E(X2) + ... + anE(Xn) + b,

for any constants a1, a2, ..., an and b.

If X1, X2, ..., Xn are independent random variables with variances Var(X1), Var(X2), ..., Var(Xn), then

Var(a1X1 + a2X2 + ... + anXn + b) = a1²Var(X1) + a2²Var(X2) + ... + an²Var(Xn),

for any constants a1, a2, ..., an and b.

Example 11: Suppose the weights of people who use an elevator follow a random variable with mean 69 kg and standard deviation 9 kg.

(a) Find the mean of the total weight of 30 randomly selected people in the elevator.

(b) Find the standard deviation of the total weight of 30 randomly selected people in the elevator.
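Writing the total as T = X1 + ... + X30, the rules above give E(T) = 30·69 and, assuming the 30 weights are independent, Var(T) = 30·9², so the standard deviation is √30·9. A one-line check:

```python
import math

n, mu, sigma = 30, 69, 9          # people, mean weight (kg), sd (kg)
mean_total = n * mu               # E(X1 + ... + Xn) = n * mu
var_total = n * sigma ** 2        # independence: the variances add
sd_total = math.sqrt(var_total)
print(mean_total, sd_total)       # 2070 kg and about 49.3 kg
```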
