
Chapter 5: Multivariate Probability Distributions



5.1 a. The sample space S gives the possible values for Y1 and Y2:

    Sample point:  AA     AB     AC     BA     BB     BC     CA     CB     CC
    (y1, y2):     (2,0)  (1,1)  (1,0)  (1,1)  (0,2)  (1,0)  (1,0)  (0,1)  (0,0)

Since each sample point is equally likely with probability 1/9, the joint distribution for Y1 and Y2 is given by

                    y1
              0     1     2
        0    1/9   2/9   1/9
    y2  1    2/9   2/9    0
        2    1/9    0     0

b. F(1, 0) = p(0, 0) + p(1, 0) = 1/9 + 2/9 = 3/9 = 1/3.
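The joint table in 5.1 can be checked by brute-force enumeration. A short Python sketch, assuming (as the table of sample points indicates) that Y1 counts the A's and Y2 counts the B's:

```python
from fractions import Fraction
from itertools import product

# Two selections from {A, B, C}; all 9 outcomes equally likely (prob 1/9).
# Assumed interpretation: Y1 counts the A's, Y2 counts the B's.
joint = {}
for a, b in product("ABC", repeat=2):
    s = a + b
    key = (s.count("A"), s.count("B"))
    joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 9)

# F(1, 0) = P(Y1 <= 1, Y2 <= 0) = p(0, 0) + p(1, 0)
F10 = sum(p for (y1, y2), p in joint.items() if y1 <= 1 and y2 <= 0)
print(F10)  # 1/3
```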


5.2 a. The sample space for the toss of three balanced coins, with associated probabilities, is below:

    Outcome:      HHH    HHT    HTH    HTT    THH    THT    TTH    TTT
    (y1, y2):    (3,1)  (2,1)  (2,1)  (1,1)  (2,2)  (1,2)  (1,3)  (0,-1)
    probability:  1/8    1/8    1/8    1/8    1/8    1/8    1/8    1/8

The joint distribution is

                     y1
               0     1     2     3
        -1    1/8    0     0     0
    y2   1     0    1/8   2/8   1/8
         2     0    1/8   1/8    0
         3     0    1/8    0     0

b. F(2, 1) = P(Y1 ≤ 2, Y2 ≤ 1) = p(0, -1) + p(1, 1) + p(2, 1) = 1/2.

5.3 Note that using material from Chapter 3, the joint probability function is given by

    p(y1, y2) = P(Y1 = y1, Y2 = y2) = C(4, y1) C(3, y2) C(2, 3 - y1 - y2) / C(9, 3),

where 0 ≤ y1, 0 ≤ y2, and y1 + y2 ≤ 3. In table format, this is

                      y1
               0      1      2      3
        0      0     4/84  12/84   4/84
    y2  1    3/84   24/84  18/84    0
        2    6/84   12/84    0      0
        3    1/84     0      0      0
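A quick check of this table, assuming the standard setup behind Ex. 5.3 (three of nine people are selected; the groups have sizes 4, 3, and 2, with Y1 and Y2 counting selections from the first two groups):

```python
from fractions import Fraction
from math import comb

def p(y1, y2):
    # Joint pmf p(y1, y2) = C(4,y1) C(3,y2) C(2, 3-y1-y2) / C(9,3)
    if y1 < 0 or y2 < 0 or y1 + y2 > 3:
        return Fraction(0)
    return Fraction(comb(4, y1) * comb(3, y2) * comb(2, 3 - y1 - y2),
                    comb(9, 3))

# The probabilities are nonnegative and sum to 1.
assert sum(p(a, b) for a in range(4) for b in range(4)) == 1
print(p(1, 1), p(2, 1))  # 2/7 3/14  (i.e., 24/84 and 18/84)
```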

Instructor's Solutions Manual

5.4 a. All of the probabilities are at least 0 and sum to 1.
b. F(1, 2) = P(Y1 ≤ 1, Y2 ≤ 2) = 1. Every child in the experiment either survived or didn't and used either 0, 1, or 2 seatbelts.

5.5 a. P(Y1 ≤ 1/2, Y2 ≤ 1/3) = ∫_0^{1/3} ∫_{y2}^{1/2} 3y1 dy1 dy2 = 23/216 = .1065.
b. P(Y2 ≤ Y1/2) = ∫_0^1 ∫_0^{y1/2} 3y1 dy2 dy1 = ∫_0^1 (3/2)y1² dy1 = .5.

5.6 a. P(Y1 - Y2 > .5) = P(Y1 > Y2 + .5) = ∫_0^{.5} ∫_{y2+.5}^1 dy1 dy2 = ∫_0^{.5} (.5 - y2) dy2 = .125.

b. P(Y1Y2 < .5) = 1 - P(Y1Y2 ≥ .5) = 1 - ∫_{.5}^1 ∫_{.5/y2}^1 dy1 dy2 = 1 - ∫_{.5}^1 (1 - .5/y2) dy2
= 1 - [.5 + .5 ln(.5)] = .8466.
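These probabilities can be sanity-checked numerically. A crude midpoint-rule sketch, assuming the densities used above (f = 3y1 on 0 ≤ y2 ≤ y1 ≤ 1 for Ex. 5.5, and the uniform density on the unit square for Ex. 5.6):

```python
def prob(density, event, n=600):
    # Midpoint rule on an n x n grid over the unit square.
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        y1 = (i + 0.5) * h
        for j in range(n):
            y2 = (j + 0.5) * h
            if event(y1, y2):
                total += density(y1, y2) * h * h
    return total

# 5.5(a): f(y1, y2) = 3*y1 on 0 <= y2 <= y1 <= 1; exact answer 23/216.
p_55a = prob(lambda y1, y2: 3 * y1 if y2 <= y1 else 0.0,
             lambda y1, y2: y1 <= 0.5 and y2 <= 1 / 3)

# 5.6(b): uniform density on the unit square; exact answer .8466.
p_56b = prob(lambda y1, y2: 1.0,
             lambda y1, y2: y1 * y2 < 0.5)
```

The grid is coarse, so agreement is only to a few decimal places, which is enough to catch a wrong set of limits.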

5.7 a. P(Y1 < 1, Y2 > 5) = ∫_5^∞ ∫_0^1 e^{-(y1+y2)} dy1 dy2 = ∫_5^∞ (1 - e^{-1}) e^{-y2} dy2 = e^{-5} - e^{-6} = .00426.

b. P(Y1 + Y2 < 3) = ∫_0^3 ∫_0^{3-y2} e^{-(y1+y2)} dy1 dy2 = 1 - 4e^{-3} = .8009.


5.8 a. Since the density must integrate to 1, evaluate ∫_0^1 ∫_0^1 k y1 y2 dy1 dy2 = k/4 = 1, so k = 4.
b. F(y1, y2) = P(Y1 ≤ y1, Y2 ≤ y2) = ∫_0^{y1} ∫_0^{y2} 4 t1 t2 dt2 dt1 = y1² y2², 0 ≤ y1 ≤ 1, 0 ≤ y2 ≤ 1.

c. P(Y1 ≤ 1/2, Y2 ≤ 3/4) = (1/2)² (3/4)² = 9/64.

5.9 a. Since the density must integrate to 1, evaluate ∫_0^1 ∫_0^{y2} k(1 - y2) dy1 dy2 = k/6 = 1, so k = 6.
b. Note that since Y1 ≤ Y2, the probability must be found in two parts (drawing a picture is useful):
P(Y1 ≤ 3/4, Y2 ≥ 1/2) = ∫_0^{1/2} ∫_{1/2}^1 6(1 - y2) dy2 dy1 + ∫_{1/2}^{3/4} ∫_{y1}^1 6(1 - y2) dy2 dy1 = 24/64 + 7/64 = 31/64.
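The two-part computation in 5.9(b) can be verified with exact rational arithmetic, using the antiderivative of the inner integrand 6(1 - y2):

```python
from fractions import Fraction

def inner(a, b):
    # ∫_a^b 6(1 - y2) dy2 = 3(1 - a)^2 - 3(1 - b)^2
    return 3 * (1 - a) ** 2 - 3 * (1 - b) ** 2

# Part 1: y1 in (0, 1/2), y2 in (1/2, 1); the inner integral is constant in y1.
part1 = Fraction(1, 2) * inner(Fraction(1, 2), 1)
# Part 2: y1 in (1/2, 3/4), y2 in (y1, 1); integrate 3(1 - y1)^2 over y1,
# i.e., evaluate -(1 - y1)^3 between 1/2 and 3/4.
part2 = (1 - Fraction(1, 2)) ** 3 - (1 - Fraction(3, 4)) ** 3
print(part1 + part2)  # 31/64
```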

5.10 a. Geometrically, since Y1 and Y2 are distributed uniformly over the triangular region, using the area formula for a triangle, k = 1.

b. This probability can also be calculated using geometric considerations. The area of the region specified by Y1 ≥ 3Y2 is 2/3, so this is the probability.

5.11 The area of the triangular region is 1, so with a uniform distribution this is the value of the density function. Again, using geometry (drawing a picture is again useful):

a. P(Y1 ≤ 3/4, Y2 ≤ 3/4) = 1 - P(Y1 > 3/4) - P(Y2 > 3/4) = 1 - (1/2)(1/4)(1/4) - (1/2)(1/2)(1/4) = 29/32.
b. P(Y1 - Y2 ≥ 0) = P(Y1 ≥ Y2). The region specified in this probability statement represents 1/4 of the total region of support, so P(Y1 ≥ Y2) = 1/4.

5.12 Similar to Ex. 5.11:
a. P(Y1 ≤ 3/4, Y2 ≤ 3/4) = 1 - P(Y1 > 3/4) - P(Y2 > 3/4) = 1 - 2(1/2)(1/4)(1/4) - 2(1/2)(1/4)(1/4) = 7/8.
b. P(Y1 ≤ 1/2, Y2 ≤ 1/2) = ∫_0^{1/2} ∫_0^{1/2} 2 dy1 dy2 = 1/2.
5.13 a. F(1/2, 1/2) = ∫_0^{1/2} ∫_{y1-1}^{1/2} 30 y1 y2² dy2 dy1 = 9/16.

b. Note that:
F(1/2, 2) = F(1/2, 1) = P(Y1 ≤ 1/2, Y2 ≤ 1) = P(Y1 ≤ 1/2, Y2 ≤ 1/2) + P(Y1 ≤ 1/2, Y2 > 1/2).
So, the first probability statement is simply F(1/2, 1/2) from part a. The second probability statement is found by
P(Y1 ≤ 1/2, Y2 > 1/2) = ∫_0^{1/2} ∫_{1/2}^{1-y1} 30 y1 y2² dy2 dy1 = 4/16.
Thus, F(1/2, 2) = 9/16 + 4/16 = 13/16.

c. P(Y1 > Y2) = 1 - P(Y1 ≤ Y2) = 1 - ∫_0^{1/2} ∫_{y1}^{1-y1} 30 y1 y2² dy2 dy1 = 1 - 11/32 = 21/32 = .65625.
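A numerical check of part c (and of the fact that the density integrates to 1), assuming f = 30 y1 y2² on y1 - 1 ≤ y2 ≤ 1 - y1, 0 ≤ y1 ≤ 1:

```python
def integrate(event, n=600):
    # Midpoint rule for f(y1, y2) = 30*y1*y2**2 on y1-1 <= y2 <= 1-y1,
    # 0 <= y1 <= 1 (the density assumed in Ex. 5.13 and 5.31).
    h1, h2 = 1.0 / n, 2.0 / n
    s = 0.0
    for i in range(n):
        y1 = (i + 0.5) * h1
        for j in range(n):
            y2 = -1.0 + (j + 0.5) * h2
            if y1 - 1 <= y2 <= 1 - y1 and event(y1, y2):
                s += 30 * y1 * y2 ** 2 * h1 * h2
    return s

total = integrate(lambda y1, y2: True)    # should be close to 1
p_gt = integrate(lambda y1, y2: y1 > y2)  # should be close to 21/32 = .65625
```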


5.14 a. Since f(y1, y2) ≥ 0, simply show ∫_0^1 ∫_{y1}^{2-y1} 6 y1² y2 dy2 dy1 = 1.
b. P(Y1 + Y2 < 1) = P(Y2 < 1 - Y1) = ∫_0^{1/2} ∫_{y1}^{1-y1} 6 y1² y2 dy2 dy1 = ∫_0^{1/2} 3y1²(1 - 2y1) dy1 = 1/32.

5.15 a. P(Y1 < 2, Y2 > 1) = ∫_1^2 ∫_{y2}^2 e^{-y1} dy1 dy2 = ∫_1^2 (e^{-y2} - e^{-2}) dy2 = e^{-1} - 2e^{-2}.
b. P(Y1 ≥ 2Y2) = ∫_0^∞ ∫_{2y2}^∞ e^{-y1} dy1 dy2 = ∫_0^∞ e^{-2y2} dy2 = 1/2.
c. P(Y1 - Y2 ≥ 1) = P(Y1 ≥ 1 + Y2) = ∫_0^∞ ∫_{1+y2}^∞ e^{-y1} dy1 dy2 = ∫_0^∞ e^{-(1+y2)} dy2 = e^{-1}.


5.16 a. P(Y1 < 1/2, Y2 > 1/4) = ∫_{1/4}^1 ∫_0^{1/2} (y1 + y2) dy1 dy2 = 21/64 = .328125.
b. P(Y1 + Y2 ≤ 1) = ∫_0^1 ∫_0^{1-y2} (y1 + y2) dy1 dy2 = 1/3.

5.17 This can be found using integration (polar coordinates are helpful). But, note that this is
a bivariate uniform distribution over a circle of radius 1, and the probability of interest
represents 50% of the support. Thus, the probability is .50.

5.18 P(Y1 > 1, Y2 > 1) = ∫_1^∞ ∫_1^∞ (1/8) y1 e^{-(y1+y2)/2} dy2 dy1
= [(1/4) ∫_1^∞ y1 e^{-y1/2} dy1][(1/2) ∫_1^∞ e^{-y2/2} dy2] = ((3/2) e^{-1/2})(e^{-1/2}) = (3/2) e^{-1}.

5.19 a. The marginal probability function is given in the table below.

    y1        0    1    2
    p1(y1)   4/9  4/9  1/9

b. No; evaluating binomial probabilities with n = 2 and p = 1/3 yields the same result.


5.20 a. The marginal probability function is given in the table below.

    y2       -1    1    2    3
    p2(y2)   1/8  4/8  2/8  1/8

b. P(Y1 = 3 | Y2 = 1) = P(Y1 = 3, Y2 = 1)/P(Y2 = 1) = (1/8)/(4/8) = 1/4.

5.21 a. The marginal distribution of Y1 is hypergeometric with N = 9, n = 3, and r = 4.

b. Similar to part a, the marginal distribution of Y2 is hypergeometric with N = 9, n = 3, and r = 3. Thus,
P(Y1 = 1 | Y2 = 2) = P(Y1 = 1, Y2 = 2)/P(Y2 = 2) = [C(4,1)C(3,2)C(2,0)/C(9,3)] / [C(3,2)C(6,1)/C(9,3)] = 12/18 = 2/3.
c. Similar to part b,
P(Y3 = 1 | Y2 = 1) = P(Y1 = 1 | Y2 = 1) = [C(4,1)C(3,1)C(2,1)/C(9,3)] / [C(3,1)C(6,2)/C(9,3)] = 24/45 = 8/15.

5.22 a. The marginal distributions for Y1 and Y2 are given in the margins of the table.
b. P(Y2 = 0 | Y1 = 0) = .38/.76 = .5
   P(Y2 = 1 | Y1 = 0) = .14/.76 = .18
   P(Y2 = 2 | Y1 = 0) = .24/.76 = .32
c. The desired probability is P(Y1 = 0 | Y2 = 0) = .38/.55 = .69.

5.23 a. f2(y2) = ∫_{y2}^1 3y1 dy1 = (3/2)(1 - y2²), 0 ≤ y2 ≤ 1.
b. f(y1 | y2) is defined over y2 ≤ y1 ≤ 1, with the constant y2 ≥ 0.

c. First, we have f1(y1) = ∫_0^{y1} 3y1 dy2 = 3y1², 0 ≤ y1 ≤ 1. Thus,
f(y2 | y1) = f(y1, y2)/f1(y1) = 1/y1, 0 ≤ y2 ≤ y1. So, conditioned on Y1 = y1, we see Y2 has a uniform distribution on the interval (0, y1). Therefore, the probability is simple:
P(Y2 > 1/2 | Y1 = 3/4) = (3/4 - 1/2)/(3/4) = 1/3.

5.24 a. f1(y1) = 1, 0 ≤ y1 ≤ 1; f2(y2) = 1, 0 ≤ y2 ≤ 1.
b. Since both Y1 and Y2 are uniformly distributed over the interval (0, 1), the probabilities are the same: .2.
c. 0 ≤ y2 ≤ 1.
d. f(y1 | y2) = f1(y1) = 1, 0 ≤ y1 ≤ 1.
e. P(.3 < Y1 < .5 | Y2 = .3) = .2
f. P(.3 < Y1 < .5 | Y2 = .5) = .2
g. The answers are the same.

5.25 a. f1(y1) = e^{-y1}, y1 > 0; f2(y2) = e^{-y2}, y2 > 0. These are both exponential density functions with β = 1.
b. P(1 < Y1 < 2.5) = P(1 < Y2 < 2.5) = e^{-1} - e^{-2.5} = .2858.
c. y2 > 0.
d. f(y1 | y2) = f1(y1) = e^{-y1}, y1 > 0.
e. f(y2 | y1) = f2(y2) = e^{-y2}, y2 > 0.
f. The answers are the same.
g. The probabilities are the same.

5.26 a. f1(y1) = ∫_0^1 4 y1 y2 dy2 = 2y1, 0 ≤ y1 ≤ 1; f2(y2) = 2y2, 0 ≤ y2 ≤ 1.
b. P(Y1 ≤ 1/2 | Y2 ≤ 3/4) = [∫_0^{3/4} ∫_0^{1/2} 4 y1 y2 dy1 dy2] / [∫_0^{3/4} 2y2 dy2] = (1/4)(9/16)/(9/16) = 1/4.
c. f(y1 | y2) = f1(y1) = 2y1, 0 ≤ y1 ≤ 1.
d. f(y2 | y1) = f2(y2) = 2y2, 0 ≤ y2 ≤ 1.
e. P(Y1 ≤ 3/4 | Y2 = 1/2) = ∫_0^{3/4} 2y1 dy1 = 9/16.



5.27 a. f1(y1) = ∫_{y1}^1 6(1 - y2) dy2 = 3(1 - y1)², 0 ≤ y1 ≤ 1;
f2(y2) = ∫_0^{y2} 6(1 - y2) dy1 = 6y2(1 - y2), 0 ≤ y2 ≤ 1.
b. P(Y2 ≤ 1/2 | Y1 ≤ 3/4) = [∫_0^{1/2} ∫_0^{y2} 6(1 - y2) dy1 dy2] / [∫_0^{3/4} 3(1 - y1)² dy1] = (1/2)/(63/64) = 32/63.

c. f(y1 | y2) = 1/y2, 0 ≤ y1 ≤ y2.
d. f(y2 | y1) = 2(1 - y2)/(1 - y1)², y1 ≤ y2 ≤ 1.
e. From part d, f(y2 | 1/2) = 8(1 - y2), 1/2 ≤ y2 ≤ 1. Thus, P(Y2 ≥ 3/4 | Y1 = 1/2) = 1/4.

5.28 Referring to Ex. 5.10:
a. First, find f2(y2) = ∫_{2y2}^2 dy1 = 2(1 - y2), 0 ≤ y2 ≤ 1. Then, P(Y2 ≥ .5) = .25.
b. First find f(y1 | y2) = 1/(2(1 - y2)), 2y2 ≤ y1 ≤ 2. Thus, f(y1 | .5) = 1, 1 ≤ y1 ≤ 2; the conditional distribution is uniform on (1, 2). Therefore, P(Y1 ≥ 1.5 | Y2 = .5) = .5.


5.29 Referring to Ex. 5.11:
a. f2(y2) = ∫_{y2-1}^{1-y2} dy1 = 2(1 - y2), 0 ≤ y2 ≤ 1. In order to find f1(y1), notice that the limits of integration are different for 0 ≤ y1 ≤ 1 and -1 ≤ y1 ≤ 0. For the first case:
f1(y1) = ∫_0^{1-y1} dy2 = 1 - y1, for 0 ≤ y1 ≤ 1. For the second case,
f1(y1) = ∫_0^{1+y1} dy2 = 1 + y1, for -1 ≤ y1 ≤ 0. This can be written as f1(y1) = 1 - |y1|, for -1 ≤ y1 ≤ 1.

b. The conditional distribution is f(y2 | y1) = 1/(1 - |y1|), for 0 ≤ y2 ≤ 1 - |y1|. Thus,
f(y2 | 1/4) = 4/3, 0 ≤ y2 ≤ 3/4. Then, P(Y2 > 1/2 | Y1 = 1/4) = ∫_{1/2}^{3/4} (4/3) dy2 = 1/3.

5.30 a. P(Y1 ≥ 1/2, Y2 ≤ 1/4) = ∫_0^{1/4} ∫_{1/2}^{1-y2} 2 dy1 dy2 = 3/16. And,
P(Y2 ≤ 1/4) = ∫_0^{1/4} 2(1 - y2) dy2 = 7/16.
Thus, P(Y1 ≥ 1/2 | Y2 ≤ 1/4) = 3/7.
b. Note that f(y1 | y2) = 1/(1 - y2), 0 ≤ y1 ≤ 1 - y2. Thus, f(y1 | 1/4) = 4/3, 0 ≤ y1 ≤ 3/4.
Thus, P(Y1 ≥ 1/2 | Y2 = 1/4) = ∫_{1/2}^{3/4} (4/3) dy1 = 1/3.

5.31 a. f1(y1) = ∫_{y1-1}^{1-y1} 30 y1 y2² dy2 = 20 y1 (1 - y1)³, 0 ≤ y1 ≤ 1.
b. This marginal density must be constructed in two parts:
f2(y2) = ∫_0^{1+y2} 30 y1 y2² dy1 = 15 y2² (1 + y2)², -1 ≤ y2 ≤ 0;
f2(y2) = ∫_0^{1-y2} 30 y1 y2² dy1 = 15 y2² (1 - y2)², 0 ≤ y2 ≤ 1.
c. f(y2 | y1) = (3/2) y2² (1 - y1)^{-3}, for y1 - 1 ≤ y2 ≤ 1 - y1.
d. f(y2 | .75) = (3/2) y2² (.25)^{-3}, for -.25 ≤ y2 ≤ .25, so P(Y2 > 0 | Y1 = .75) = .5.

5.32 a. f1(y1) = ∫_{y1}^{2-y1} 6 y1² y2 dy2 = 12 y1² (1 - y1), 0 ≤ y1 ≤ 1.
b. This marginal density must be constructed in two parts:
f2(y2) = ∫_0^{y2} 6 y1² y2 dy1 = 2 y2⁴, 0 ≤ y2 ≤ 1;
f2(y2) = ∫_0^{2-y2} 6 y1² y2 dy1 = 2 y2 (2 - y2)³, 1 ≤ y2 ≤ 2.
c. f(y2 | y1) = y2/(2(1 - y1)), y1 ≤ y2 ≤ 2 - y1.
d. Using the density found in part c, P(Y2 < 1.1 | Y1 = .6) = ∫_{.6}^{1.1} (y2/.8) dy2 = .53.
5.33 Refer to Ex. 5.15:
a. f1(y1) = ∫_0^{y1} e^{-y1} dy2 = y1 e^{-y1}, y1 ≥ 0. f2(y2) = ∫_{y2}^∞ e^{-y1} dy1 = e^{-y2}, y2 ≥ 0.
b. f(y1 | y2) = e^{-(y1-y2)}, y1 ≥ y2.
c. f(y2 | y1) = 1/y1, 0 ≤ y2 ≤ y1.
d. The density functions are different.
e. The marginal and conditional probabilities can be different.

5.34 a. Given Y1 = y1, Y2 has a uniform distribution on the interval (0, y1).
b. Since f1(y1) = 1, 0 ≤ y1 ≤ 1, f(y1, y2) = f(y2 | y1) f1(y1) = 1/y1, 0 ≤ y2 ≤ y1 ≤ 1.
c. f2(y2) = ∫_{y2}^1 (1/y1) dy1 = -ln(y2), 0 < y2 ≤ 1.

5.35 With Y1 = 2, the conditional distribution of Y2 is uniform on the interval (0, 2). Thus,
P(Y2 < 1 | Y1 = 2) = .5.


5.36 a. f1(y1) = ∫_0^1 (y1 + y2) dy2 = y1 + 1/2, 0 ≤ y1 ≤ 1. Similarly, f2(y2) = y2 + 1/2, 0 ≤ y2 ≤ 1.
b. First, P(Y2 ≥ 1/2) = ∫_{1/2}^1 (y2 + 1/2) dy2 = 5/8, and
P(Y1 ≥ 1/2, Y2 ≥ 1/2) = ∫_{1/2}^1 ∫_{1/2}^1 (y1 + y2) dy1 dy2 = 3/8.
Thus, P(Y1 ≥ 1/2 | Y2 ≥ 1/2) = (3/8)/(5/8) = 3/5.
c. f(y1 | y2 = .5) = (y1 + 1/2)/f2(1/2) = y1 + 1/2, so
P(Y1 > .75 | Y2 = .5) = ∫_{.75}^1 (y1 + 1/2) dy1 = .34375.

5.37 Calculate f2(y2) = ∫_0^∞ (1/8) y1 e^{-(y1+y2)/2} dy1 = (1/2) e^{-y2/2}, y2 > 0. Thus, Y2 has an exponential distribution with β = 2, and P(Y2 > 2) = 1 - F(2) = e^{-1}.


5.38 This is the identical setup as in Ex. 5.34.
a. f(y1, y2) = f(y2 | y1) f1(y1) = 1/y1, 0 ≤ y2 ≤ y1 ≤ 1.

b. Note that f(y2 | 1/2) = 2, 0 ≤ y2 ≤ 1/2. Thus, P(Y2 < 1/4 | Y1 = 1/2) = 1/2.

c. The probability of interest is P(Y1 > 1/2 | Y2 = 1/4). So, the necessary conditional density is f(y1 | y2) = f(y1, y2)/f2(y2) = -1/(y1 ln y2), 0 ≤ y2 ≤ y1 ≤ 1. Thus,
P(Y1 > 1/2 | Y2 = 1/4) = ∫_{1/2}^1 1/(y1 ln 4) dy1 = 1/2.

5.39 The result follows from:
P(Y1 = y1 | W = w) = P(Y1 = y1, W = w)/P(W = w) = P(Y1 = y1, Y1 + Y2 = w)/P(W = w) = P(Y1 = y1, Y2 = w - y1)/P(W = w).
Since Y1 and Y2 are independent, this is
P(Y1 = y1 | W = w) = P(Y1 = y1) P(Y2 = w - y1)/P(W = w)
= [e^{-λ1} λ1^{y1}/y1!] [e^{-λ2} λ2^{w-y1}/(w - y1)!] / [e^{-(λ1+λ2)} (λ1 + λ2)^w / w!]
= C(w, y1) [λ1/(λ1 + λ2)]^{y1} [λ2/(λ1 + λ2)]^{w-y1}.

This is the binomial distribution with n = w and p = λ1/(λ1 + λ2).
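The identity in 5.39 is easy to confirm numerically for particular rates; the values λ1 = 2, λ2 = 3, w = 6 below are arbitrary illustrations:

```python
from math import comb, exp, factorial

def pois(y, lam):
    # Poisson pmf
    return exp(-lam) * lam ** y / factorial(y)

lam1, lam2, w = 2.0, 3.0, 6  # hypothetical rates and total count
p = lam1 / (lam1 + lam2)
for y1 in range(w + 1):
    conditional = pois(y1, lam1) * pois(w - y1, lam2) / pois(w, lam1 + lam2)
    binom = comb(w, y1) * p ** y1 * (1 - p) ** (w - y1)
    assert abs(conditional - binom) < 1e-12
```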

.





5.40 As in Ex. 5.39 above, the result follows from:
P(Y1 = y1 | W = w) = P(Y1 = y1, Y2 = w - y1)/P(W = w).

Since Y1 and Y2 are independent, this is (all terms involving p1 and p2 drop out)
P(Y1 = y1 | W = w) = P(Y1 = y1) P(Y2 = w - y1)/P(W = w) = C(n1, y1) C(n2, w - y1) / C(n1 + n2, w),
for 0 ≤ y1 ≤ min(n1, w); this is the hypergeometric distribution.
5.41 Let Y = # of defectives in a random selection of three items. Conditioned on p, we have
P(Y = y | p) = C(3, y) p^y (1 - p)^{3-y}, y = 0, 1, 2, 3.
We are given that the proportion of defectives follows a uniform distribution on (0, 1), so the unconditional probability that Y = 2 can be found by
P(Y = 2) = ∫_0^1 P(Y = 2, p) dp = ∫_0^1 P(Y = 2 | p) f(p) dp = ∫_0^1 3p²(1 - p) dp = 3(1/3 - 1/4) = 1/4.
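In fact, with the uniform prior the unconditional distribution of Y is uniform over {0, 1, 2, 3}, since C(n, y) · y!(n - y)!/(n + 1)! = 1/(n + 1) for every y. A sketch:

```python
from fractions import Fraction
from math import comb, factorial

def marginal(y, n=3):
    # ∫_0^1 C(n,y) p^y (1-p)^(n-y) dp = C(n,y) * y!(n-y)!/(n+1)!
    return comb(n, y) * Fraction(factorial(y) * factorial(n - y),
                                 factorial(n + 1))

print([str(marginal(y)) for y in range(4)])  # ['1/4', '1/4', '1/4', '1/4']
```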

5.42 (Similar to Ex. 5.41) Let Y = # of defects per yard. Then,
p(y) = ∫_0^∞ P(Y = y | λ) f(λ) dλ = ∫_0^∞ (λ^y e^{-λ}/y!) e^{-λ} dλ = (1/2)^{y+1}, y = 0, 1, 2, ... .
Note that this is essentially a geometric distribution (see Ex. 3.88).

5.43 Assume f(y1 | y2) = f1(y1). Then, f(y1, y2) = f(y1 | y2) f2(y2) = f1(y1) f2(y2), so that Y1 and Y2 are independent. Now assume that Y1 and Y2 are independent. Then, there exist functions g and h such that f(y1, y2) = g(y1) h(y2), so that
1 = ∫ ∫ f(y1, y2) dy1 dy2 = [∫ g(y1) dy1][∫ h(y2) dy2].
Then, the marginals for Y1 and Y2 can be defined by
f1(y1) = g(y1) ∫ h(y2) dy2 / {[∫ g(y1) dy1][∫ h(y2) dy2]} = g(y1)/∫ g(y1) dy1,   f2(y2) = h(y2)/∫ h(y2) dy2.
Thus, f(y1, y2) = f1(y1) f2(y2). Now it is clear that
f(y1 | y2) = f(y1, y2)/f2(y2) = f1(y1) f2(y2)/f2(y2) = f1(y1),
provided that f2(y2) > 0, as was to be shown.

5.44 The argument follows exactly as Ex. 5.43 with integrals replaced by sums and densities
replaced by probability mass functions.

5.45 No. Counterexample: P(Y1 = 2, Y2 = 2) = 0 ≠ P(Y1 = 2)P(Y2 = 2) = (1/9)(1/9).

5.46 No. Counterexample: P(Y1 = 3, Y2 = 1) = 1/8 ≠ P(Y1 = 3)P(Y2 = 1) = (1/8)(4/8).


5.47 Dependent. For example: P(Y1 = 1, Y2 = 2) ≠ P(Y1 = 1)P(Y2 = 2).

5.48 Dependent. For example: P(Y1 = 0, Y2 = 0) ≠ P(Y1 = 0)P(Y2 = 0).

5.49 Note that f1(y1) = ∫_0^{y1} 3y1 dy2 = 3y1², 0 ≤ y1 ≤ 1, and f2(y2) = ∫_{y2}^1 3y1 dy1 = (3/2)[1 - y2²], 0 ≤ y2 ≤ 1.
Thus, f(y1, y2) ≠ f1(y1) f2(y2), so that Y1 and Y2 are dependent.

5.50 a. Note that f1(y1) = ∫_0^1 1 dy2 = 1, 0 ≤ y1 ≤ 1, and f2(y2) = ∫_0^1 1 dy1 = 1, 0 ≤ y2 ≤ 1. Thus, f(y1, y2) = f1(y1) f2(y2), so that Y1 and Y2 are independent.

b. Yes, the conditional probabilities are the same as the marginal probabilities.

5.51 a. Note that f1(y1) = ∫_0^∞ e^{-(y1+y2)} dy2 = e^{-y1}, y1 > 0, and f2(y2) = ∫_0^∞ e^{-(y1+y2)} dy1 = e^{-y2}, y2 > 0. Thus, f(y1, y2) = f1(y1) f2(y2), so that Y1 and Y2 are independent.

b. Yes, the conditional probabilities are the same as the marginal probabilities.

5.52 Note that f(y1, y2) can be factored and the ranges of y1 and y2 do not depend on each other, so by Theorem 5.5, Y1 and Y2 are independent.

5.53 The ranges of y1 and y2 depend on each other, so Y1 and Y2 cannot be independent.

5.54 The ranges of y1 and y2 depend on each other, so Y1 and Y2 cannot be independent.

5.55 The ranges of y1 and y2 depend on each other, so Y1 and Y2 cannot be independent.

5.56 The ranges of y1 and y2 depend on each other, so Y1 and Y2 cannot be independent.

5.57 The ranges of y1 and y2 depend on each other, so Y1 and Y2 cannot be independent.

5.58 Following Ex. 5.32, it is seen that f(y1, y2) ≠ f1(y1) f2(y2), so that Y1 and Y2 are dependent.

5.59 The ranges of y1 and y2 depend on each other, so Y1 and Y2 cannot be independent.

5.60 From Ex. 5.36, f1(y1) = y1 + 1/2, 0 ≤ y1 ≤ 1, and f2(y2) = y2 + 1/2, 0 ≤ y2 ≤ 1. But, f(y1, y2) ≠ f1(y1) f2(y2), so Y1 and Y2 are dependent.

5.61 Note that f(y1, y2) can be factored and the ranges of y1 and y2 do not depend on each other, so by Theorem 5.5, Y1 and Y2 are independent.

5.62 Let X, Y denote the toss number on which persons A and B, respectively, flip their first head. Then, X and Y are independent geometric random variables, and the probability that they stop on the same toss number is:
P(X = 1, Y = 1) + P(X = 2, Y = 2) + ... = Σ_{i=1}^∞ P(X = i)P(Y = i) = Σ_{i=1}^∞ p²(1 - p)^{2(i-1)} = p²/[1 - (1 - p)²].
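The geometric series in 5.62 can be checked against a truncated sum for an arbitrary head probability (p = 0.3 below is only an illustration):

```python
p = 0.3  # any head probability in (0, 1); 0.3 is an arbitrary choice
closed_form = p ** 2 / (1 - (1 - p) ** 2)  # algebraically equal to p/(2 - p)
partial_sum = sum((p * (1 - p) ** (i - 1)) ** 2 for i in range(1, 500))
assert abs(partial_sum - closed_form) < 1e-12
```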

5.63 P(Y1 > Y2, Y1 < 2Y2) = ∫_0^∞ ∫_{y1/2}^{y1} e^{-(y1+y2)} dy2 dy1 = 1/6, and
P(Y1 < 2Y2) = ∫_0^∞ ∫_{y1/2}^∞ e^{-(y1+y2)} dy2 dy1 = 2/3. So,
P(Y1 > Y2 | Y1 < 2Y2) = (1/6)/(2/3) = 1/4.

5.64 P(Y1 > Y2, Y1 < 2Y2) = ∫_0^1 ∫_{y1/2}^{y1} 1 dy2 dy1 = 1/4, and
P(Y1 < 2Y2) = 1 - P(Y1 ≥ 2Y2) = 1 - ∫_0^1 ∫_0^{y1/2} 1 dy2 dy1 = 1 - 1/4 = 3/4.
So, P(Y1 > Y2 | Y1 < 2Y2) = (1/4)/(3/4) = 1/3.

5.65 a. The marginal density for Y1 is
f1(y1) = ∫_0^∞ [1 - α(1 - 2e^{-y1})(1 - 2e^{-y2})] e^{-(y1+y2)} dy2
= ∫_0^∞ e^{-(y1+y2)} dy2 - α(1 - 2e^{-y1}) e^{-y1} ∫_0^∞ (1 - 2e^{-y2}) e^{-y2} dy2
= e^{-y1} - α(1 - 2e^{-y1}) e^{-y1} (1 - 1) = e^{-y1},
which is the exponential density with a mean of 1.
b. By symmetry, the marginal density for Y2 is also exponential with β = 1.

c. When α = 0, f(y1, y2) = e^{-(y1+y2)} = f1(y1) f2(y2), and so Y1 and Y2 are independent.
Now, suppose Y1 and Y2 are independent. Then, E(Y1Y2) = E(Y1)E(Y2) = 1. So,
E(Y1Y2) = ∫_0^∞ ∫_0^∞ y1 y2 [1 - α(1 - 2e^{-y1})(1 - 2e^{-y2})] e^{-(y1+y2)} dy1 dy2
= ∫_0^∞ ∫_0^∞ y1 y2 e^{-(y1+y2)} dy1 dy2 - α [∫_0^∞ y1(1 - 2e^{-y1}) e^{-y1} dy1][∫_0^∞ y2(1 - 2e^{-y2}) e^{-y2} dy2]
= 1 - α(1 - 1/2)(1 - 1/2) = 1 - α/4. This equals 1 only if α = 0.
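The marginal computation in 5.65(a) can be confirmed numerically for a particular (hypothetical) α:

```python
import math

def f(y1, y2, alpha=0.5):
    # Joint density assumed in 5.65; alpha = 0.5 is an arbitrary choice.
    return (1 - alpha * (1 - 2 * math.exp(-y1)) * (1 - 2 * math.exp(-y2))) \
        * math.exp(-(y1 + y2))

# Integrate y2 out numerically (midpoint rule, tail truncated at 40).
y1 = 1.3
n, upper = 100000, 40.0
h = upper / n
marginal = sum(f(y1, (j + 0.5) * h) * h for j in range(n))
assert abs(marginal - math.exp(-y1)) < 1e-4
```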

5.66 a. Since F2(∞) = 1, F(y1, ∞) = F1(y1) · 1 · {1 - α[1 - F1(y1)][1 - 1]} = F1(y1).
b. Similarly, F(∞, y2) = F2(y2) from F(y1, y2).
c. If α = 0, F(y1, y2) = F1(y1) F2(y2), so by Definition 5.8 they are independent.
d. If α ≠ 0, F(y1, y2) ≠ F1(y1) F2(y2), so by Definition 5.8 they are not independent.

5.67 P(a < Y1 ≤ b, c < Y2 ≤ d) = F(b, d) - F(b, c) - F(a, d) + F(a, c)
= F1(b)F2(d) - F1(b)F2(c) - F1(a)F2(d) + F1(a)F2(c)
= F1(b)[F2(d) - F2(c)] - F1(a)[F2(d) - F2(c)]
= [F1(b) - F1(a)][F2(d) - F2(c)]
= P(a < Y1 ≤ b) P(c < Y2 ≤ d).

5.68 Given that p1(y1) = C(2, y1)(.2)^{y1}(.8)^{2-y1}, y1 = 0, 1, 2, and p2(y2) = (.3)^{y2}(.7)^{1-y2}, y2 = 0, 1:
a. p(y1, y2) = p1(y1) p2(y2) = C(2, y1)(.2)^{y1}(.8)^{2-y1}(.3)^{y2}(.7)^{1-y2}, y1 = 0, 1, 2 and y2 = 0, 1.
b. The probability of interest is P(Y1 + Y2 ≤ 1) = p(0, 0) + p(1, 0) + p(0, 1) = .864.

5.69 a. f(y1, y2) = f1(y1) f2(y2) = (1/9) e^{-(y1+y2)/3}, y1 > 0, y2 > 0.
b. P(Y1 + Y2 ≤ 1) = ∫_0^1 ∫_0^{1-y2} (1/9) e^{-(y1+y2)/3} dy1 dy2 = 1 - (4/3) e^{-1/3} = .0446.

5.70 With f(y1, y2) = f1(y1) f2(y2) = 1, 0 ≤ y1 ≤ 1, 0 ≤ y2 ≤ 1,
P(Y2 ≤ Y1 ≤ Y2 + 1/4) = ∫_0^{1/4} ∫_0^{y1} dy2 dy1 + ∫_{1/4}^1 ∫_{y1-1/4}^{y1} dy2 dy1 = 1/32 + 3/16 = 7/32.

5.71 Assume uniform distributions for the call times over the 1-hour period. Then,
a. P(Y1 ≤ 1/2, Y2 ≤ 1/2) = P(Y1 ≤ 1/2)P(Y2 ≤ 1/2) = (1/2)(1/2) = 1/4.
b. Note that 5 minutes = 1/12 hour. To find P(|Y1 - Y2| ≤ 1/12), we must break the region into three parts in the integration:
P(|Y1 - Y2| ≤ 1/12) = ∫_0^{1/12} ∫_0^{y1+1/12} dy2 dy1 + ∫_{1/12}^{11/12} ∫_{y1-1/12}^{y1+1/12} dy2 dy1 + ∫_{11/12}^1 ∫_{y1-1/12}^1 dy2 dy1 = 23/144.
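Equivalently, the band |y1 - y2| ≤ 1/12 is the unit square minus two right triangles with legs 11/12, which gives the same 23/144:

```python
from fractions import Fraction

# The complement of {|y1 - y2| <= 1/12} in the unit square consists of
# two right triangles, each with legs of length 11/12.
band = 1 - 2 * Fraction(1, 2) * Fraction(11, 12) ** 2
print(band)  # 23/144
```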

5.72 a. E(Y1) = 2(1/3) = 2/3.
b. V(Y1) = 2(1/3)(2/3) = 4/9.
c. E(Y1 - Y2) = E(Y1) - E(Y2) = 0.

5.73 Use the mean of the hypergeometric: E(Y1) = 3(4)/9 = 4/3.

5.74 The marginal distributions for Y1 and Y2 are uniform on the interval (0, 1), and it was found in Ex. 5.50 that Y1 and Y2 are independent. So:
a. E(Y1 - Y2) = E(Y1) - E(Y2) = 0.
b. E(Y1Y2) = E(Y1)E(Y2) = (1/2)(1/2) = 1/4.
c. E(Y1² + Y2²) = E(Y1²) + E(Y2²) = (1/12 + 1/4) + (1/12 + 1/4) = 2/3.
d. V(Y1Y2) = E(Y1²)E(Y2²) - [E(Y1)E(Y2)]² = (1/3)(1/3) - (1/4)² = 7/144.



5.75 The marginal distributions for Y1 and Y2 are exponential with β = 1, and it was found in Ex. 5.51 that Y1 and Y2 are independent. So:
a. E(Y1 + Y2) = E(Y1) + E(Y2) = 2, V(Y1 + Y2) = V(Y1) + V(Y2) = 2.
b. P(Y1 - Y2 > 3) = ∫_0^∞ ∫_{y2+3}^∞ e^{-(y1+y2)} dy1 dy2 = (1/2) e^{-3} = .0249.
c. P(Y1 - Y2 < -3) = P(Y2 - Y1 > 3) = ∫_0^∞ ∫_{y1+3}^∞ e^{-(y1+y2)} dy2 dy1 = (1/2) e^{-3} = .0249.
d. E(Y1 - Y2) = E(Y1) - E(Y2) = 0, V(Y1 - Y2) = V(Y1) + V(Y2) = 2.

e. The answers to parts b and c are equal.


5.76 From Ex. 5.52, we found that Y1 and Y2 are independent. So,
a. E(Y1) = ∫_0^1 2y1² dy1 = 2/3.
b. E(Y1²) = ∫_0^1 2y1³ dy1 = 2/4, so V(Y1) = 2/4 - 4/9 = 1/18.
c. E(Y1 - Y2) = E(Y1) - E(Y2) = 0.


5.77 Following Ex. 5.27, the marginal densities can be used:
a. E(Y1) = ∫_0^1 3y1(1 - y1)² dy1 = 1/4, E(Y2) = ∫_0^1 6y2²(1 - y2) dy2 = 1/2.
b. E(Y1²) = ∫_0^1 3y1²(1 - y1)² dy1 = 1/10, V(Y1) = 1/10 - (1/4)² = 3/80;
E(Y2²) = ∫_0^1 6y2³(1 - y2) dy2 = 3/10, V(Y2) = 3/10 - (1/2)² = 1/20.
c. E(Y1 - 3Y2) = E(Y1) - 3E(Y2) = 1/4 - 3/2 = -5/4.


5.78 a. The marginal distribution for Y1 is f1(y1) = y1/2, 0 ≤ y1 ≤ 2. E(Y1) = 4/3, V(Y1) = 2/9.
b. Similarly, f2(y2) = 2(1 - y2), 0 ≤ y2 ≤ 1. So, E(Y2) = 1/3, V(Y2) = 1/18.
c. E(Y1 - Y2) = E(Y1) - E(Y2) = 4/3 - 1/3 = 1.
d. V(Y1 - Y2) = E[(Y1 - Y2)²] - [E(Y1 - Y2)]² = E(Y1²) - 2E(Y1Y2) + E(Y2²) - 1.
Since E(Y1Y2) = ∫_0^1 ∫_{2y2}^2 y1 y2 dy1 dy2 = 1/2, we have that
V(Y1 - Y2) = [2/9 + (4/3)²] - 2(1/2) + [1/18 + (1/3)²] - 1 = 1/6.

Using Tchebysheff's theorem, two standard deviations about the mean gives the interval (.19, 1.81).



5.79 Referring to Ex. 5.16, integrating the joint density over the two regions of integration:
E(Y1Y2) = ∫_{-1}^0 ∫_0^{1+y1} y1 y2 dy2 dy1 + ∫_0^1 ∫_0^{1-y1} y1 y2 dy2 dy1 = 0.

5.80 From Ex. 5.36, f1(y1) = y1 + 1/2, 0 ≤ y1 ≤ 1, and f2(y2) = y2 + 1/2, 0 ≤ y2 ≤ 1. Thus, E(Y1) = 7/12 and E(Y2) = 7/12. So, E(30Y1 + 25Y2) = 30(7/12) + 25(7/12) = 32.08.


5.81 Since Y1 and Y2 are independent, E(Y2/Y1) = E(Y2)E(1/Y1). Thus, using the marginal densities found in Ex. 5.61,
E(Y2/Y1) = E(Y2)E(1/Y1) = [∫_0^∞ (y2/2) e^{-y2/2} dy2][∫_0^∞ (1/4) e^{-y1/2} dy1] = 2(1/2) = 1.

5.82 The marginal densities were found in Ex. 5.34. So,
E(Y1 - Y2) = E(Y1) - E(Y2) = 1/2 - ∫_0^1 y2(-ln y2) dy2 = 1/2 - 1/4 = 1/4.

5.83 From Ex. 3.88 and 5.42, E(Y) = 2 - 1 = 1.


5.84 All answers use results proven for the geometric distribution and independence:
a. E(Y1) = E(Y2) = 1/p, so E(Y1 - Y2) = E(Y1) - E(Y2) = 0.
b. E(Y1²) = E(Y2²) = (1 - p)/p² + (1/p)² = (2 - p)/p². E(Y1Y2) = E(Y1)E(Y2) = 1/p².
c. E[(Y1 - Y2)²] = E(Y1²) - 2E(Y1Y2) + E(Y2²) = 2(1 - p)/p².
V(Y1 - Y2) = V(Y1) + V(Y2) = 2(1 - p)/p².
d. Use Tchebysheff's theorem with k = 3.


5.85 a. E(Y1) = E(Y2) = 1 (both marginal distributions are exponential with mean 1).
b. V(Y1) = V(Y2) = 1.
c. E(Y1 - Y2) = E(Y1) - E(Y2) = 0.
d. E(Y1Y2) = 1 - α/4, so Cov(Y1, Y2) = -α/4.
e. V(Y1 - Y2) = V(Y1) + V(Y2) - 2Cov(Y1, Y2) = 2 + α/2. Using Tchebysheff's theorem with k = 2, the interval is (-2√(2 + α/2), 2√(2 + α/2)).


5.86 Using the hint and Theorem 5.9:
a. E(W) = E(Z)E(Y1^{-1/2}) = 0 · E(Y1^{-1/2}) = 0. Also, V(W) = E(W²) - [E(W)]² = E(W²).
Now, E(W²) = E(Z²)E(Y1^{-1}) = 1 · E(Y1^{-1}) = 1/(ν1 - 2), ν1 > 2 (using Ex. 4.82).
b. E(U) = E(Y1)E(Y2^{-1}) = ν1/(ν2 - 2), ν2 > 2;
V(U) = E(U²) - [E(U)]² = E(Y1²)E(Y2^{-2}) - [ν1/(ν2 - 2)]²
= ν1(ν1 + 2)/[(ν2 - 2)(ν2 - 4)] - [ν1/(ν2 - 2)]² = 2ν1(ν1 + ν2 - 2)/[(ν2 - 2)²(ν2 - 4)], ν2 > 4.


5.87 a. E(Y1 + Y2) = E(Y1) + E(Y2) = ν1 + ν2.
b. By independence, V(Y1 + Y2) = V(Y1) + V(Y2) = 2ν1 + 2ν2.


5.88 It is clear that E(Y) = E(Y1) + E(Y2) + ... + E(Y6). Using the result that Yi follows a geometric distribution with success probability (7 - i)/6, we have
E(Y) = Σ_{i=1}^6 6/(7 - i) = 1 + 6/5 + 6/4 + 6/3 + 6/2 + 6 = 14.7.
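A quick exact computation of this sum of geometric means:

```python
from fractions import Fraction

# E(Y) = sum of 6/(7 - i) for i = 1..6: the waiting times for the 1st,
# 2nd, ..., 6th distinct face of the die.
expected = sum(Fraction(6, 7 - i) for i in range(1, 7))
print(float(expected))  # 14.7
```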

5.89 Cov(Y1, Y2) = E(Y1Y2) - E(Y1)E(Y2) = Σ Σ y1 y2 p(y1, y2) - [2(1/3)]² = 2/9 - 4/9 = -2/9.
As the value of Y1 increases, the value of Y2 tends to decrease.


5.90 From Ex. 5.3 and 5.21, E(Y1) = 4/3 and E(Y2) = 1. Thus,
E(Y1Y2) = (1)(1)(24/84) + (2)(1)(18/84) + (1)(2)(12/84) = 1.
So, Cov(Y1, Y2) = E(Y1Y2) - E(Y1)E(Y2) = 1 - (4/3)(1) = -1/3.

5.91 From Ex. 5.76, E(Y1) = E(Y2) = 2/3. E(Y1Y2) = ∫_0^1 ∫_0^1 4 y1² y2² dy1 dy2 = 4/9. So,
Cov(Y1, Y2) = E(Y1Y2) - E(Y1)E(Y2) = 4/9 - 4/9 = 0, as expected, since Y1 and Y2 are independent.

5.92 From Ex. 5.77, E(Y1) = 1/4 and E(Y2) = 1/2. E(Y1Y2) = ∫_0^1 ∫_0^{y2} 6 y1 y2 (1 - y2) dy1 dy2 = 3/20.
So, Cov(Y1, Y2) = E(Y1Y2) - E(Y1)E(Y2) = 3/20 - 1/8 = 1/40, as expected, since Y1 and Y2 are dependent.


5.93 a. From Ex. 5.55 and 5.79, E(Y1Y2) = 0 and E(Y1) = 0. So,
Cov(Y1, Y2) = E(Y1Y2) - E(Y1)E(Y2) = 0 - 0 · E(Y2) = 0.
b. Y1 and Y2 are dependent.
c. Since Cov(Y1, Y2) = 0, ρ = 0.
d. If Cov(Y1, Y2) = 0, Y1 and Y2 are not necessarily independent.


5.94 a. Cov(U1, U2) = E[(Y1 + Y2)(Y1 - Y2)] - E(Y1 + Y2)E(Y1 - Y2)
= E(Y1²) - E(Y2²) - [E(Y1)]² + [E(Y2)]²
= (σ1² + μ1²) - (σ2² + μ2²) - μ1² + μ2² = σ1² - σ2².
b. Since V(U1) = V(U2) = σ1² + σ2² (Y1 and Y2 are uncorrelated),
ρ = (σ1² - σ2²)/(σ1² + σ2²).
c. If σ1² = σ2², U1 and U2 are uncorrelated.


5.95 Note that the marginal distributions for Y1 and Y2 are

    y1       -1    0    1          y2       0    1
    p1(y1)  1/3  1/3  1/3          p2(y2)  2/3  1/3

So, Y1 and Y2 are not independent since p(1, 0) ≠ p1(1)p2(0). However, E(Y1) = 0 and
E(Y1Y2) = (-1)(0)(1/3) + (0)(1)(1/3) + (1)(0)(1/3) = 0, so Cov(Y1, Y2) = 0.

5.96 a. Cov(Y1, Y2) = E[(Y1 - μ1)(Y2 - μ2)] = E[(Y2 - μ2)(Y1 - μ1)] = Cov(Y2, Y1).
b. Cov(Y1, Y1) = E[(Y1 - μ1)(Y1 - μ1)] = E[(Y1 - μ1)²] = V(Y1).

5.97 a. From Ex. 5.96, Cov(Y1, Y1) = V(Y1) = 2.
b. If Cov(Y1, Y2) = 7, then ρ = 7/4 > 1, which is impossible.
c. With ρ = 1, Cov(Y1, Y2) = 1(4) = 4 (a perfect positive linear association).
d. With ρ = -1, Cov(Y1, Y2) = -1(4) = -4 (a perfect negative linear association).

5.98 Since ρ² ≤ 1, we have that -1 ≤ ρ = Cov(Y1, Y2)/√(V(Y1)V(Y2)) ≤ 1.

5.99 Since E(c) = c, Cov(c, Y) = E[(c - c)(Y - μ)] = 0.

5.100 a. E(Y1) = E(Z) = 0, E(Y2) = E(Z²) = 1.
b. E(Y1Y2) = E(Z³) = 0 (odd moments of Z are 0).
c. Cov(Y1, Y2) = E(Z³) - E(Z)E(Z²) = 0.
d. P(Y2 > 1 | Y1 > 1) = P(Z² > 1 | Z > 1) = 1 ≠ P(Z² > 1). Thus, Y1 and Y2 are dependent.

5.101 a. Cov(Y1, Y2) = E(Y1Y2) - E(Y1)E(Y2) = 1 - α/4 - (1)(1) = -α/4.
b. This is clear from part a.
c. We showed previously that Y1 and Y2 are independent only if α = 0. If ρ = 0, it must be true that α = 0.

5.102 The quantity 3Y1 + 5Y2 = dollar amount spent per week. Thus:
E(3Y1 + 5Y2) = 3(40) + 5(65) = 445.
V(3Y1 + 5Y2) = 9V(Y1) + 25V(Y2) = 9(4) + 25(8) = 236.

5.103 E(3Y1 + 4Y2 - 6Y3) = 3E(Y1) + 4E(Y2) - 6E(Y3) = 3(2) + 4(-1) - 6(4) = -22,
V(3Y1 + 4Y2 - 6Y3) = 9V(Y1) + 16V(Y2) + 36V(Y3) + 24Cov(Y1, Y2) - 36Cov(Y1, Y3) - 48Cov(Y2, Y3)
= 9(4) + 16(6) + 36(8) + 24(1) - 36(-1) - 48(0) = 480.

5.104 a. Let X = Y1 + Y2. Then, the probability distribution for X is

    x      1      2      3
    p(x)  7/84  42/84  35/84

Thus, E(X) = 7/3 and V(X) = 7/18 = .3889.

b. E(Y1 + Y2) = E(Y1) + E(Y2) = 4/3 + 1 = 7/3. We have that V(Y1) = 10/18, V(Y2) = 42/84, and Cov(Y1, Y2) = -1/3, so
V(Y1 + Y2) = V(Y1) + V(Y2) + 2Cov(Y1, Y2) = 10/18 + 42/84 - 2/3 = 7/18 = .3889.
5.105 Since Y1 and Y2 are independent, V(Y1 + Y2) = V(Y1) + V(Y2) = 1/18 + 1/18 = 1/9.

5.106 V(Y1 - 3Y2) = V(Y1) + 9V(Y2) - 6Cov(Y1, Y2) = 3/80 + 9(1/20) - 6(1/40) = 27/80 = .3375.

5.107 Since E(Y1) = E(Y2) = 1/3, V(Y1) = V(Y2) = 1/18, and E(Y1Y2) = ∫_0^1 ∫_0^{1-y2} 2 y1 y2 dy1 dy2 = 1/12,
we have that Cov(Y1, Y2) = 1/12 - 1/9 = -1/36. Therefore,

E(Y1 + Y2) = 1/3 + 1/3 = 2/3 and V(Y1 + Y2) = 1/18 + 1/18 - 2(1/36) = 1/18.

5.108 From Ex. 5.33, Y1 has a gamma distribution with α = 2 and β = 1, and Y2 has an exponential distribution with β = 1. Thus, E(Y1 + Y2) = 2(1) + 1 = 3. Also, since
E(Y1Y2) = ∫_0^∞ ∫_0^{y1} y1 y2 e^{-y1} dy2 dy1 = 3, Cov(Y1, Y2) = 3 - 2(1) = 1, and
V(Y1 - Y2) = 2(1)² + 1² - 2(1) = 1.
Since a value of 4 minutes is three standard deviations above the mean of E(Y1 - Y2) = 1 minute, this is not likely.

5.109 We have E(Y1) = E(Y2) = 7/12. Intermediate calculations give V(Y1) = V(Y2) = 11/144.
Thus, E(Y1Y2) = ∫_0^1 ∫_0^1 y1 y2 (y1 + y2) dy1 dy2 = 1/3, so Cov(Y1, Y2) = 1/3 - (7/12)² = -1/144.
From Ex. 5.80, E(30Y1 + 25Y2) = 32.08, so

V(30Y1 + 25Y2) = 900V(Y1) + 625V(Y2) + 2(30)(25)Cov(Y1, Y2) = 106.08.

The standard deviation of 30Y1 + 25Y2 is √106.08 = 10.30. Using Tchebysheff's theorem with k = 2, the interval is (11.48, 52.68).


5.110 a. V(1 + 2Y1) = 4V(Y1), V(3 + 4Y2) = 16V(Y2), and Cov(1 + 2Y1, 3 + 4Y2) = 8Cov(Y1, Y2).
So, ρ = 8Cov(Y1, Y2)/√(4V(Y1) · 16V(Y2)) = .2.

b. V(1 + 2Y1) = 4V(Y1), V(3 - 4Y2) = 16V(Y2), and Cov(1 + 2Y1, 3 - 4Y2) = -8Cov(Y1, Y2).
So, ρ = -8Cov(Y1, Y2)/√(4V(Y1) · 16V(Y2)) = -.2.

c. V(1 - 2Y1) = 4V(Y1), V(3 - 4Y2) = 16V(Y2), and Cov(1 - 2Y1, 3 - 4Y2) = 8Cov(Y1, Y2).
So, ρ = 8Cov(Y1, Y2)/√(4V(Y1) · 16V(Y2)) = .2.




5.111 a. V(a + bY1) = b²V(Y1), V(c + dY2) = d²V(Y2), and Cov(a + bY1, c + dY2) = bd Cov(Y1, Y2).
So, ρ_{W1,W2} = bd Cov(Y1, Y2)/√(b²V(Y1) d²V(Y2)) = (bd/|bd|) ρ_{Y1,Y2}. Provided that the constants b and d are nonzero, bd/|bd| is either 1 or -1. Thus, |ρ_{W1,W2}| = |ρ_{Y1,Y2}|.

b. Yes, the answers agree.


5.112 In Ex. 5.61, it was shown that Y₁ and Y₂ are independent. In addition, Y₁ has a gamma
distribution with α = 2 and β = 2, and Y₂ has an exponential distribution with β = 2. So,
with C = 50 + 2Y₁ + 4Y₂, it is clear that
      E(C) = 50 + 2E(Y₁) + 4E(Y₂) = 50 + 2(4) + 4(2) = 66
      V(C) = 4V(Y₁) + 16V(Y₂) = 4(2)(4) + 16(4) = 96.

5.113 The net daily gain is given by the random variable G = X − Y. Thus, given the
distributions for X and Y in the problem,

      E(G) = E(X) − E(Y) = 50 − (4)(2) = 42
      V(G) = V(X) + V(Y) = 3² + 4(2²) = 25.

The value $70 is (70 − 42)/5 = 5.6 standard deviations above the mean, an unlikely value.

5.114 Observe that Y₁ has a gamma distribution with α = 4 and β = 1, and Y₂ has an exponential
distribution with β = 2. Thus, with U = Y₁ − Y₂,

a. E(U) = 4(1) − 2 = 2
b. V(U) = 4(1²) + 2² = 8
c. The value 0 has a z-score of (0 − 2)/√8 = −.707, so it is only .707 standard deviations
below the mean. This is not extreme, so it is fairly likely that the profit drops below 0.

5.115 Following Ex. 5.88:
a. Note that for nonnegative integers a and b and i ≠ j,

      P(Yᵢ = a, Yⱼ = b) = P(Yⱼ = b | Yᵢ = a)P(Yᵢ = a).

But P(Yⱼ = b | Yᵢ = a) = P(Yⱼ = b), since the trials (i.e., die tosses) are independent:
the experiments that generate Yᵢ and Yⱼ represent independent experiments via the
memoryless property. So, Yᵢ and Yⱼ are independent and thus Cov(Yᵢ, Yⱼ) = 0.

b. V(Y) = V(Y₁) + … + V(Y₆)
      = 0 + (5/6)/(1/6)² + (4/6)/(2/6)² + (3/6)/(3/6)² + (2/6)/(4/6)² + (1/6)/(5/6)² = 38.99.

c. From Ex. 5.88, E(Y) = 14.7. Using Tchebysheff's theorem with k = 2, the interval is
      14.7 ± 2√38.99, or (2.212, 27.188).
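The geometric-variance sum in part b can be verified exactly (a hypothetical check script, not part of the manual):

```python
from fractions import Fraction
from math import sqrt

# After the first toss, the wait for each new face is geometric with success
# probability p = k/6 for k = 5, 4, 3, 2, 1; each wait has variance q/p^2.
var_y = sum(Fraction(6 - k, 6) / Fraction(k, 6) ** 2 for k in (5, 4, 3, 2, 1))
sd = sqrt(var_y)
print(float(var_y), (14.7 - 2 * sd, 14.7 + 2 * sd))
```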


5.116 V(Y₁ + Y₂) = V(Y₁) + V(Y₂) + 2Cov(Y₁, Y₂), while V(Y₁ − Y₂) = V(Y₁) + V(Y₂) − 2Cov(Y₁, Y₂).
When Y₁ and Y₂ are independent, Cov(Y₁, Y₂) = 0, so the two quantities are the same.


5.117 Refer to Example 5.29 in the text. The situation here is analogous to drawing n balls
from an urn containing N balls, r₁ of which are red, r₂ of which are black, and N − r₁ − r₂
of which are neither red nor black. Using the argument given there, we can deduce that:

      E(Y₁) = np₁   V(Y₁) = np₁(1 − p₁)[(N − n)/(N − 1)]   where p₁ = r₁/N
      E(Y₂) = np₂   V(Y₂) = np₂(1 − p₂)[(N − n)/(N − 1)]   where p₂ = r₂/N

Now, define new random variables for i = 1, 2, …, n:

      Uᵢ = 1 if alligator i is a mature female, and 0 otherwise
      Vᵢ = 1 if alligator i is a mature male, and 0 otherwise

Then Y₁ = Σᵢ₌₁ⁿ Uᵢ and Y₂ = Σᵢ₌₁ⁿ Vᵢ. Now, we must find Cov(Y₁, Y₂). Note that:

      E(Y₁Y₂) = E[(Σᵢ Uᵢ)(Σⱼ Vⱼ)] = Σᵢ E(UᵢVᵢ) + ΣΣ_{i≠j} E(UᵢVⱼ).

Now, since for all i, E(UᵢVᵢ) = P(Uᵢ = 1, Vᵢ = 1) = 0 (an alligator can't be both female
and male), the first sum vanishes. For i ≠ j,

      E(UᵢVⱼ) = P(Uᵢ = 1, Vⱼ = 1) = P(Uᵢ = 1)P(Vⱼ = 1 | Uᵢ = 1) = (r₁/N)[r₂/(N − 1)] = [N/(N − 1)]p₁p₂.

Since there are n(n − 1) terms in ΣΣ_{i≠j} E(UᵢVⱼ), we have E(Y₁Y₂) = n(n − 1)[N/(N − 1)]p₁p₂.
Thus,

      Cov(Y₁, Y₂) = n(n − 1)[N/(N − 1)]p₁p₂ − (np₁)(np₂) = −[n(N − n)/(N − 1)]p₁p₂.

So, E[Y₁/n − Y₂/n] = (np₁ − np₂)/n = p₁ − p₂, and

      V[Y₁/n − Y₂/n] = (1/n²)[V(Y₁) + V(Y₂) − 2Cov(Y₁, Y₂)] = [(N − n)/(n(N − 1))][p₁ + p₂ − (p₁ − p₂)²].



5.118 Let Y = X₁ + X₂, the total sustained load on the footing.
a. Since X₁ and X₂ have gamma distributions and are independent, we have that
      E(Y) = 50(2) + 20(2) = 140
      V(Y) = 50(2²) + 20(2²) = 280.

b. Consider Tchebysheff's theorem with k = 4: the corresponding interval is

      140 ± 4√280, or (73.07, 206.93).

So, we can say that the sustained load will exceed 206.93 kips with probability less
than 1/16.
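As an illustrative check of part b (my own simulation, interpreting the gamma parameters as shape α and scale β), the simulated exceedance probability is far below the conservative Tchebysheff bound of 1/16:

```python
import random

# Y = X1 + X2 with X1 ~ gamma(shape=50, scale=2), X2 ~ gamma(shape=20, scale=2),
# so E(Y) = 140 and V(Y) = 280. Estimate P(Y > 206.93) by simulation.
random.seed(2)
n = 100_000
exceed = sum(
    random.gammavariate(50, 2) + random.gammavariate(20, 2) > 206.93
    for _ in range(n)
)
p_hat = exceed / n
print(p_hat)    # well below the Tchebysheff bound 1/16 = .0625
```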


5.119 a. Using the multinomial distribution with p₁ = p₂ = p₃ = 1/3,
      P(Y₁ = 3, Y₂ = 1, Y₃ = 2) = [6!/(3!1!2!)](1/3)⁶ = .0823.
b. E(Y₁) = n/3, V(Y₁) = n(1/3)(2/3) = 2n/9.
c. Cov(Y₂, Y₃) = −n(1/3)(1/3) = −n/9.
d. E(Y₂ − Y₃) = n/3 − n/3 = 0, V(Y₂ − Y₃) = V(Y₂) + V(Y₃) − 2Cov(Y₂, Y₃) = 2n/3.
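Part a can be evaluated exactly with a small helper (an illustrative sketch; the function name is my own):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """P(Y1 = n1, ..., Yk = nk) for a multinomial with n = sum(counts) trials."""
    n = sum(counts)
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)
    p = float(coef)
    for c, pr in zip(counts, probs):
        p *= pr ** c
    return p

p = multinomial_pmf((3, 1, 2), (1/3, 1/3, 1/3))
print(round(p, 4))    # 0.0823
```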

5.120 E(C) = E(Y₁) + 3E(Y₂) = np₁ + 3np₂.
V(C) = V(Y₁) + 9V(Y₂) + 6Cov(Y₁, Y₂) = np₁q₁ + 9np₂q₂ − 6np₁p₂.

5.121 If N is large, the multinomial distribution is appropriate:
a. P(Y₁ = 2, Y₂ = 1) = [5!/(2!1!2!)](.3)²(.1)(.6)² = .0972.

b. E(Y₁/n − Y₂/n) = p₁ − p₂ = .3 − .1 = .2
   V(Y₁/n − Y₂/n) = (1/n²)[V(Y₁) + V(Y₂) − 2Cov(Y₁, Y₂)] = p₁q₁/n + p₂q₂/n + 2p₁p₂/n = .072.

5.122 Let Y₁ = # of mice weighing between 80 and 100 grams, and let Y₂ = # weighing over 100
grams. Thus, with X having a normal distribution with μ = 100 g and σ = 20 g,
      p₁ = P(80 ≤ X ≤ 100) = P(−1 ≤ Z ≤ 0) = .3413
      p₂ = P(X > 100) = P(Z > 0) = .5
a. P(Y₁ = 2, Y₂ = 1) = [4!/(2!1!1!)](.3413)²(.5)(.1587) = .1109.

b. P(Y₂ = 4) = [4!/(0!4!0!)](.5)⁴ = .0625.
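The normal category probabilities and the multinomial probability in part a can be reproduced from the standard normal CDF via `math.erf` (a hypothetical check, not part of the manual):

```python
from math import erf, sqrt, factorial

def phi(z):
    # Standard normal CDF expressed through the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

p1 = phi(0.0) - phi(-1.0)    # P(80 <= X <= 100) for X ~ N(100, 20^2)
p2 = 1.0 - phi(0.0)          # P(X > 100)
p3 = 1.0 - p1 - p2           # P(X < 80)
coef = factorial(4) // (factorial(2) * factorial(1) * factorial(1))
prob_a = coef * p1**2 * p2 * p3
prob_b = 0.5**4              # part (b): all four mice weigh over 100 g
print(round(prob_a, 4), prob_b)
```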

5.123 Let Y₁ = # of family-home fires, Y₂ = # of apartment fires, and Y₃ = # of fires in other
types of dwellings. Thus, (Y₁, Y₂, Y₃) is multinomial with n = 4, p₁ = .73, p₂ = .2, and p₃ = .07. Thus,
      P(Y₁ = 2, Y₂ = 1, Y₃ = 1) = [4!/(2!1!1!)](.73)²(.2)(.07) = 12(.73)²(.2)(.07) = .08953.

5.124 Define C = total cost = 20,000Y₁ + 10,000Y₂ + 2000Y₃.
a. E(C) = 20,000E(Y₁) + 10,000E(Y₂) + 2000E(Y₃)
      = 20,000(2.92) + 10,000(.8) + 2000(.28) = 66,960.

b. V(C) = (20,000)²V(Y₁) + (10,000)²V(Y₂) + (2000)²V(Y₃) + covariance terms
      = (20,000)²(4)(.73)(.27) + (10,000)²(4)(.8)(.2) + (2000)²(4)(.07)(.93)
      − 2[20,000(10,000)(4)(.73)(.2) + 20,000(2000)(4)(.73)(.07) + 10,000(2000)(4)(.2)(.07)]
      = 380,401,600 − 252,192,000 = 128,209,600.


5.125 Let Y₁ = # of planes with no wing cracks, Y₂ = # of planes with detectable wing cracks,
and Y₃ = # of planes with critical wing cracks. Therefore, (Y₁, Y₂, Y₃) is multinomial with
n = 5, p₁ = .7, p₂ = .25, and p₃ = .05.
a. P(Y₁ = 2, Y₂ = 2, Y₃ = 1) = 30(.7)²(.25)²(.05) = .046.

b. The distribution of Y₃ is binomial with n = 5, p₃ = .05, so

      P(Y₃ ≥ 1) = 1 − P(Y₃ = 0) = 1 − (.95)⁵ = .2262.
5.126 Using formulas for means, variances, and covariances for the multinomial:
      E(Y₁) = 10(.1) = 1        V(Y₁) = 10(.1)(.9) = .9
      E(Y₂) = 10(.05) = .5      V(Y₂) = 10(.05)(.95) = .475
      Cov(Y₁, Y₂) = −10(.1)(.05) = −.05
So,
      E(Y₁ + 3Y₂) = 1 + 3(.5) = 2.5
      V(Y₁ + 3Y₂) = .9 + 9(.475) + 6(−.05) = 4.875.

5.127 Y is binomial with n = 10, p = .10 + .05 = .15.
a. P(Y = 2) = C(10, 2)(.15)²(.85)⁸ = .2759.
b. P(Y ≥ 1) = 1 − P(Y = 0) = 1 − (.85)¹⁰ = .8031.

5.128 The marginal distribution for Y₁ is found from f₁(y₁) = ∫_(−∞)^∞ f(y₁, y₂) dy₂.
Making the change of variables u = (y₁ − μ₁)/σ₁ and v = (y₂ − μ₂)/σ₂ yields

      f₁(y₁) = [1/(2πσ₁√(1 − ρ²))] ∫_(−∞)^∞ exp{−(u² − 2ρuv + v²)/[2(1 − ρ²)]} dv.

To evaluate this, note that u² − 2ρuv + v² = (v − ρu)² + u²(1 − ρ²), so that

      f₁(y₁) = [e^(−u²/2)/(2πσ₁√(1 − ρ²))] ∫_(−∞)^∞ exp{−(v − ρu)²/[2(1 − ρ²)]} dv.

The integral is that of a normal density with mean ρu and variance 1 − ρ². Therefore,

      f₁(y₁) = [1/(√(2π)σ₁)]e^(−(y₁ − μ₁)²/(2σ₁²)),   −∞ < y₁ < ∞,

which is a normal density with mean μ₁ and standard deviation σ₁. A similar procedure
shows that the marginal distribution of Y₂ is normal with mean μ₂ and standard
deviation σ₂.

5.129 The result follows from Ex. 5.128 and defining f(y₁ | y₂) = f(y₁, y₂)/f₂(y₂), which
yields the density function of a normal distribution with mean μ₁ + ρ(σ₁/σ₂)(y₂ − μ₂) and
variance σ₁²(1 − ρ²).

5.130 a. Cov(U₁, U₂) = Cov(Σᵢ aᵢYᵢ, Σⱼ bⱼYⱼ) = ΣᵢΣⱼ aᵢbⱼCov(Yᵢ, Yⱼ) = σ²Σᵢ₌₁ⁿ aᵢbᵢ, since the Yᵢ's
are independent. If Cov(U₁, U₂) = 0, it must be true that Σᵢ₌₁ⁿ aᵢbᵢ = 0, since σ² > 0.
Conversely, it is trivial to see that if Σᵢ₌₁ⁿ aᵢbᵢ = 0, then Cov(U₁, U₂) = 0. So, U₁ and U₂
are orthogonal.



b. As given in the problem, (U₁, U₂) has a bivariate normal distribution. Note that
E(U₁) = μΣᵢaᵢ, E(U₂) = μΣᵢbᵢ, V(U₁) = σ²Σᵢaᵢ², and V(U₂) = σ²Σᵢbᵢ². If U₁ and U₂ are
orthogonal, Cov(U₁, U₂) = 0 and then ρ_{U₁,U₂} = 0. So, they are also independent.

5.131 a. The joint distribution of Y₁ and Y₂ is simply the product of the marginals f₁(y₁) and
f₂(y₂) since they are independent. It is trivial to show that this product of densities has
the form of the bivariate normal density with ρ = 0.

b. Following the result of Ex. 5.130, let a₁ = a₂ = b₁ = 1 and b₂ = −1. Thus, Σᵢ aᵢbᵢ = 0,
so U₁ and U₂ are independent.

5.132 Following Ex. 5.130 and 5.131, U₁ is normal with mean μ₁ + μ₂ and variance 2σ², and U₂
is normal with mean μ₁ − μ₂ and variance 2σ².

5.133 From Ex. 5.27, f(y₁ | y₂) = 1/y₂, 0 ≤ y₁ ≤ y₂, and f₂(y₂) = 6y₂(1 − y₂), 0 ≤ y₂ ≤ 1.
a. To find E(Y₁ | Y₂ = y₂), note that the conditional distribution of Y₁ given Y₂ is uniform
on the interval (0, y₂). So, E(Y₁ | Y₂ = y₂) = y₂/2.
b. To find E[E(Y₁ | Y₂)], note that the marginal distribution of Y₂ is beta with α = 2 and
β = 2. So, from part a, E[E(Y₁ | Y₂)] = E(Y₂/2) = 1/4. This is the same answer as in Ex. 5.77.

5.134 The z-score is (6 − 1.25)/√1.5625 = 3.8, so the value 6 is 3.8 standard deviations above
the mean. This is not likely.

5.135 Refer to Ex. 5.41:
a. Given p, Y is binomial, so E(Y | p) = 3p. Now p has a uniform distribution on (0, 1), thus
E(Y) = E[E(Y | p)] = E(3p) = 3(1/2) = 3/2.
b. Following part a, V(Y | p) = 3p(1 − p). Therefore,
      V(Y) = E[3p(1 − p)] + V(3p) = 3E(p − p²) + 9V(p)
           = 3E(p) − 3[V(p) + (E(p))²] + 9V(p) = 1.25.
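A short simulation (my own sketch, not part of the manual) illustrates the iterated-expectation results E(Y) = 3/2 and V(Y) = 1.25:

```python
import random

# Draw p ~ Uniform(0, 1), then Y ~ Binomial(3, p); compare the sample mean and
# variance of Y to the theoretical values 3/2 and 1.25.
random.seed(3)
n = 200_000
total = total_sq = 0
for _ in range(n):
    p = random.random()
    y = sum(random.random() < p for _ in range(3))   # one Binomial(3, p) draw
    total += y
    total_sq += y * y
mean = total / n
var = total_sq / n - mean * mean
print(mean, var)
```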

5.136 a. For a given value of λ, Y has a Poisson distribution. Thus, E(Y | λ) = λ. Since the
marginal distribution of λ is exponential with mean 1, E(Y) = E[E(Y | λ)] = E(λ) = 1.
b. From part a, E(Y | λ) = λ and so V(Y | λ) = λ. So, V(Y) = E[V(Y | λ)] + V[E(Y | λ)] = E(λ) + V(λ) = 1 + 1 = 2.
c. The value 9 is (9 − 1)/√2 = 5.657 standard deviations above the mean (an unlikely score).

5.137 Refer to Ex. 5.38: E(Y₂ | Y₁ = y₁) = y₁/2. For y₁ = 3/4, E(Y₂ | Y₁ = 3/4) = 3/8.

5.138 If Y = # of bacteria per cubic centimeter,
a. E(Y) = E[E(Y | λ)] = E(λ) = β.
b. V(Y) = E[V(Y | λ)] + V[E(Y | λ)] = β + β² = β(1 + β). Thus, σ = √(β(1 + β)).

5.139 a. E(T | N = n) = E(Y₁ + Y₂ + ⋯ + Y_n) = Σᵢ₌₁ⁿ E(Yᵢ) = nμ.

b. E(T) = E[E(T | N)] = E(Nμ) = μE(N). Note that this is E(N)E(Y).


5.140 Note that V(Y₁) = E[V(Y₁ | Y₂)] + V[E(Y₁ | Y₂)], so E[V(Y₁ | Y₂)] = V(Y₁) − V[E(Y₁ | Y₂)].
Thus, E[V(Y₁ | Y₂)] ≤ V(Y₁).

5.141 E(Y₂) = E[E(Y₂ | Y₁)] = E(Y₁/2) = μ/2.

V(Y₂) = E[V(Y₂ | Y₁)] + V[E(Y₂ | Y₁)] = E[Y₁²/12] + V[Y₁/2] = (σ² + μ²)/12 + σ²/4 = μ²/12 + σ²/3.

5.142 a. E(Y) = E[E(Y | p)] = E(np) = nE(p) = nα/(α + β).

b. V(Y) = E[V(Y | p)] + V[E(Y | p)] = E[np(1 − p)] + V(np) = nE(p − p²) + n²V(p). Now:

      nE(p − p²) = nα/(α + β) − nα(α + 1)/[(α + β)(α + β + 1)]
      n²V(p) = n²αβ/[(α + β)²(α + β + 1)].

So, V(Y) = nα/(α + β) − nα(α + 1)/[(α + β)(α + β + 1)] + n²αβ/[(α + β)²(α + β + 1)]
      = nαβ(α + β + n)/[(α + β)²(α + β + 1)].


5.143 Consider the random variable y₁Y₂ for a fixed value y₁ of Y₁. Clearly, y₁Y₂ has a
normal distribution with mean 0 and variance y₁², so the mgf of this random variable is

      m(t) = E(e^(ty₁Y₂)) = e^(t²y₁²/2).

Thus,

      m_U(t) = E(e^(tU)) = E(e^(tY₁Y₂)) = E[E(e^(tY₁Y₂) | Y₁)] = E(e^(t²Y₁²/2)) = (1/√(2π)) ∫_(−∞)^∞ e^(t²y₁²/2)e^(−y₁²/2) dy₁.

Note that this integral is essentially that of a normal density with mean 0 and variance
(1 − t²)^(−1), so the necessary constant that makes the integral equal to 1 is the reciprocal
of the standard deviation. Thus, m_U(t) = (1 − t²)^(−1/2). Direct calculations give
m′_U(0) = 0 and m″_U(0) = 1. To compare, note that E(U) = E(Y₁Y₂) = E(Y₁)E(Y₂) = 0 and
V(U) = E(U²) = E(Y₁²Y₂²) = E(Y₁²)E(Y₂²) = (1)(1) = 1.



5.144 E[g(Y₁)h(Y₂)] = Σ_{y₁} Σ_{y₂} g(y₁)h(y₂)p(y₁, y₂) = Σ_{y₁} Σ_{y₂} g(y₁)h(y₂)p₁(y₁)p₂(y₂)
      = [Σ_{y₁} g(y₁)p₁(y₁)][Σ_{y₂} h(y₂)p₂(y₂)] = E[g(Y₁)]E[h(Y₂)].
5.145 The probability of interest is P(Y₁ + Y₂ < 30), where Y₁ is uniform on the interval (0, 15)
and Y₂ is uniform on the interval (20, 30). Thus, we have

      P(Y₁ + Y₂ < 30) = ∫₂₀³⁰ ∫₀^(30−y₂) (1/15)(1/10) dy₁ dy₂ = 1/3.

5.146 Let (Y₁, Y₂) represent the coordinates of the landing point of the bomb. Since the radius
is one mile, we have y₁² + y₂² ≤ 1. Now,
      P(target is destroyed) = P(bomb destroys everything within 1/2 mile of the landing point).
This is given by P(Y₁² + Y₂² ≤ (1/2)²). Since (Y₁, Y₂) is uniformly distributed over the unit
circle, the probability in question is simply the area of a circle with radius 1/2 divided by
the area of the unit circle, or simply 1/4.


5.147 Let Y₁ = arrival time for the 1st friend and Y₂ = arrival time for the 2nd friend, with
0 ≤ y₁ ≤ 1 and 0 ≤ y₂ ≤ 1 (in hours). Thus f(y₁, y₂) = 1. If friend 2 arrives within 1/6 hour
(10 minutes) before or after friend 1, they will meet. We can represent this event as
|Y₁ − Y₂| < 1/6. To find the probability of this event, we must find

      P(|Y₁ − Y₂| < 1/6) = ∫₀^(1/6) ∫₀^(y₁+1/6) dy₂ dy₁ + ∫_(1/6)^(5/6) ∫_(y₁−1/6)^(y₁+1/6) dy₂ dy₁ + ∫_(5/6)^1 ∫_(y₁−1/6)^1 dy₂ dy₁ = 11/36.
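The 11/36 answer is easy to confirm by simulation (an illustrative sketch, not part of the manual):

```python
import random

# Two arrival times uniform on (0, 1) hours; the friends meet when the arrival
# times differ by less than 1/6 hour (10 minutes).
random.seed(4)
n = 200_000
meet = sum(abs(random.random() - random.random()) < 1/6 for _ in range(n))
p_hat = meet / n
print(p_hat)    # near 11/36 = .3056
```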

5.148 a. p(y₁, y₂) = C(4, y₁)C(3, y₂)C(2, 3 − y₁ − y₂)/C(9, 3), y₁ = 0, 1, 2, 3, y₂ = 0, 1, 2, 3,
y₁ + y₂ ≤ 3.
b. Y₁ is hypergeometric with r = 4, N = 9, n = 3; Y₂ is hypergeometric with r = 3, N = 9, n = 3.

c. P(Y₁ = 1 | Y₂ ≥ 1) = [p(1, 1) + p(1, 2)]/[1 − p₂(0)] = 9/16.


5.149 a. f₁(y₁) = ∫₀^(y₁) 3y₁ dy₂ = 3y₁², 0 ≤ y₁ ≤ 1; f₂(y₂) = ∫_(y₂)^1 3y₁ dy₁ = (3/2)(1 − y₂²), 0 ≤ y₂ ≤ 1.
b. P(Y₁ ≤ 3/4 | Y₂ ≤ 1/2) = 23/44.
c. f(y₁ | y₂) = 2y₁/(1 − y₂²), y₂ ≤ y₁ ≤ 1.
d. P(Y₁ ≤ 3/4 | Y₂ = 1/2) = 5/12.


5.150 a. Note that f(y₂ | y₁) = f(y₁, y₂)/f₁(y₁) = 1/y₁, 0 ≤ y₂ ≤ y₁. This is the same conditional
density as seen in Ex. 5.38 and Ex. 5.137. So, E(Y₂ | Y₁ = y₁) = y₁/2.

b. E(Y₂) = E[E(Y₂ | Y₁)] = E(Y₁/2) = ∫₀¹ (y₁/2)·3y₁² dy₁ = 3/8.
c. E(Y₂) = ∫₀¹ y₂·(3/2)(1 − y₂²) dy₂ = 3/8.
5.151 a. The joint density is the product of the marginals:

      f(y₁, y₂) = (1/β²)e^(−(y₁+y₂)/β),   y₁ ≥ 0, y₂ ≥ 0.

b. P(Y₁ + Y₂ ≤ a) = ∫₀^a ∫₀^(a−y₂) (1/β²)e^(−(y₁+y₂)/β) dy₁ dy₂ = 1 − [1 + a/β]e^(−a/β).
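The closed form in part b can be compared against simulation; the values β = 2 and a = 5 below are my own illustrative choices:

```python
import random
from math import exp

# P(Y1 + Y2 <= a) for two independent exponentials with mean beta, versus the
# formula 1 - (1 + a/beta) * exp(-a/beta).
random.seed(5)
beta, a = 2.0, 5.0
n = 200_000
hits = sum(
    random.expovariate(1 / beta) + random.expovariate(1 / beta) <= a
    for _ in range(n)
)
p_hat = hits / n
p_formula = 1 - (1 + a / beta) * exp(-a / beta)
print(round(p_formula, 4), round(p_hat, 4))
```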

5.152 The joint density of (Y₁, Y₂) is f(y₁, y₂) = 18y₁(1 − y₁)y₂², 0 ≤ y₁ ≤ 1, 0 ≤ y₂ ≤ 1. Thus,

      P(Y₁Y₂ ≤ .5) = 1 − P(Y₁Y₂ > .5) = 1 − ∫_(.5)^1 ∫_(.5/y₂)^1 18y₁(1 − y₁)y₂² dy₁ dy₂.

Using straightforward integration, this is equal to (5 − 3 ln 2)/4 = .73014.

5.153 This is similar to Ex. 5.139:
a. Let N = # of eggs laid by the insect and Y = # of eggs that hatch. Given N = n, Y has a
binomial distribution with n trials and success probability p. Thus, E(Y | N = n) = np.
Since N follows a Poisson distribution with parameter λ, E(Y) = E[E(Y | N)] = E(Np) = λp.

b. V(Y) = E[V(Y | N)] + V[E(Y | N)] = E[Np(1 − p)] + V(Np) = λp(1 − p) + λp² = λp.

5.154 The conditional distribution of Y given p is binomial with n trials and success
probability p, and the marginal distribution of p is beta with α = 3 and β = 2.
a. Note that

      f(y) = ∫₀¹ f(y, p) dp = ∫₀¹ f(y | p)f(p) dp = 12·C(n, y) ∫₀¹ p^(y+2)(1 − p)^(n−y+1) dp.

This integral can be evaluated by relating it to a beta density with α = y + 3 and
β = n − y + 2. Thus,

      f(y) = 12·C(n, y)·Γ(y + 3)Γ(n − y + 2)/Γ(n + 5),   y = 0, 1, 2, …, n.

b. For n = 2, E(Y | p) = 2p. Thus, E(Y) = E[E(Y | p)] = E(2p) = 2E(p) = 2(3/5) = 6/5.
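The pmf in part a can be checked numerically for n = 2 (an illustrative sketch, not part of the manual): it sums to 1 and reproduces E(Y) = 6/5.

```python
from math import comb, gamma

# f(y) = 12 * C(n, y) * Gamma(y + 3) * Gamma(n - y + 2) / Gamma(n + 5), y = 0..n
def f(y, n):
    return 12 * comb(n, y) * gamma(y + 3) * gamma(n - y + 2) / gamma(n + 5)

n = 2
probs = [f(y, n) for y in range(n + 1)]
total = sum(probs)
mean = sum(y * p for y, p in enumerate(probs))
print(total, mean)    # sums to ~1 and gives mean 6/5
```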


5.155 a. It is easy to show that

      Cov(W₁, W₂) = Cov(Y₁ + Y₂, Y₁ + Y₃)
                  = Cov(Y₁, Y₁) + Cov(Y₁, Y₃) + Cov(Y₂, Y₁) + Cov(Y₂, Y₃)
                  = Cov(Y₁, Y₁) = V(Y₁) = σ₁².

b. It follows from part a above, since the variance σ₁² is positive.



5.156 a. Since E(Z) = E(W) = 0, Cov(Z, W) = E(ZW) = E(Z²Y^(−1/2)) = E(Z²)E(Y^(−1/2)) = E(Y^(−1/2)).
This expectation can be found by using the result of Ex. 4.112 with a = −1/2. So,

      Cov(Z, W) = E(Y^(−1/2)) = Γ((ν − 1)/2)/[√2·Γ(ν/2)],   provided ν > 1.

b. Similar to part a, Cov(Y, W) = E(YW) = E(√Y·Z) = E(√Y)E(Z) = 0.

c. This is clear from parts (a) and (b) above.



5.157 p(y) = ∫₀^∞ p(y | λ)f(λ) dλ = [1/(y!Γ(α)β^α)] ∫₀^∞ λ^(y+α−1)e^(−λ(1+β)/β) dλ
      = [Γ(y + α)/(y!Γ(α))]·β^y/(1 + β)^(y+α),   y = 0, 1, 2, …. Since
it was assumed that α is an integer, this can be written as

      p(y) = C(y + α − 1, y)·β^y/(1 + β)^(y+α),   y = 0, 1, 2, ….

5.158 Note that for each Xᵢ, E(Xᵢ) = p and V(Xᵢ) = pq. Then, E(Y) = ΣE(Xᵢ) = np and V(Y) = npq.
The second result follows from the fact that the Xᵢ are independent, so all covariance
expressions are 0.


5.159 For each Wᵢ, E(Wᵢ) = 1/p and V(Wᵢ) = q/p². Then, E(Y) = ΣE(Wᵢ) = r/p and V(Y) = rq/p².
The second result follows from the fact that the Wᵢ are independent, so all covariance
expressions are 0.


5.160 The marginal probabilities can be written directly:

      P(X₁ = 1) = P(select ball 1 or 2) = .5      P(X₁ = 0) = .5
      P(X₂ = 1) = P(select ball 1 or 3) = .5      P(X₂ = 0) = .5
      P(X₃ = 1) = P(select ball 1 or 4) = .5      P(X₃ = 0) = .5

Now, for i ≠ j, Xᵢ and Xⱼ are clearly pairwise independent since, for example,

      P(X₁ = 1, X₂ = 1) = P(select ball 1) = .25 = P(X₁ = 1)P(X₂ = 1)
      P(X₁ = 0, X₂ = 1) = P(select ball 3) = .25 = P(X₁ = 0)P(X₂ = 1)

However, X₁, X₂, and X₃ are not mutually independent, since

      P(X₁ = 1, X₂ = 1, X₃ = 1) = P(select ball 1) = .25 ≠ P(X₁ = 1)P(X₂ = 1)P(X₃ = 1).
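The pairwise-but-not-mutual independence above can be verified by exhaustive enumeration (my own sketch; the ball labels encode (X₁, X₂, X₃) as described in the exercise):

```python
# One of four equally likely balls is drawn; ball 1 carries (1,1,1), and balls
# 2, 3, 4 carry a single 1 in coordinates 1, 2, 3 respectively.
balls = [(1, 1, 1), (1, 0, 0), (0, 1, 0), (0, 0, 1)]

def prob(event):
    """Probability (under a uniform draw) that the chosen ball satisfies `event`."""
    return sum(1 for b in balls if event(b)) / 4

# Pairwise independence: P(Xi = 1, Xj = 1) = P(Xi = 1) * P(Xj = 1) = .25.
pairwise = all(
    prob(lambda b: b[i] == 1 and b[j] == 1)
    == prob(lambda b: b[i] == 1) * prob(lambda b: b[j] == 1)
    for i in range(3) for j in range(3) if i != j
)
# Mutual independence fails: P(X1 = X2 = X3 = 1) = .25, not .5**3 = .125.
triple = prob(lambda b: b[0] == 1 and b[1] == 1 and b[2] == 1)
print(pairwise, triple)
```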



5.161 E(Ȳ − X̄) = E(Ȳ) − E(X̄) = (1/n)ΣE(Yᵢ) − (1/m)ΣE(Xᵢ) = μ₁ − μ₂.
V(Ȳ − X̄) = V(Ȳ) + V(X̄) = (1/n²)ΣV(Yᵢ) + (1/m²)ΣV(Xᵢ) = σ₁²/n + σ₂²/m.

5.162 Using the result from Ex. 5.65, choose two different values of α with −1 ≤ α ≤ 1.
5.163 a. The distribution functions for the exponential distribution are:
      F₁(y₁) = 1 − e^(−y₁), y₁ ≥ 0;   F₂(y₂) = 1 − e^(−y₂), y₂ ≥ 0.
Then, the joint distribution function is

      F(y₁, y₂) = [1 − e^(−y₁)][1 − e^(−y₂)][1 + α(e^(−y₁))(e^(−y₂))].

Finally, show that ∂²F(y₁, y₂)/∂y₁∂y₂ gives the joint density function seen in Ex. 5.162.

b. The distribution functions for the uniform distribution on (0, 1) are:
      F₁(y₁) = y₁, 0 ≤ y₁ ≤ 1;   F₂(y₂) = y₂, 0 ≤ y₂ ≤ 1.
Then, the joint distribution function is

      F(y₁, y₂) = y₁y₂[1 + α(1 − y₁)(1 − y₂)].

c. ∂²F(y₁, y₂)/∂y₁∂y₂ = f(y₁, y₂) = 1 + α(1 − 2y₁)(1 − 2y₂), 0 ≤ y₁ ≤ 1, 0 ≤ y₂ ≤ 1.

d. Choose two different values of α with −1 ≤ α ≤ 1.

5.164 a. If t₁ = t₂ = t₃ = t, then m(t, t, t) = E(e^(t(X₁+X₂+X₃))). This, by definition, is the mgf
for the random variable X₁ + X₂ + X₃.

b. Similarly, with t₁ = t₂ = t and t₃ = 0, m(t, t, 0) = E(e^(t(X₁+X₂))).

c. We prove the continuous case here (the discrete case is similar). Let (X₁, X₂, X₃) be
continuous random variables with joint density function f(x₁, x₂, x₃). Then,

      m(t₁, t₂, t₃) = ∫∫∫ e^(t₁x₁)e^(t₂x₂)e^(t₃x₃) f(x₁, x₂, x₃) dx₁ dx₂ dx₃.

Differentiating k₁ times with respect to t₁, k₂ times with respect to t₂, k₃ times with
respect to t₃, and setting t₁ = t₂ = t₃ = 0 gives

      ∂^(k₁+k₂+k₃)m(t₁, t₂, t₃)/∂t₁^(k₁)∂t₂^(k₂)∂t₃^(k₃) |_(t₁=t₂=t₃=0) = ∫∫∫ x₁^(k₁)x₂^(k₂)x₃^(k₃) f(x₁, x₂, x₃) dx₁ dx₂ dx₃.

This is easily recognized as E(X₁^(k₁)X₂^(k₂)X₃^(k₃)).

5.165 a. m(t₁, t₂, t₃) = E(e^(t₁X₁+t₂X₂+t₃X₃)) = Σ [n!/(x₁!x₂!x₃!)](p₁e^(t₁))^(x₁)(p₂e^(t₂))^(x₂)(p₃e^(t₃))^(x₃)
      = (p₁e^(t₁) + p₂e^(t₂) + p₃e^(t₃))ⁿ.
The final form follows from the multinomial theorem.

b. The mgf for X₁ can be found by evaluating m(t, 0, 0). Note that q = p₂ + p₃ = 1 − p₁, so
that m(t, 0, 0) = (p₁eᵗ + q)ⁿ, the binomial mgf.

c. Note that Cov(X₁, X₂) = E(X₁X₂) − E(X₁)E(X₂), and that E(X₁) = np₁ and E(X₂) = np₂ since
X₁ and X₂ have marginal binomial distributions. To find E(X₁X₂), note that

      ∂²m(t₁, t₂, 0)/∂t₁∂t₂ |_(t₁=t₂=0) = n(n − 1)p₁p₂.

Thus, Cov(X₁, X₂) = n(n − 1)p₁p₂ − (np₁)(np₂) = −np₁p₂.


5.166 The joint probability mass function of (Y₁, Y₂, Y₃) is given by

      p(y₁, y₂, y₃) = C(Np₁, y₁)C(Np₂, y₂)C(Np₃, y₃)/C(N, n),

where y₁ + y₂ + y₃ = n. The marginal distribution of Y₁ is hypergeometric with r = Np₁, so
E(Y₁) = np₁ and V(Y₁) = np₁(1 − p₁)[(N − n)/(N − 1)]. Similarly, E(Y₂) = np₂ and
V(Y₂) = np₂(1 − p₂)[(N − n)/(N − 1)]. It can be shown (using mathematical expectation and
straightforward albeit messy algebra) that E(Y₁Y₂) = n(n − 1)p₁p₂[N/(N − 1)]. Using this,
it is seen that

      Cov(Y₁, Y₂) = n(n − 1)p₁p₂[N/(N − 1)] − (np₁)(np₂) = −np₁p₂[(N − n)/(N − 1)].

(Note the similar expressions in Ex. 5.165.) Finally, it can be found that

      ρ = −√[p₁p₂/((1 − p₁)(1 − p₂))].

5.167 a. For this exercise, the quadratic form of interest is

      At² + Bt + C = E(Y₁²)t² + [2E(Y₁Y₂)]t + E(Y₂²) = E[(tY₁ + Y₂)²].

Since E[(tY₁ + Y₂)²] ≥ 0 (it is the integral of a nonnegative quantity), we must have
At² + Bt + C ≥ 0. In order to satisfy this inequality, the two roots of this quadratic must
either be imaginary or equal. In terms of the discriminant, we have that

      B² − 4AC ≤ 0, or [2E(Y₁Y₂)]² − 4E(Y₁²)E(Y₂²) ≤ 0.

Thus, [E(Y₁Y₂)]² ≤ E(Y₁²)E(Y₂²).

b. Let μ₁ = E(Y₁) and μ₂ = E(Y₂), and define Z₁ = Y₁ − μ₁ and Z₂ = Y₂ − μ₂. Then,

      ρ² = {E[(Y₁ − μ₁)(Y₂ − μ₂)]}²/{E[(Y₁ − μ₁)²]E[(Y₂ − μ₂)²]} = [E(Z₁Z₂)]²/[E(Z₁²)E(Z₂²)] ≤ 1

by the result in part a.
