Problems
2. (a) p(0, 0) = (8·7)/(13·12) = 14/39
p(0, 1) = p(1, 0) = (8·5)/(13·12) = 10/39
p(1, 1) = (5·4)/(13·12) = 5/39
(b) p(0, 0, 0) = (8·7·6)/(13·12·11) = 28/143
p(0, 0, 1) = p(0, 1, 0) = p(1, 0, 0) = (8·7·5)/(13·12·11) = 70/429
p(0, 1, 1) = p(1, 0, 1) = p(1, 1, 0) = (8·5·4)/(13·12·11) = 40/429
p(1, 1, 1) = (5·4·3)/(13·12·11) = 5/143
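These without-replacement probabilities can be sanity-checked numerically. A minimal Monte Carlo sketch (assuming, as the fractions 8/13 and 5/13 suggest, an urn of 8 white and 5 red balls, with X_i = 1 when the ith ball drawn is red):

```python
import random

# Simulate two draws without replacement from an urn of 8 white (0) and
# 5 red (1) balls and estimate p(0,0) and p(1,1).
random.seed(0)
trials = 200_000
counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
balls = [1] * 5 + [0] * 8          # 1 = red, 0 = white
for _ in range(trials):
    x1, x2 = random.sample(balls, 2)   # sampling without replacement
    counts[(x1, x2)] += 1

p00 = counts[(0, 0)] / trials      # should be near 14/39 ≈ 0.359
p11 = counts[(1, 1)] / trials      # should be near 5/39 ≈ 0.128
```

The same loop with `random.sample(balls, 3)` checks the part (b) values.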
3.
if i + j + k = 2
5.
p(0, 0) = (12/13)³(11/12)³
p(0, 1) = p(1, 0) = (12/13)³[1 − (11/12)³]
p(1, 1) = (2/13)[(1/13) + (12/13)(1/13)] + (11/13)(2/13)(1/13)
Copyright 2014 Pearson Education, Inc.
Chapter 6
8. fY(y) = c ∫_{−y}^{y} (y² − x²)e^{−y} dx = (4c/3) y³ e^{−y}, 0 < y < ∞
Since 1 = ∫_0^∞ (4c/3) y³ e^{−y} dy = 8c, we have c = 1/8 and
fY(y) = y³e^{−y}/6, 0 < y < ∞
fX(x) = (1/8) ∫_x^∞ (y² − x²)e^{−y} dy = (1/4) e^{−x}(1 + x), x > 0,
upon using ∫ y² e^{−y} dy = −y² e^{−y} − 2y e^{−y} − 2e^{−y}.
9. (b) fX(x) = ∫_0^2 (6/7)(x² + xy/2) dy = (6/7)(2x² + x), 0 < x < 1
(c) P{X > Y} = ∫_0^1 ∫_0^x (6/7)(x² + xy/2) dy dx = 15/56
(d) P{Y > 1/2 | X < 1/2} = P{Y > 1/2, X < 1/2}/P{X < 1/2}
= [∫_{1/2}^2 ∫_0^{1/2} (6/7)(x² + xy/2) dx dy] / [∫_0^{1/2} (6/7)(2x² + x) dx] = 69/80
10.
11. (5!/(2!1!2!)) (.45)²(.15)(.40)²
12. e^{−5} + 5e^{−5} + (5²/2!)e^{−5} + (5³/3!)e^{−5}
14.
Let X and Y denote, respectively, the locations of the ambulance and the accident at the moment the accident occurs. Then
P{|Y − X| < a} = P{Y < X < Y + a} + P{X < Y < X + a} = 2P{Y < X < Y + a}
= (2/L²)[∫_0^{L−a} ∫_y^{y+a} dx dy + ∫_{L−a}^L ∫_y^L dx dy]
= (2/L²)[a(L − a) + a²/2]
= 1 − (L − a)²/L², 0 < a < L
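The closed form can be checked by simulation; a minimal sketch, assuming X and Y independent uniform on (0, L) (L and a below are arbitrary illustrative values):

```python
import random

# Monte Carlo check that P{|Y - X| < a} = 1 - ((L - a)/L)^2 for X, Y
# independent uniform on (0, L).
random.seed(1)
L, a = 1.0, 0.3
trials = 200_000
hits = sum(abs(random.uniform(0, L) - random.uniform(0, L)) < a
           for _ in range(trials))
estimate = hits / trials
exact = 1 - ((L - a) / L) ** 2      # = 0.51 for L = 1, a = 0.3
```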
15. (a) 1 = c ∫∫_R dy dx, so c = 1/A(R) (here 1/4).
16. (a) A = ∪ᵢ Aᵢ
(b) yes
(c) P(A) = Σᵢ P(Aᵢ) = n(1/2)^{n−1}
17. 1/3, since each of the 3 points is equally likely to be the middle one.
18. Let X be uniform on (0, L/2) and Y uniform on (L/2, L) (the two points fall on opposite sides of the midpoint), so their joint density is 4/L². Then
P{Y − X > L/3} = (4/L²)[∫_0^{L/6} ∫_{L/2}^L dy dx + ∫_{L/6}^{L/2} ∫_{x+L/3}^L dy dx]
= (4/L²)[L²/12 + L²/9] = 7/9
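The value 7/9 is easy to confirm by simulation; a sketch under the same setup (one point uniform on each half of the rod):

```python
import random

# Check P{Y - X > L/3} = 7/9 with X uniform on (0, L/2) and
# Y uniform on (L/2, L).
random.seed(2)
L = 1.0
trials = 200_000
hits = sum(random.uniform(L / 2, L) - random.uniform(0, L / 2) > L / 3
           for _ in range(trials))
estimate = hits / trials            # should be near 7/9 ≈ 0.778
```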
19.
(a) Yes: ∫_0^1 ∫_0^x (1/x) dy dx = ∫_0^1 dx = 1
(b) fX(x) = ∫_0^x (1/x) dy = 1, 0 < x < 1
fY(y) = ∫_y^1 (1/x) dx = −log(y), 0 < y < 1
(check: ∫_0^1 −log(y) dy = −[y log(y) − y]_0^1 = 1)
(c) E[X] = 1/2, E[Y] = ∫_0^1 −y log(y) dy = 1/4

20. (a) yes: fX(x) = xe^{−x}, fY(y) = e^{−y}, 0 < x < ∞, 0 < y < ∞
(b) no: fY(y) = ∫_0^y 2 dx = 2y, 0 < y < 1, fX(x) = 2(1 − x), 0 < x < 1, and the joint density does not equal fX(x)fY(y)
21. (a) Yes: we must check that ∫∫ f(x, y) dx dy = 1. Now,
∫_0^1 ∫_0^{1−y} 24xy dx dy = ∫_0^1 12y(1 − y)² dy = ∫_0^1 12(y − 2y² + y³) dy = 1
(b) E[X] = ∫_0^1 x fX(x) dx = ∫_0^1 ∫_0^{1−x} 24x²y dy dx = ∫_0^1 12x²(1 − x)² dx = 2/5
(c) 2/5, by symmetry
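E[X] = 2/5 can be verified by rejection sampling from this density; a sketch (the envelope bound 6 is the maximum of 24xy on the triangle, attained at x = y = 1/2):

```python
import random

# Rejection-sample f(x, y) = 24xy on the triangle x, y >= 0, x + y <= 1
# and check E[X] = 2/5. On the triangle, f <= 6.
random.seed(3)
samples = []
while len(samples) < 100_000:
    x, y = random.random(), random.random()
    if x + y <= 1 and random.random() * 6 <= 24 * x * y:
        samples.append(x)
mean_x = sum(samples) / len(samples)   # should be near 2/5
```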
22.
(a) No, since the joint density does not factor.
(b) fX(x) = ∫_0^1 (x + y) dy = x + 1/2, 0 < x < 1
(c) P{X + Y < 1} = ∫_0^1 ∫_0^{1−x} (x + y) dy dx = ∫_0^1 [x(1 − x) + (1 − x)²/2] dx = 1/3
23. (a) yes, since f(x, y) = 12x(1 − x)y = [6x(1 − x)][2y] = fX(x)fY(y), with
fX(x) = 6x(1 − x), 0 < x < 1
fY(y) = 2y, 0 < y < 1
(b) E[X] = ∫_0^1 6x²(1 − x) dx = 1/2
(c) E[Y] = ∫_0^1 2y² dy = 2/3
(d) Var(X) = ∫_0^1 6x³(1 − x) dx − 1/4 = 1/20
(e) Var(Y) = ∫_0^1 2y³ dy − 4/9 = 1/18

24. (a) P{N = n} = p₀^{n−1}(1 − p₀)
(b) P{X = j} = pⱼ/(1 − p₀)
(c) P{N = n, X = j} = p₀^{n−1} pⱼ
25. e^{−1}/i!, by the Poisson approximation to the binomial.
26. FAC(x) = P{AC ≤ x} = ∫_0^x ∫_0^1 dc da + ∫_x^1 ∫_0^{x/a} dc da = x + ∫_x^1 (x/a) da = x − x log x
Hence, FAC(x) = x − x log x and so
fAC(x) = −log x, 0 < x < 1
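The distribution of the product of two uniforms is easy to confirm empirically; a sketch checking FAC(x) = x − x log x at one test point:

```python
import math
import random

# Check F_AC(x) = x - x*log(x) for the product A*C of two independent
# uniform (0, 1) random variables, at x = 0.25.
random.seed(4)
trials = 200_000
x = 0.25
hits = sum(random.random() * random.random() <= x for _ in range(trials))
estimate = hits / trials
exact = x - x * math.log(x)         # ≈ 0.5966
```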
P{B²/4 ≥ AC} = ∫_0^1 ∫_0^{b²/4} (−log x) dx db = ∫_0^1 [b²/4 − (b²/4) log(b²/4)] db = (log 2)/6 + 5/36
(using ∫ x² log x dx = (x³/3) log x − x³/9)

27. P{X₁/X₂ < a} = ∫_0^∞ ∫_0^{ay} λ₁e^{−λ₁x} λ₂e^{−λ₂y} dx dy = ∫_0^∞ (1 − e^{−λ₁ay}) λ₂e^{−λ₂y} dy
= 1 − λ₂/(λ₁a + λ₂) = λ₁a/(λ₁a + λ₂)
P{X₁/X₂ < 1} = λ₁/(λ₁ + λ₂)
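The ratio formula checks out in simulation; a sketch with arbitrarily chosen rates:

```python
import random

# Check P{X1/X2 < a} = l1*a/(l1*a + l2) for independent exponentials
# with rates l1 and l2 (rate and a values below are arbitrary).
random.seed(5)
l1, l2, a = 2.0, 3.0, 1.5
trials = 200_000
hits = sum(random.expovariate(l1) / random.expovariate(l2) < a
           for _ in range(trials))
estimate = hits / trials
exact = l1 * a / (l1 * a + l2)      # = 0.5 for these values
```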
28. (a) (1/2)e^{−t}, since e^{−t} is the probability that AJ is still in service when MJ arrives, and 1/2 is the conditional probability that MJ then finishes first.
(b) Using that the time at which MJ finishes is gamma with parameters 2, 1 yields the result: 1 − 3e^{−2}.
29. (a) If W = X₁ + X₂ is the sales over the next two weeks, then W is normal with mean 4400 and standard deviation √(2(230)²) = 325.27. Hence, with Z standard normal,
P(W ≥ 5000) = P((W − 4400)/325.27 ≥ (5000 − 4400)/325.27) ≈ P{Z ≥ 1.8446} ≈ .0326
30.
Let X denote Jill's score and let Y be Jack's score. Also, let Z denote a standard normal random variable.
(a) P{Y > X} = P{Y − X > 0} ≈ P{Y − X ≥ .5}
= P{(Y − X − (160 − 170))/√((20)² + (15)²) ≥ (.5 − (160 − 170))/√((20)² + (15)²)}
≈ P{Z ≥ .42} ≈ .3372
(b) P{X + Y > 350} ≈ P{X + Y ≥ 350.5} = P{Z ≥ (350.5 − 330)/√((20)² + (15)²)}
= P{Z > .82} ≈ .2061
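The normal tail values here can be reproduced with the standard normal CDF written in terms of math.erf; a quick check (assuming, as in the problem, scores N(170, 20²) and N(160, 15²)):

```python
import math

# Standard normal tail probabilities for the two z-values above,
# using Phi(z) = (1 + erf(z/sqrt(2)))/2.
def phi(z: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

sd = math.sqrt(20**2 + 15**2)                 # = 25
p_a = 1 - phi((0.5 - (160 - 170)) / sd)       # z = 0.42, ≈ 0.3372
p_b = 1 - phi((350.5 - 330) / sd)             # z = 0.82, ≈ 0.2061
```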
31. Let X and Y denote, respectively, the number of males and females in the sample that never eat breakfast. Since
E[X] = 50.4, Var(X) = 37.6992, E[Y] = 47.2, Var(Y) = 36.0608
it follows from the normal approximation to the binomial that X is approximately distributed as a normal random variable with mean 50.4 and variance 37.6992, and that Y is approximately distributed as a normal random variable with mean 47.2 and variance 36.0608. Let Z be a standard normal random variable.
(a) P{X + Y ≥ 110} ≈ P{X + Y ≥ 109.5}
= P{(X + Y − 97.6)/√73.76 ≥ (109.5 − 97.6)/√73.76}
≈ P{Z ≥ 1.39} ≈ .0823
32. (a) C(6, 3)(1/2)⁶ = 5/16
(b) P(S₄ ≥ 420) = P((S₄ − 4(100))/√100 ≥ (420 − 4(100))/√100) = P(Z ≥ 2) ≈ .0228
33.
(a) e^{−2}
(b) 1 − e^{−2} − 2e^{−2} = 1 − 3e^{−2}
The number of typographical errors on each page should be approximately Poisson distributed, and the sum of independent Poisson random variables is also a Poisson random variable.
34.
(b) 1 − Σ_{i=0} e^{−4.4}(4.4)^i/i!
(c) Σ_{i=0} e^{−6.6}(6.6)^i/i!
36.
37.
38.
(a) P{X = j, Y = i} = (1/5)(1/j), j = 1, …, 5, i = 1, …, j
(b) P{X = j | Y = i} = (1/(5j)) / Σ_{k=i}^5 1/(5k) = (1/j) / Σ_{k=i}^5 (1/k), 5 ≥ j ≥ i
(c) No.
40.
P{Y = j | X = i} = P{Y = j, X = i}/P{X = i}
For j < i: P{Y = j | X = i} = 2/(36P{X = i})
For j = i: P{Y = i | X = i} = 1/(36P{X = i})
Hence
1 = Σ_{j=1}^i P{Y = j | X = i} = 2(i − 1)/(36P{X = i}) + 1/(36P{X = i})
and so P{X = i} = (2i − 1)/36 and
P{Y = j | X = i} = 2/(2i − 1), j < i
P{Y = j | X = i} = 1/(2i − 1), j = i
41.
(a) fX|Y(x|y) = xe^{−x(y+1)} / ∫_0^∞ xe^{−x(y+1)} dx = (y + 1)² x e^{−x(y+1)}, 0 < x
(b) fY|X(y|x) = xe^{−x(y+1)} / ∫_0^∞ xe^{−x(y+1)} dy = xe^{−xy}, 0 < y
(c) P{XY < a} = ∫_0^∞ ∫_0^{a/x} xe^{−x(y+1)} dy dx = ∫_0^∞ (1 − e^{−a})e^{−x} dx = 1 − e^{−a}
fXY(a) = e^{−a}, 0 < a
42. fY|X(y|x) = (x² − y²)e^{−x} / ∫_{−x}^x (x² − y²)e^{−x} dy = (3/(4x³))(x² − y²), −x < y < x
FY|X(y|x) = (3/(4x³))(x²y − y³/3 + 2x³/3), −x < y < x
43.
f(λ|n) = P{N = n|λ}g(λ)/P{N = n}
= C₁ e^{−λ} λⁿ e^{−θλ}(θλ)^{s−1}
= C₂ e^{−(θ+1)λ} λ^{n+s−1}
where C₁ and C₂ do not depend on λ. But from the preceding we can conclude that the conditional density is the gamma density with parameters θ + 1 and n + s. The conditional expected number of accidents that the insured will have next year is just the expectation of this distribution, and is thus equal to (n + s)/(θ + 1).
44. The desired probability is
3 ∫∫∫_{x₁ > x₂ + x₃, 0 < xᵢ < 1, i = 1, 2, 3} dx₁ dx₂ dx₃   (take a = 0, b = 1)
= 3 ∫_0^1 ∫_0^{1−x₃} (1 − x₂ − x₃) dx₂ dx₃
= 3 ∫_0^1 ((1 − x₃)²/2) dx₃ = 1/2.
45. f_{X(3)}(x) = (5!/(2!2!)) (∫_0^x λe^{−λy} dy)² λe^{−λx} (∫_x^∞ λe^{−λy} dy)² = 30λ(1 − e^{−λx})² e^{−3λx}
46. [(L − 2d)/L]³
47.
48. P{1/4 < X(3) < 3/4} = ∫_{1/4}^{3/4} f_{X(3)}(x) dx = (5!/(2!2!)) ∫_{1/4}^{3/4} x²(1 − x)² dx = 203/256
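The integral for the median of 5 uniforms evaluates to 203/256 ≈ 0.793, which a short simulation confirms:

```python
import random

# Check P{1/4 < X_(3) < 3/4} for the median of 5 independent
# uniform (0, 1) random variables; the exact value is 203/256.
random.seed(6)
trials = 200_000
hits = 0
for _ in range(trials):
    median = sorted(random.random() for _ in range(5))[2]
    hits += 0.25 < median < 0.75
estimate = hits / trials            # should be near 203/256 ≈ 0.793
```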
49. P{X(1) ≤ a} = 1 − e^{−5a}
P{X(5) ≤ a} = (1 − e^{−a})⁵
50. It is uniform on (s_{n−1}, 1).
51.
Start with
f_{Z₁,Z₂}(z₁, z₂) = (1/(2π)) e^{−(z₁² + z₂²)/2}
and make the transformation x = z₁, y = z₁ + z₂ to obtain
f_{X,Y}(x, y) = (1/(2π)) e^{−[x² + (y − x)²]/2}
51. f_{X(1),X(4)}(x, y) = (4!/2!)(2x)(∫_x^y 2z dz)²(2y) = 48xy(y² − x²)², x < y
P{X(4) − X(1) ≤ a} = ∫_0^{1−a} ∫_x^{a+x} 48xy(y² − x²)² dy dx + ∫_{1−a}^1 ∫_x^1 48xy(y² − x²)² dy dx
52. f_{R,Θ}(r, θ) = 2r · (1/(2π)), 0 ≤ r ≤ 1, 0 < θ < 2π.
Hence, R and Θ are independent with Θ being uniformly distributed on (0, 2π) and R having density fR(r) = 2r, 0 < r < 1.
53. f_{R,Θ}(r, θ) = r, 0 < r sin θ < 1, 0 < r cos θ < 1, 0 < θ < π/2, 0 < r < √2
54. With x = √(2z) cos u, y = √(2z) sin u,
J = det[ −√(2z) sin u   (2z)^{−1/2} cos u ; √(2z) cos u   (2z)^{−1/2} sin u ]
so |J| = sin²u + cos²u = 1. Hence
f_{X,Y}(x, y) = f_{U,Z}(u, z) = (1/(2π)) e^{−z}. But x² + y² = 2z, so
f_{X,Y}(x, y) = (1/(2π)) e^{−(x² + y²)/2}
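The transformation above is the Box-Muller sampling method, so it can be exercised directly; a sketch (the function name is illustrative, not from the text):

```python
import math
import random

# Box-Muller: with U uniform on (0, 2*pi) and Z exponential with rate 1,
# X = sqrt(2Z)cos(U) and Y = sqrt(2Z)sin(U) are independent standard
# normals. Check mean ~ 0 and variance ~ 1 empirically.
def box_muller(rng: random.Random) -> tuple[float, float]:
    u = rng.uniform(0, 2 * math.pi)
    z = rng.expovariate(1.0)
    r = math.sqrt(2 * z)
    return r * math.cos(u), r * math.sin(u)

rng = random.Random(7)
xs = [box_muller(rng)[0] for _ in range(200_000)]
mean = sum(xs) / len(xs)                 # should be near 0
var = sum(x * x for x in xs) / len(xs)   # should be near 1
```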
55. With u = xy, v = x/y: y = √(u/v), x = √(vu). Hence
J = det[ ∂u/∂x  ∂u/∂y ; ∂v/∂x  ∂v/∂y ] = det[ y  x ; 1/y  −x/y² ] = −2x/y, so |J| = 2x/y = 2v.
(b) f_{U,V}(u, v) = (1/(2v)) f_{X,Y}(√(vu), √(u/v)) = 1/(2vu²), u ≥ 1, 1/u < v < u
fU(u) = ∫_{1/u}^u (1/(2vu²)) dv = (log u)/u², u ≥ 1.
For v > 1:
fV(v) = ∫_v^∞ (1/(2vu²)) du = 1/(2v²), v > 1
For v < 1:
fV(v) = ∫_{1/v}^∞ (1/(2vu²)) du = 1/2, 0 < v < 1.
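Two consequences of the marginal of V are easy to check by simulation, assuming (as the algebra above does) that X and Y are i.i.d. with density 1/x² on x > 1, which can be sampled as 1/U with U uniform (0, 1):

```python
import random

# For V = X/Y: f_V(v) = 1/2 on 0 < v < 1 gives P{V <= 1} = 1/2, and
# f_V(v) = 1/(2v^2) for v > 1 gives P{V > 2} = 1/4.
random.seed(8)
trials = 200_000
vs = [(1 / random.random()) / (1 / random.random()) for _ in range(trials)]
p_le_1 = sum(v <= 1 for v in vs) / trials    # should be near 1/2
p_gt_2 = sum(v > 2 for v in vs) / trials     # should be near 1/4
```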
56. (a) u = x + y, v = x/y ⇒ y = u/(v + 1), x = uv/(v + 1)
J = det[ 1  1 ; 1/y  −x/y² ] = −(x + y)/y², so |J| = (x + y)/y² = (v + 1)²/u
f_{U,V}(u, v) = u/(v + 1)², 0 < uv < 1 + v, 0 < u < 1 + v

58. y₁ = x₁ + x₂, y₂ = e^{x₁}. J = det[ 1  1 ; e^{x₁}  0 ] = −e^{x₁}, so |J| = e^{x₁} = y₂
f_{Y₁,Y₂}(y₁, y₂) = (1/y₂) f_{X₁,X₂}(log y₂, y₁ − log y₂)
59.
u = x + y, v = x + z, w = y + z ⇒ x = (u + v − w)/2, y = (u + w − v)/2, z = (v + w − u)/2
J = det[ 1 1 0 ; 1 0 1 ; 0 1 1 ] = −2, so |J| = 2
f(u, v, w) = (1/2) f_{X,Y,Z}((u + v − w)/2, (u + w − v)/2, (v + w − u)/2)
60. The joint mass function equals k!(n − k)!/n! when the stated condition holds, and 0 otherwise.
Thus, the joint mass function is symmetric, which proves the result.
61.
Theoretical Exercises
1.
2.
P{Xᵢ = rᵢ, i = 1, …, n} = P{Σᵢ rᵢ events} · ((Σᵢ rᵢ)!/(r₁!⋯rₙ!)) p₁^{r₁}⋯pₙ^{rₙ}
= (e^{−λ} λ^{r₁+⋯+rₙ}/(Σᵢ rᵢ)!) ((Σᵢ rᵢ)!/(r₁!⋯rₙ!)) p₁^{r₁}⋯pₙ^{rₙ}
= Π_{i=1}^n e^{−λpᵢ}(λpᵢ)^{rᵢ}/rᵢ!   (using Σᵢ pᵢ = 1)
3.
Throw a needle on a table, ruled with equidistant parallel lines a distance D apart, a large number of times. Let L, L < D, denote the length of the needle. Now estimate π by 2L/(fD), where f is the fraction of times the needle intersects one of the lines.
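This Buffon-needle estimator can be sketched in a few lines (dropping a needle by sampling its center's distance to the nearest line and its acute angle with the lines):

```python
import math
import random

# Buffon's needle: a needle of length L crosses a line when the distance
# from its center to the nearest line is <= (L/2)*sin(theta). The crossing
# fraction f estimates 2L/(pi*D), so pi is estimated by 2L/(f*D).
random.seed(9)
L, D = 1.0, 2.0
trials = 200_000
crossings = 0
for _ in range(trials):
    center = random.uniform(0, D / 2)          # distance to nearest line
    theta = random.uniform(0, math.pi / 2)     # acute angle with the lines
    crossings += center <= (L / 2) * math.sin(theta)
f = crossings / trials
pi_estimate = 2 * L / (f * D)                  # should be near pi
```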
5.
(a) For a > 0,
FZ(a) = P{X ≤ aY} = ∫_0^∞ ∫_0^{ay} fX(x)fY(y) dx dy = ∫_0^∞ FX(ay)fY(y) dy
fZ(a) = ∫_0^∞ y fX(ay)fY(y) dy
(b) FZ(a) = P{XY ≤ a} = ∫_0^∞ ∫_0^{a/y} fX(x)fY(y) dx dy = ∫_0^∞ FX(a/y)fY(y) dy
fZ(a) = ∫_0^∞ (1/y) fX(a/y)fY(y) dy
If X is exponential with rate λ and Y is exponential with rate μ, then (a) and (b) reduce to
(a) fZ(a) = ∫_0^∞ y λe^{−λay} μe^{−μy} dy
(b) fZ(a) = ∫_0^∞ (1/y) λe^{−λa/y} μe^{−μy} dy
F_{X+Y}(t) = ∫∫_{x+y≤t} f_{X,Y}(x, y) dy dx = ∫_{−∞}^∞ ∫_{−∞}^{t−x} f_{X,Y}(x, y) dy dx
f_{X+Y}(t) = (d/dt) ∫_{−∞}^∞ ∫_{−∞}^{t−x} f_{X,Y}(x, y) dy dx = ∫_{−∞}^∞ (d/dt) ∫_{−∞}^{t−x} f_{X,Y}(x, y) dy dx = ∫_{−∞}^∞ f_{X,Y}(x, t − x) dx
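The convolution formula can be checked numerically; a sketch for independent rate-1 exponentials, where the exact sum density is the gamma(2, 1) density t·e^{−t}:

```python
import math

# Numeric check of f_{X+Y}(t) = integral of f(x, t - x) dx for
# independent exponentials with rate 1, via a Riemann sum.
def f(x: float, y: float) -> float:
    """Joint density of two independent rate-1 exponentials."""
    return math.exp(-x) * math.exp(-y) if x > 0 and y > 0 else 0.0

t = 1.7
dx = 1e-4
conv = sum(f(x, t - x) * dx for x in (i * dx for i in range(int(t / dx))))
exact = t * math.exp(-t)            # gamma(2, 1) density at t
```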
7.
(a) fcX(a) = (1/c) fX(a/c) = (λ/c) e^{−λa/c} (λa/c)^{t−1}/Γ(t)
Hence, cX is gamma with parameters (t, λ/c).
(b) A chi-squared random variable with 2n degrees of freedom can be regarded as being the sum of n independent chi-squared random variables, each with 2 degrees of freedom (which, by the example cited in the text, is equivalent to an exponential random variable with parameter 1/2). Hence, by the proposition cited, χ²_{2n} is a gamma random variable with parameters (n, 1/2), and the result now follows from part (a).
8.
10.
If we let Xᵢ denote the time between the ith and (i + 1)st failure, i = 0, …, n − 2, then it follows from Exercise 9 that the Xᵢ are independent exponentials with rate 2. Hence, Σ_{i=0}^{n−2} Xᵢ, the amount of time the light can operate, is gamma distributed with parameters (n − 1, 2).
11. I = ∫⋯∫_{x₁<x₂, x₂>x₃, x₃<x₄, x₄>x₅} f(x₁)⋯f(x₅) dx₁⋯dx₅
= ∫⋯∫_{u₁<u₂, u₂>u₃, u₃<u₄, u₄>u₅, 0<uᵢ<1} du₁⋯du₅   (by uᵢ = F(xᵢ), i = 1, …, 5)
= ∫⋯∫ u₂ du₂⋯du₅
= ∫∫∫ ((1 − u₃²)/2) du₃ du₄ du₅
= ∫∫ ([u₄ − u₄³/3]/2) du₄ du₅
= ∫_0^1 ([u² − u⁴/3]/2) du = 2/15
12.
Assume that the joint density factors as shown, and let
Cᵢ = ∫_{−∞}^∞ gᵢ(x) dx, i = 1, …, n
Since the n-fold integral of the joint density function is equal to 1, we obtain
1 = Π_{i=1}^n Cᵢ
Also, integrating out the other variables gives fXⱼ(xⱼ) = gⱼ(xⱼ) Π_{i≠j} Cᵢ = gⱼ(xⱼ)/Cⱼ, and hence
f(x₁, …, xₙ) = Π_{j=1}^n fXⱼ(xⱼ)

13. No. Let Xᵢ = 1 if trial i is a success and 0 otherwise. Then
f_{X|X₁,…,X_{n+m}}(x|x₁, …, x_{n+m}) = P{x₁, …, x_{n+m}|X = x} fX(x)/P{x₁, …, x_{n+m}} = c x^{Σᵢxᵢ}(1 − x)^{n+m−Σᵢxᵢ}
and so, given X₁, …, X_{n+m}, the conditional density of X depends on the outcomes only through Σᵢxᵢ.
14.
Σ_{i=1}^{n−1} p(1 − p)^{i−1} p(1 − p)^{n−i−1} = (n − 1)p²(1 − p)^{n−2}
15.
Let X denote the trial number of the kth success, and let s, s, f, f, s, …, f be an outcome of the first n − 1 trials that contains a total of k − 1 successes and n − k failures. Using that X is a negative binomial random variable, we have
P(s, s, f, f, s, …, f | X = n) = P(s, s, f, f, s, …, f, s)/P{X = n}
= p^k(1 − p)^{n−k} / [C(n − 1, k − 1) p^k(1 − p)^{n−k}]
= 1/C(n − 1, k − 1)
16. P{X = k | X + Y = m} = P{X = k, X + Y = m}/P{X + Y = m} = P{X = k, Y = m − k}/P{X + Y = m}
= C(n, k)p^k(1 − p)^{n−k} C(n, m − k)p^{m−k}(1 − p)^{n−m+k} / [C(2n, m)p^m(1 − p)^{2n−m}]
= C(n, k)C(n, m − k)/C(2n, m)
17.
P(X = n, Y = m) = Σᵢ P(X = n, Y = m | X₂ = i)P(X₂ = i)
= e^{−(λ₁+λ₂+λ₃)} Σ_{i=0}^{min(n,m)} λ₁^{n−i} λ₃^{m−i} λ₂^{i} / [(n − i)!(m − i)!i!]
18.
Starting with
p(i|j) = P(X = i | Y = j) = P(X = i, Y = j)/P(Y = j) and q(j|i) = P(Y = j | X = i) = P(X = i, Y = j)/P(X = i)
we see that
p(i|j)/q(j|i) = P(X = i)/P(Y = j)
20.
P{X₁ = max(X₁, X₂, X₃) | X₁ > X₃} = P{X₁ = max(X₁, X₂, X₃)}/P{X₁ > X₃} = (1/3)/(1/2) = 2/3
P{X₃ < X₁ < X₂ | X₁ > X₃} = P{X₃ < X₁ < X₂}/P{X₁ > X₃} = (1/3!)/(1/2) = 1/3
P{X₁ < X₂ < X₃ | X₂ < X₃} = P{X₁ < X₂ < X₃}/P{X₂ < X₃} = (1/3!)/(1/2) = 1/3
P{X₂ = min(X₁, X₂, X₃) | X₂ < X₃} = P{X₂ = min(X₁, X₂, X₃)}/P{X₂ < X₃} = (1/3)/(1/2) = 2/3
21.
f_{W|N}(w|n) = P{N = n | W = w} fW(w)/P{N = n}
= C e^{−w}(wⁿ/n!) e^{−λw}(λw)^{t−1}
= C₁ e^{−(λ+1)w} w^{n+t−1}
where C and C₁ do not depend on w. Hence, given N = n, W is gamma with parameters (n + t, λ + 1).
22. f_{W|Xᵢ,i=1,…,n}(w|x₁, …, xₙ) = f(x₁, …, xₙ|w) fW(w)/f(x₁, …, xₙ)
= C (Π_{i=1}^n w e^{−wxᵢ}) e^{−λw}(λw)^{t−1}
= K e^{−w(λ + Σᵢxᵢ)} w^{n+t−1}
Hence, the conditional distribution is gamma with parameters (n + t, λ + Σᵢxᵢ).
23.
P{Xᵢⱼ is a saddlepoint}
= P{every element in row i is greater than all elements in column j excluding Xᵢⱼ} · P{Xᵢⱼ is the smallest element in row i}
where the last equality follows as the event that every element in the ith row is greater than all elements in the jth column excluding Xᵢⱼ is clearly independent of the event that Xᵢⱼ is the smallest element in row i. Now each size ordering of the n + m − 1 elements under consideration is equally likely, and so the probability that the m smallest are the ones in row i is 1/C(n + m − 1, m). Hence
P{Xᵢⱼ is a saddlepoint} = (1/C(n + m − 1, m))(1/m) = (m − 1)!(n − 1)!/(n + m − 1)!
and so
P{there is a saddlepoint} = P{∪ᵢ,ⱼ {Xᵢⱼ is a saddlepoint}} = Σᵢ,ⱼ P{Xᵢⱼ is a saddlepoint} = m!n!/(n + m − 1)!
25.
Let Y = max(X₁, …, Xₙ), Z = min(X₁, …, Xₙ).
P{Y ≤ x} = P{Xᵢ ≤ x, i = 1, …, n} = Π_{i=1}^n P{Xᵢ ≤ x} = Fⁿ(x)
P{Z > x} = P{Xᵢ > x, i = 1, …, n} = Π_{i=1}^n P{Xᵢ > x} = [1 − F(x)]ⁿ
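Both identities are easy to confirm empirically; a sketch for n uniform (0, 1) variables, where F(x) = x:

```python
import random

# Check P{max <= x} = x^n and P{min > x} = (1 - x)^n for n independent
# uniform (0, 1) random variables (n and x below are arbitrary choices).
random.seed(10)
n, x, trials = 4, 0.6, 200_000
max_hits = min_hits = 0
for _ in range(trials):
    sample = [random.random() for _ in range(n)]
    max_hits += max(sample) <= x
    min_hits += min(sample) > x
p_max = max_hits / trials           # should be near 0.6**4 = 0.1296
p_min = min_hits / trials           # should be near 0.4**4 = 0.0256
```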
26.
(a) n! ∫_0^{1−(n−1)d} ∫_{x₁+d}^{1−(n−2)d} ⋯ ∫_{x_{n−1}+d}^{1} dxₙ ⋯ dx₁ = [1 − (n − 1)d]ⁿ
(by the substitution yᵢ = xᵢ − (i − 1)d)
(b) 0
27.
F_{X(j)}(x) = Σ_{i=j}^n C(n, i) Fⁱ(x)[1 − F(x)]^{n−i}
f_{X(j)}(x) = Σ_{i=j}^n C(n, i) i F^{i−1}(x)f(x)[1 − F(x)]^{n−i} − Σ_{i=j}^n C(n, i)(n − i) Fⁱ(x)f(x)[1 − F(x)]^{n−i−1}
= Σ_{i=j}^n (n!/((n − i)!(i − 1)!)) F^{i−1}(x)f(x)[1 − F(x)]^{n−i} − Σ_{k=j+1}^n (n!/((n − k)!(k − 1)!)) F^{k−1}(x)f(x)[1 − F(x)]^{n−k}   (by k = i + 1)
= (n!/((n − j)!(j − 1)!)) F^{j−1}(x)f(x)[1 − F(x)]^{n−j}
28.
f_{X(j)}(x) = (n!/((n − j)!(j − 1)!)) F^{j−1}(x)f(x)[1 − F(x)]^{n−j}, so for 2n + 1 uniform (0, 1) random variables,
f_{X(n+1)}(x) = ((2n + 1)!/(n!n!)) xⁿ(1 − x)ⁿ
29.
f_{X(i),X(j)}(xᵢ, xⱼ) = (n!/((i − 1)!1!(j − i − 1)!1!(n − j)!)) F^{i−1}(xᵢ)f(xᵢ)[F(xⱼ) − F(xᵢ)]^{j−i−1} f(xⱼ)[1 − F(xⱼ)]^{n−j}
31.
Let X₁, …, Xₙ be n independent uniform random variables over (0, a). We will show by induction on n that the probability in question is
((a − t)/a)ⁿ, if t ≤ a (and 0 if t > a).
Conditioning on X(n) = s, the induction hypothesis gives conditional probability ((s − t)/s)^{n−1} if t ≤ s and 0 if t > s, so
P = ∫_t^a ((s − t)/s)^{n−1} (ns^{n−1}/aⁿ) ds = (n/aⁿ) ∫_t^a (s − t)^{n−1} ds = ((a − t)/a)ⁿ
which completes the induction. (The above used that f_{X(n)}(s) = ns^{n−1}/aⁿ.)
35.
The Jacobian of the transformation u = x, v = x/y is
J = det[ 1  0 ; 1/y  −x/y² ] = −x/y²
Hence, with x = u, y = u/v,
f_{U,V}(u, v) = (1/|J|) f_{X,Y}(u, u/v) = (|u|/v²)(1/(2π)) e^{−(u² + u²/v²)/2}
Hence,
fV(v) = (1/(2πv²)) ∫_{−∞}^∞ |u| e^{−u²(1 + 1/v²)/2} du
= (1/(πv²)) ∫_0^∞ u e^{−u²/(2σ²)} du, where σ² = v²/(1 + v²)
= (σ²/(πv²)) ∫_0^∞ e^{−y} dy   (by y = u²/(2σ²))
= σ²/(πv²)
= 1/(π(1 + v²))
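The Cauchy conclusion can be checked by simulation: for the standard Cauchy, P{V ≤ 1} = 1/2 + arctan(1)/π = 3/4. A sketch, taking V as the ratio of two independent standard normals:

```python
import math
import random

# Check that V = X/Y for independent standard normals X, Y is standard
# Cauchy, via P{V <= 1} = 1/2 + arctan(1)/pi = 3/4.
random.seed(11)
trials = 200_000
hits = sum(random.gauss(0, 1) / random.gauss(0, 1) <= 1
           for _ in range(trials))
estimate = hits / trials
exact = 0.5 + math.atan(1) / math.pi   # = 0.75
```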