
Chapter 6

Problems
2.

(a) p(0, 0) = (8/13)(7/12) = 14/39
p(0, 1) = p(1, 0) = (8/13)(5/12) = 10/39
p(1, 1) = (5/13)(4/12) = 5/39

(b) p(0, 0, 0) = (8/13)(7/12)(6/11) = 28/143
p(0, 0, 1) = p(0, 1, 0) = p(1, 0, 0) = (8/13)(7/12)(5/11) = 70/429
p(0, 1, 1) = p(1, 0, 1) = p(1, 1, 0) = (8/13)(5/12)(4/11) = 40/429
p(1, 1, 1) = (5/13)(4/12)(3/11) = 5/143
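These values are easy to sanity-check numerically. The following is a minimal sketch, assuming the setup is an urn of 8 white and 5 red balls sampled without replacement, with a draw recorded as 1 when it is red:

import random

urn = [0]*8 + [1]*5                  # 0 = white, 1 = red
trials = 200_000
counts = {}
for _ in range(trials):
    x1, x2 = random.sample(urn, 2)   # two draws without replacement
    counts[(x1, x2)] = counts.get((x1, x2), 0) + 1
for pair in sorted(counts):
    print(pair, counts[pair]/trials)  # ≈ 14/39, 10/39, 10/39, 5/39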

3.

(a) p(0, 0) = (10/13)(9/12) = 15/26


p(0, 1) = p(1, 0) = (10/13)(3/12) = 5/26
p(1, 1) = (3/13)(2/12) = 1/26
(b) p(0, 0, 0) = (10/13)(9/12)(8/11) = 60/143
p(0, 0, 1) = p(0, 1, 0) = p(1, 0, 0) = (10/13)(9/12)(3/11) = 45/286
p(i, j, k) = (3/13)(2/12)(10/11) = 5/143 if i + j + k = 2
p(1, 1, 1) = (3/13)(2/12)(1/11) = 1/286


4.

(a) p(0, 0) = (8/13)^2, p(0, 1) = p(1, 0) = (5/13)(8/13), p(1, 1) = (5/13)^2

(b) p(0, 0, 0) = (8/13)^3
p(i, j, k) = (8/13)^2(5/13) if i + j + k = 1
p(i, j, k) = (8/13)(5/13)^2 if i + j + k = 2

5.

p(0, 0) = (12/13)^3 (11/12)^3
p(0, 1) = p(1, 0) = (12/13)^3 [1 − (11/12)^3]
p(1, 1) = (2/13)[(1/13) + (12/13)(1/13)] + (11/13)(2/13)(1/13)


8.

f_Y(y) = c ∫_{−y}^{y} (y^2 − x^2) e^{−y} dx = (4c/3) y^3 e^{−y}, 0 < y < ∞

∫_0^∞ f_Y(y) dy = 1 ⇒ c = 1/8, and so f_Y(y) = y^3 e^{−y}/6, 0 < y < ∞

f_X(x) = (1/8) ∫_x^∞ (y^2 − x^2) e^{−y} dy = (1/4) e^{−x}(1 + x), upon using ∫ y^2 e^{−y} dy = −y^2 e^{−y} − 2y e^{−y} − 2e^{−y}
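The normalization can be checked numerically; a minimal sketch using the y-marginal computed above (the sum should return 1/c = 8):

import math

h, Y = 0.001, 40.0
total = sum((4/3)*(k*h)**3*math.exp(-k*h)*h for k in range(1, int(Y/h)))
print(total)   # ≈ 8, confirming c = 1/8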

9.

(b) f_X(x) = ∫_0^2 (6/7)(x^2 + xy/2) dy = (6/7)(2x^2 + x)

(c) P{X > Y} = ∫_0^1 ∫_0^x (6/7)(x^2 + xy/2) dy dx = 15/56

(d) P{Y > 1/2 | X < 1/2} = P{Y > 1/2, X < 1/2}/P{X < 1/2}
= ∫_{1/2}^{2} ∫_0^{1/2} (6/7)(x^2 + xy/2) dx dy / ∫_0^{1/2} (6/7)(2x^2 + x) dx

10.

(a) f_X(x) = e^{−x}, f_Y(y) = e^{−y}, 0 < x < ∞, 0 < y < ∞

P{X < Y} = 1/2

(b) P{X < a} = 1 − e^{−a}

11.

(5!/(2! 1! 2!))(.45)^2(.15)(.40)^2

12.

e^{−5} + 5e^{−5} + (5^2/2!)e^{−5} + (5^3/3!)e^{−5}
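Assuming the expression above is P{N ≤ 3} for N Poisson with mean 5, it evaluates as follows (a minimal sketch):

from math import exp, factorial

print(sum(exp(-5.0)*5.0**i/factorial(i) for i in range(4)))   # ≈ 0.2650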


14.

Let X and Y denote, respectively, the locations of the ambulance and the accident at the moment the accident occurs.

P{|Y − X| < a} = P{Y < X < Y + a} + P{X < Y < X + a} = (2/L^2) ∫_0^L ∫_y^{min(y+a, L)} dx dy
= (2/L^2) [∫_0^{L−a} ∫_y^{y+a} dx dy + ∫_{L−a}^{L} ∫_y^{L} dx dy]
= (2/L^2) [a(L − a) + a^2/2]
= 2a/L − (a/L)^2 = 1 − ((L − a)/L)^2, 0 < a < L
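The closed form is easy to confirm by simulation; a minimal sketch, assuming X and Y are independent uniforms on (0, L):

import random

L, a, n = 1.0, 0.3, 200_000
hits = sum(abs(random.uniform(0, L) - random.uniform(0, L)) < a for _ in range(n))
print(hits/n, 1 - ((L - a)/L)**2)   # both ≈ 0.51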

15.

(a) 1 = ∫∫ f(x, y) dy dx = c ∫∫_{(x,y) ∈ R} dy dx = c A(R)

where A(R) is the area of the region R.

(b) f(x, y) = 1/4, −1 ≤ x, y ≤ 1
= f(x)f(y), where f(v) = 1/2, −1 ≤ v ≤ 1.

(c) P{X^2 + Y^2 ≤ 1} = (1/4) ∫∫_{x^2+y^2 ≤ 1} dy dx = (area of circle)/4 = π/4
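Part (c) is the classic Monte Carlo estimate of π/4; a minimal sketch:

import math, random

n = 1_000_000
inside = sum(random.uniform(-1, 1)**2 + random.uniform(-1, 1)**2 <= 1 for _ in range(n))
print(inside/n, math.pi/4)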

16.

(a) A = ∪_i A_i

(b) yes

(c) P(A) = Σ_i P(A_i) = n(1/2)^{n−1}

17.

1/3, since each of the 3 points is equally likely to be the middle one.

18.

P{Y − X > L/3} = ∫∫_{y−x > L/3} f(x) g(y) dy dx

where f(x) = 2/L, 0 < x < L/2, and g(y) = 2/L, L/2 < y < L. Hence,

P{Y − X > L/3} = (4/L^2) [∫_0^{L/6} ∫_{L/2}^{L} dy dx + ∫_{L/6}^{L/2} ∫_{x+L/3}^{L} dy dx]
= (4/L^2) [L^2/12 + 5L^2/24 − 7L^2/72]
= 7/9
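The value 7/9 can be verified by simulation; a minimal sketch, assuming (as the densities above indicate) that X is uniform on (0, L/2) and Y is uniform on (L/2, L):

import random

L, n = 1.0, 200_000
hits = sum(random.uniform(L/2, L) - random.uniform(0, L/2) > L/3 for _ in range(n))
print(hits/n, 7/9)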



19.


(a) ∫_0^1 ∫_0^x (1/x) dy dx = ∫_0^1 dx = 1

(b) f_X(x) = (1/x) ∫_0^x dy = 1, 0 < x < 1, so E[X] = 1/2

(c) f_Y(y) = ∫_y^1 (1/x) dx = −log(y), 0 < y < 1 (and indeed ∫_0^1 −log(y) dy = 1)

(d) Integrating by parts gives
∫_0^1 −y log(y) dy = [−(y^2/2) log(y)]_0^1 + ∫_0^1 (y/2) dy
yielding the result E[Y] = ∫_0^1 −y log(y) dy = 1/4

20.

(a) yes: f_X(x) = xe^{−x}, f_Y(y) = e^{−y}, 0 < x < ∞, 0 < y < ∞

(b) no: f_X(x) = ∫_x^1 f(x, y) dy = 2(1 − x), 0 < x < 1
f_Y(y) = ∫_0^y f(x, y) dx = 2y, 0 < y < 1

21.

(a) We must show that ∫∫ f(x, y) dx dy = 1. Now,
∫∫ f(x, y) dx dy = ∫_0^1 ∫_0^{1−y} 24xy dx dy = ∫_0^1 12y(1 − y)^2 dy = ∫_0^1 12(y − 2y^2 + y^3) dy = 12(1/2 − 2/3 + 1/4) = 1

(b) E[X] = ∫_0^1 x f_X(x) dx = ∫_0^1 ∫_0^{1−x} 24x^2 y dy dx = ∫_0^1 12x^2(1 − x)^2 dx = 2/5

(c) 2/5

22.

(a) No, since the joint density does not factor.

(b) f_X(x) = ∫_0^1 (x + y) dy = x + 1/2, 0 < x < 1.

(c) P{X + Y < 1} = ∫_0^1 ∫_0^{1−x} (x + y) dy dx = ∫_0^1 [x(1 − x) + (1 − x)^2/2] dx = 1/3

23.

(a) yes
f_X(x) = 12x(1 − x) ∫_0^1 y dy = 6x(1 − x), 0 < x < 1
f_Y(y) = 12y ∫_0^1 x(1 − x) dx = 2y, 0 < y < 1

(b) E[X] = ∫_0^1 6x^2(1 − x) dx = 1/2

(c) E[Y] = ∫_0^1 2y^2 dy = 2/3

(d) Var(X) = ∫_0^1 6x^3(1 − x) dx − 1/4 = 1/20

(e) Var(Y) = ∫_0^1 2y^3 dy − 4/9 = 1/18

24.

(a) P{N = n} = p_0^{n−1}(1 − p_0)

(b) P{X = j} = p_j/(1 − p_0)

(c) P{N = n, X = j} = p_0^{n−1} p_j

25.

e^{−1}/i!, by the Poisson approximation to the binomial.

26.

(a) F_{A,B,C}(a, b, c) = abc, 0 < a, b, c < 1

(b) The roots will be real if B^2 ≥ 4AC. Now
P{AC ≤ x} = ∫∫_{ac ≤ x, 0 < a, c < 1} da dc = ∫_0^x ∫_0^1 dc da + ∫_x^1 ∫_0^{x/a} dc da = x − x log x.

Hence, F_{AC}(x) = x − x log x and so
f_{AC}(x) = −log x, 0 < x < 1

P{B^2 ≥ 4AC} = ∫_0^1 ∫_0^{b^2/4} (−log x) dx db = ∫_0^1 [b^2/4 − (b^2/4) log(b^2/4)] db = log 2/6 + 5/36

where the above uses the identity ∫ x^2 log x dx = x^3 log x/3 − x^3/9.

27.

P{X_1/X_2 < a} = ∫_0^∞ ∫_0^{ay} λ_1 e^{−λ_1 x} λ_2 e^{−λ_2 y} dx dy = ∫_0^∞ (1 − e^{−λ_1 a y}) λ_2 e^{−λ_2 y} dy = 1 − λ_2/(λ_1 a + λ_2) = λ_1 a/(λ_1 a + λ_2)

P{X_1/X_2 < 1} = λ_1/(λ_1 + λ_2)

28.

(a) (1/2)e^{−t}, since e^{−t} is the probability that AJ is still in service when MJ arrives, and 1/2 is the conditional probability that MJ then finishes first.

(b) Using that the time at which MJ finishes is gamma with parameters 2, 1 yields the result 1 − 3e^{−2}.
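The closed form in Problem 27 is easy to check by simulation; a minimal sketch (λ_1, λ_2, and a are arbitrary test values):

import random

lam1, lam2, a, n = 2.0, 3.0, 1.5, 200_000
hits = sum(random.expovariate(lam1)/random.expovariate(lam2) < a for _ in range(n))
print(hits/n, lam1*a/(lam1*a + lam2))   # both ≈ 0.5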
29.

(a) If W = X_1 + X_2 is the sales over the next two weeks, then W is normal with mean 4,400 and standard deviation √(2(230)^2) ≈ 325.27. Hence, with Z being a standard normal,

P(W ≥ 5000) = P((W − 4400)/325.27 ≥ (5000 − 4400)/325.27) ≈ P{Z ≥ 1.8446} ≈ .0326

(b) P{X > 2000} = P{Z > (2000 − 2200)/230} = P{Z > −.87} = P{Z < .87} ≈ .8078

Hence, the probability that weekly sales exceed 2000 in at least 2 of the next 3 weeks is p^3 + 3p^2(1 − p), where p ≈ .8078. We have assumed that the weekly sales are independent.
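These normal probabilities can be reproduced with the standard normal cdf Φ(x) = (1 + erf(x/√2))/2; a minimal sketch of the arithmetic:

from math import erf, sqrt

def Phi(x):
    return 0.5*(1 + erf(x/sqrt(2)))

print(1 - Phi(600/sqrt(2*230**2)))   # (a) ≈ 0.0326
p = Phi(0.87)                        # ≈ 0.8078
print(p**3 + 3*p**2*(1 - p))         # (b) ≈ 0.90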

30.

Let X denote Jill's score and let Y denote Jack's score. Also, let Z denote a standard normal random variable.

(a) P{Y > X} = P{Y − X > 0} ≈ P{Y − X > .5}
= P{(Y − X − (160 − 170))/√((20)^2 + (15)^2) > (.5 − (160 − 170))/√((20)^2 + (15)^2)}
≈ P{Z > .42} ≈ .3372

(b) P{X + Y > 350} ≈ P{X + Y > 350.5}
= P{(X + Y − 330)/√((20)^2 + (15)^2) > 20.5/√((20)^2 + (15)^2)}
≈ P{Z > .82} ≈ .2061

31.

Let X and Y denote, respectively, the number of males and females in the sample that never eat breakfast. Since

E[X] = 50.4, Var(X) = 37.6992, E[Y] = 47.2, Var(Y) = 36.0608

it follows from the normal approximation to the binomial that X is approximately distributed as a normal random variable with mean 50.4 and variance 37.6992, and that Y is approximately distributed as a normal random variable with mean 47.2 and variance 36.0608. Let Z be a standard normal random variable.

(a) P{X + Y ≥ 110} = P{X + Y ≥ 109.5}
= P{(X + Y − 97.6)/√73.76 ≥ (109.5 − 97.6)/√73.76}
≈ P{Z ≥ 1.3856} ≈ .0829

(b) P{Y ≥ X} = P{Y − X ≥ −.5}
= P{(Y − X − (−3.2))/√73.76 ≥ (−.5 − (−3.2))/√73.76}
≈ P{Z ≥ .3144} ≈ .3766

32.

(a) C(6, 3)(1/2)^6 = 5/16

(b) P(S_4 ≥ 420) = P((S_4 − 400)/√100 ≥ (420 − 400)/√100) = P(Z ≥ 2) ≈ .0228

33.


(a) e^{−2}

(b) 1 − e^{−2} − 2e^{−2} = 1 − 3e^{−2}

The number of typographical errors on each page should approximately be Poisson distributed, and the sum of independent Poisson random variables is also a Poisson random variable.

34.

(a) 1 − e^{−2.2} − 2.2e^{−2.2} − e^{−2.2}(2.2)^2/2!

(b) 1 − Σ_{i=0}^{4} e^{−4.4}(4.4)^i/i!

(c) 1 − Σ_{i=0}^{6} e^{−6.6}(6.6)^i/i!

The reasoning is the same as in Problem 33.
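Assuming parts (a) through (c) ask for more than 2, 4, and 6 events over one, two, and three periods (Poisson means 2.2, 4.4, 6.6), the three answers evaluate as follows (a minimal sketch):

from math import exp, factorial

def tail(lam, k):
    # P{N > k} for N Poisson with mean lam
    return 1 - sum(exp(-lam)*lam**i/factorial(i) for i in range(k + 1))

print(tail(2.2, 2), tail(4.4, 4), tail(6.6, 6))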


35.

(a) P{X_1 = 1 | X_2 = 1} = 5/13 = 1 − P{X_1 = 0 | X_2 = 1}


(b) same as in (a)

36.

(a) P{Y_1 = 1 | Y_2 = 1} = 2/12 = 1 − P{Y_1 = 0 | Y_2 = 1}


(b) P{Y_1 = 1 | Y_2 = 0} = 3/12 = 1 − P{Y_1 = 0 | Y_2 = 0}

37.

(a) P{Y_1 = 1 | Y_2 = 1} = p(1, 1)/[1 − (12/13)^3] = 1 − P{Y_1 = 0 | Y_2 = 1}

(b) P{Y_1 = 1 | Y_2 = 0} = p(1, 0)/(12/13)^3 = 1 − P{Y_1 = 0 | Y_2 = 0}

where p(1, 1) and p(1, 0) are given in the solution to Problem 5.

38.

(a) P{X = j, Y = i} = (1/5)(1/j), j = 1, ..., 5, i = 1, ..., j

(b) P{X = j | Y = i} = (1/(5j)) / Σ_{k=i}^{5} 1/(5k) = (1/j) / Σ_{k=i}^{5} 1/k, 5 ≥ j ≥ i

(c) No.


40.

For j = i: P{Y = i | X = i} = P{Y = i, X = i}/P{X = i} = (1/36)/P{X = i}

For j < i: P{Y = j | X = i} = (2/36)/P{X = i}

Hence,

1 = Σ_{j=1}^{i} P{Y = j | X = i} = [2(i − 1)/36 + 1/36]/P{X = i}

and so P{X = i} = (2i − 1)/36 and

P{Y = j | X = i} = 1/(2i − 1) if j = i, and = 2/(2i − 1) if j < i
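The mass function P{X = i} = (2i − 1)/36 is that of the larger of two fair dice; assuming that is the setup here, a minimal simulation check:

import random
from collections import Counter

n = 200_000
counts = Counter(max(random.randint(1, 6), random.randint(1, 6)) for _ in range(n))
for i in range(1, 7):
    print(i, counts[i]/n, (2*i - 1)/36)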
41.

(a) f_{X|Y}(x|y) = x e^{−x(y+1)} / ∫_0^∞ x e^{−x(y+1)} dx = (y + 1)^2 x e^{−x(y+1)}, 0 < x

(b) f_{Y|X}(y|x) = x e^{−x(y+1)} / ∫_0^∞ x e^{−x(y+1)} dy = x e^{−xy}, 0 < y

(c) P{XY < a} = ∫_0^∞ ∫_0^{a/x} x e^{−x(y+1)} dy dx = ∫_0^∞ (1 − e^{−a}) e^{−x} dx = 1 − e^{−a}

f_{XY}(a) = e^{−a}, 0 < a
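The conclusion that XY is exponential with rate 1 can be simulated by drawing X from its marginal density e^{−x} and then Y from the conditional density x e^{−xy} found in part (b); a minimal sketch:

import math, random

n, hits = 200_000, 0
for _ in range(n):
    x = random.expovariate(1.0)   # X has marginal density e^{-x}
    y = random.expovariate(x)     # f_{Y|X}(y|x) = x e^{-xy}
    hits += (x*y < 1.0)
print(hits/n, 1 - math.exp(-1))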
42.

f_{Y|X}(y|x) = (x^2 − y^2)e^{−x} / ∫_{−x}^{x} (x^2 − y^2)e^{−x} dy = (3/(4x^3))(x^2 − y^2), −x < y < x

F_{Y|X}(y|x) = (3/(4x^3)) ∫_{−x}^{y} (x^2 − t^2) dt = (3/(4x^3))(x^2 y − y^3/3 + 2x^3/3), −x < y < x


43.

f_{Λ|N}(λ|n) = P{N = n | λ} g(λ)/P{N = n} = C_1 e^{−λ} λ^n e^{−tλ}(tλ)^{s−1} = C_2 e^{−(t+1)λ} λ^{n+s−1}

where C_1 and C_2 do not depend on λ. But from the preceding we can conclude that the conditional density is the gamma density with parameters t + 1 and n + s. The conditional expected number of accidents that the insured will have next year is just the expectation of this distribution, and is thus equal to (n + s)/(t + 1).
44.

P{X_1 > X_2 + X_3} + P{X_2 > X_1 + X_3} + P{X_3 > X_1 + X_2} = 3P{X_1 > X_2 + X_3}

= 3 ∫∫∫_{x_1 > x_2 + x_3, 0 < x_i < 1} dx_1 dx_2 dx_3   (take a = 0, b = 1)

= 3 ∫_0^1 ∫_0^{1−x_3} ∫_{x_2+x_3}^{1} dx_1 dx_2 dx_3 = 3 ∫_0^1 ∫_0^{1−x_3} (1 − x_2 − x_3) dx_2 dx_3

= 3 ∫_0^1 [(1 − x_3)^2/2] dx_3 = 1/2
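With a = 0, b = 1 the answer 1/2 is easily confirmed by simulation; a minimal sketch:

import random

n, hits = 200_000, 0
for _ in range(n):
    x1, x2, x3 = random.random(), random.random(), random.random()
    hits += (x1 > x2 + x3) or (x2 > x1 + x3) or (x3 > x1 + x2)
print(hits/n)   # ≈ 0.5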

45.

f_{X(3)}(x) = (5!/(2!2!)) (∫_0^x y e^{−y} dy)^2 x e^{−x} (∫_x^∞ y e^{−y} dy)^2
= 30(x + 1)^2 e^{−2x} x e^{−x} [1 − e^{−x}(x + 1)]^2

46.

((L − 2d)/L)^3

47.

∫_{1/4}^{3/4} f_{X(3)}(x) dx = (5!/(2!2!)) ∫_{1/4}^{3/4} x^2 (1 − x)^2 dx

48.

(a) P{min X_i ≤ a} = 1 − P{min X_i > a} = 1 − ∏_i P{X_i > a} = 1 − e^{−5λa}

(b) P{max X_i ≤ a} = ∏_i P{X_i ≤ a} = (1 − e^{−λa})^5
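Both formulas in Problem 48 can be checked by simulation; a minimal sketch, assuming the X_i are five independent exponentials with common rate λ:

import math, random

lam, a, n = 1.0, 0.4, 100_000
min_hits = sum(min(random.expovariate(lam) for _ in range(5)) <= a for _ in range(n))
max_hits = sum(max(random.expovariate(lam) for _ in range(5)) <= a for _ in range(n))
print(min_hits/n, 1 - math.exp(-5*lam*a))
print(max_hits/n, (1 - math.exp(-lam*a))**5)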

49.

It is uniform on (s_{n−1}, 1).

50.

Start with

f_{Z_1,Z_2}(z_1, z_2) = (1/(2π)) e^{−(z_1^2 + z_2^2)/2}

Making the transformation, and using that its Jacobian is 1, yields

f_{X,Y}(x, y) = f_{Z_1,Z_2}(x, y − x) = (1/(2π)) e^{−[x^2 + (y − x)^2]/2}

51.

f_{X(1),X(4)}(x, y) = (4!/2!) 2x (∫_x^y 2z dz)^2 2y = 48xy(y^2 − x^2)^2, x < y

P{X(4) − X(1) ≤ a} = ∫_0^{1−a} ∫_x^{a+x} 48xy(y^2 − x^2)^2 dy dx + ∫_{1−a}^{1} ∫_x^{1} 48xy(y^2 − x^2)^2 dy dx

52.

f_{R,Θ}(r, θ) = 2r (1/(2π)), 0 ≤ r ≤ 1, 0 < θ < 2π.

Hence, R and Θ are independent, with Θ uniformly distributed on (0, 2π) and R having density f_R(r) = 2r, 0 < r < 1.

53.

f_{R,Θ}(r, θ) = r, 0 < r sin θ < 1, 0 < r cos θ < 1, 0 < θ < π/2, 0 < r < √2

54.

With x = (2z)^{1/2} cos u, y = (2z)^{1/2} sin u,

J = det[−(2z)^{1/2} sin u, (2z)^{−1/2} cos u; (2z)^{1/2} cos u, (2z)^{−1/2} sin u] = −sin^2 u − cos^2 u = −1, so |J| = 1

f_{U,Z}(u, z) = f_{X,Y}(x, y) = (1/(2π)) e^{−(x^2 + y^2)/2}. But x^2 + y^2 = 2z, so f_{U,Z}(u, z) = (1/(2π)) e^{−z}.
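This calculation is the basis of the Box-Muller method: drawing U uniform on (0, 2π) and Z exponential with rate 1 and inverting the transformation yields two independent standard normals. A minimal sketch:

import math, random

def box_muller():
    u = random.uniform(0.0, 2*math.pi)   # U uniform on (0, 2π)
    z = random.expovariate(1.0)          # Z exponential(1), so x² + y² = 2z
    r = math.sqrt(2*z)
    return r*math.cos(u), r*math.sin(u)  # two independent N(0, 1) samples

xs = [box_muller()[0] for _ in range(100_000)]
print(sum(xs)/len(xs), sum(x*x for x in xs)/len(xs))   # ≈ 0 and ≈ 1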


55.

(a) If u = xy, v = x/y, then

J = det[y, x; 1/y, −x/y^2] = −2x/y, so |J| = 2x/y = 2v

and y = (u/v)^{1/2}, x = (uv)^{1/2}. Hence,

f_{U,V}(u, v) = (1/(2v)) f_{X,Y}((uv)^{1/2}, (u/v)^{1/2})

(b) With f_{X,Y}(x, y) = 1/(x^2 y^2), x ≥ 1, y ≥ 1, this gives

f_{U,V}(u, v) = 1/(2u^2 v), u ≥ 1, 1/u < v < u

f_U(u) = ∫_{1/u}^{u} 1/(2u^2 v) dv = (1/u^2) log u, u ≥ 1.

For v > 1: f_V(v) = ∫_v^∞ 1/(2u^2 v) du = 1/(2v^2), v > 1

For v < 1: f_V(v) = ∫_{1/v}^∞ 1/(2u^2 v) du = 1/2, 0 < v < 1.

56.

(a) u = x + y, v = x/y ⇒ y = u/(v + 1), x = uv/(v + 1)

J = det[1, 1; 1/y, −x/y^2] = −(x/y^2) − 1/y = −(x + y)/y^2 = −(v + 1)^2/u

f_{U,V}(u, v) = (u/(v + 1)^2) f_{X,Y}(uv/(v + 1), u/(v + 1)) = u/(v + 1)^2, 0 < uv < 1 + v, 0 < u < 1 + v

58.

y_1 = x_1 + x_2, y_2 = e^{x_1}. J = det[1, 1; e^{x_1}, 0] = −e^{x_1}, so |J| = e^{x_1} = y_2

x_1 = log y_2, x_2 = y_1 − log y_2

f_{Y_1,Y_2}(y_1, y_2) = (1/y_2) λe^{−λ log y_2} λe^{−λ(y_1 − log y_2)} = (λ^2/y_2) e^{−λ y_1}, 1 ≤ y_2, y_1 ≥ log y_2


59.


u = x + y, v = x + z, w = y + z ⇒ z = (v + w − u)/2, x = (u + v − w)/2, y = (u + w − v)/2

J = det[1, 1, 0; 1, 0, 1; 0, 1, 1] = −2, so |J| = 2

f(u, v, w) = (1/2) exp{−(u + v + w)/2}, u + v > w, u + w > v, v + w > u

60.

P{Y_j = i_j, j = 1, ..., k + 1} = P{Y_j = i_j, j = 1, ..., k} P{Y_{k+1} = i_{k+1} | Y_j = i_j, j = 1, ..., k}

= [k!(n − k)!/n!] P{n + 1 − Σ_{i=1}^{k} Y_i = i_{k+1} | Y_j = i_j, j = 1, ..., k}

= k!(n − k)!/n! if Σ_{j=1}^{k+1} i_j = n + 1, and 0 otherwise

Thus, the joint mass function is symmetric, which proves the result.
61.

The joint mass function is

P{X_i = x_i, i = 1, ..., n} = 1/C(n, k), x_i ∈ {0, 1}, i = 1, ..., n, Σ_{i=1}^{n} x_i = k

As this is symmetric in x_1, ..., x_n, the result follows.


Theoretical Exercises
1.

P{X ≤ a_2, Y ≤ b_2} = P{a_1 < X ≤ a_2, b_1 < Y ≤ b_2}
+ P{X ≤ a_1, b_1 < Y ≤ b_2}
+ P{a_1 < X ≤ a_2, Y ≤ b_1}
+ P{X ≤ a_1, Y ≤ b_1}

The above follows as the left-hand event is the union of the 4 mutually exclusive right-hand events. Also,

P{X ≤ a_1, Y ≤ b_2} = P{X ≤ a_1, b_1 < Y ≤ b_2} + P{X ≤ a_1, Y ≤ b_1}

and similarly,

P{X ≤ a_2, Y ≤ b_1} = P{a_1 < X ≤ a_2, Y ≤ b_1} + P{X ≤ a_1, Y ≤ b_1}

Hence, from the above,

F(a_2, b_2) = P{a_1 < X ≤ a_2, b_1 < Y ≤ b_2} + F(a_1, b_2) − F(a_1, b_1) + F(a_2, b_1) − F(a_1, b_1) + F(a_1, b_1)

2.

Let X_i denote the number of type i events, i = 1, ..., n.

P{X_1 = r_1, ..., X_n = r_n} = P{X_1 = r_1, ..., X_n = r_n | Σ_1^n r_i events} P{Σ_1^n r_i events}

= [(Σ_1^n r_i)!/(r_1! ⋯ r_n!)] p_1^{r_1} ⋯ p_n^{r_n} e^{−λ} λ^{Σ_1^n r_i}/(Σ_1^n r_i)!

= ∏_{i=1}^{n} e^{−λ p_i} (λ p_i)^{r_i}/r_i!

3.

Throw a needle on a table ruled with equidistant parallel lines a distance D apart a large number of times. Let L, L < D, denote the length of the needle. Then estimate π by 2L/(fD), where f is the fraction of times the needle intersects one of the lines.
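A simulation of the needle experiment shows the estimator in action; a minimal sketch, using the usual uniform choices for the needle center's distance to the nearest line and its acute angle with the lines:

import math, random

def buffon(trials=1_000_000, L=1.0, D=2.0):
    hits = 0
    for _ in range(trials):
        x = random.uniform(0, D/2)            # center's distance to nearest line
        theta = random.uniform(0, math.pi/2)  # acute angle with the lines
        hits += (x <= (L/2)*math.sin(theta))
    return 2*L/((hits/trials)*D)

print(buffon())   # ≈ 3.14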

5.

(a) For a > 0,

F_Z(a) = P{X/Y ≤ a} = P{X ≤ aY} = ∫_0^∞ ∫_0^{ay} f_X(x) f_Y(y) dx dy = ∫_0^∞ F_X(ay) f_Y(y) dy

f_Z(a) = ∫_0^∞ y f_X(ay) f_Y(y) dy

(b) F_Z(a) = P{XY ≤ a} = ∫_0^∞ ∫_0^{a/y} f_X(x) f_Y(y) dx dy = ∫_0^∞ F_X(a/y) f_Y(y) dy

f_Z(a) = ∫_0^∞ (1/y) f_X(a/y) f_Y(y) dy

If X is exponential with rate λ and Y is exponential with rate μ, then (a) and (b) reduce to

(a) F_Z(a) = ∫_0^∞ (1 − e^{−λay}) μ e^{−μy} dy

(b) F_Z(a) = ∫_0^∞ (1 − e^{−λa/y}) μ e^{−μy} dy

6.
F_{X+Y}(t) = ∫∫_{x+y ≤ t} f_{X,Y}(x, y) dy dx = ∫_{−∞}^{∞} ∫_{−∞}^{t−x} f_{X,Y}(x, y) dy dx

Differentiation yields

f_{X+Y}(t) = (d/dt) ∫_{−∞}^{∞} ∫_{−∞}^{t−x} f_{X,Y}(x, y) dy dx = ∫_{−∞}^{∞} (d/dt) ∫_{−∞}^{t−x} f_{X,Y}(x, y) dy dx = ∫_{−∞}^{∞} f_{X,Y}(x, t − x) dx

7.


(a) P{cX ≤ a} = P{X ≤ a/c}, and differentiation yields

f_{cX}(a) = (1/c) f_X(a/c) = (λ/c) e^{−λa/c} (λa/c)^{t−1}/Γ(t)

Hence, cX is gamma with parameters (t, λ/c).

(b) A chi-squared random variable with 2n degrees of freedom can be regarded as the sum of n independent chi-squared random variables, each with 2 degrees of freedom (which is equivalent to an exponential random variable with parameter 1/2). Hence, χ^2_{2n} is a gamma random variable with parameters (n, 1/2), and the result now follows from part (a).
8.

(a) P{W ≤ t} = 1 − P{W > t} = 1 − P{X > t, Y > t} = 1 − [1 − F_X(t)][1 − F_Y(t)]

(b) f_W(t) = f_X(t)[1 − F_Y(t)] + f_Y(t)[1 − F_X(t)]

Dividing by [1 − F_X(t)][1 − F_Y(t)] now yields

λ_W(t) = f_X(t)/[1 − F_X(t)] + f_Y(t)/[1 − F_Y(t)] = λ_X(t) + λ_Y(t)


9.

P{min(X_1, ..., X_n) > t} = P{X_1 > t, ..., X_n > t} = e^{−λt} ⋯ e^{−λt} = e^{−nλt}

thus showing that the minimum is exponential with rate nλ.

10.

If we let X_i denote the time between the ith and (i + 1)st failure, i = 0, ..., n − 2, then it follows from Exercise 9 that the X_i are independent exponentials with rate 2λ. Hence, Σ_{i=0}^{n−2} X_i, the amount of time the light can operate, is gamma distributed with parameters (n − 1, 2λ).
11.

I = ∫⋯∫_{x_1 < x_2 > x_3 < x_4 > x_5} f(x_1) ⋯ f(x_5) dx_1 ⋯ dx_5
= ∫⋯∫_{u_1 < u_2 > u_3 < u_4 > u_5, 0 < u_i < 1} du_1 ⋯ du_5   (by u_i = F(x_i), i = 1, ..., 5)
= ∫∫∫∫ u_2 du_2 du_3 du_4 du_5
= ∫∫∫ [(1 − u_3^2)/2] du_3 du_4 du_5
= ∫∫ [u_4 − u_4^3/3]/2 du_5 du_4
= ∫_0^1 [u^2 − u^4/3]/2 du = 2/15
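The value 2/15 = 16/120 is the fraction of "up-down" orderings of five elements; a minimal simulation check, assuming the event is X_1 < X_2 > X_3 < X_4 > X_5 as integrated above:

import random

n, hits = 200_000, 0
for _ in range(n):
    x = [random.random() for _ in range(5)]
    hits += (x[0] < x[1] > x[2] < x[3] > x[4])
print(hits/n, 2/15)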

12.

Assume that the joint density factors as shown, and let

C_i = ∫_{−∞}^{∞} g_i(x) dx, i = 1, ..., n

Since the n-fold integral of the joint density function is equal to 1, we obtain

1 = ∏_{i=1}^{n} C_i

Integrating the joint density over all x_i except x_j gives

f_{X_j}(x_j) = g_j(x_j) ∏_{i≠j} C_i = g_j(x_j)/C_j

It follows from the preceding that

f(x_1, ..., x_n) = ∏_{j=1}^{n} f_{X_j}(x_j)

which shows that the random variables are independent.


13.

No. Let X_i = 1 if trial i is a success, and X_i = 0 otherwise. Then

f_{X | X_1, ..., X_{n+m}}(x | x_1, ..., x_{n+m}) = P{x_1, ..., x_{n+m} | X = x} f_X(x)/P{x_1, ..., x_{n+m}} = c x^{Σ_1^{n+m} x_i} (1 − x)^{n+m−Σ_1^{n+m} x_i}

and so, given Σ_1^{n+m} x_i = n, the conditional density is still beta with parameters n + 1, m + 1.

14.

P{X = i | X + Y = n} = P{X = i, Y = n − i}/P{X + Y = n} = p(1 − p)^{i−1} p(1 − p)^{n−i−1} / [(n − 1) p^2 (1 − p)^{n−2}] = 1/(n − 1)

15.


Let X denote the trial number of the kth success, and let s, s, f, f, s, ..., f be an outcome of the first n − 1 trials that contains a total of k − 1 successes and n − k failures. Using that X is a negative binomial random variable, we have

P(s, s, f, f, s, ..., f | X = n) = P(s, s, f, f, s, ..., f, s)/P{X = n} = p^k (1 − p)^{n−k} / [C(n − 1, k − 1) p^k (1 − p)^{n−k}] = 1/C(n − 1, k − 1)

and the result is proven.


16.

P{X = k | X + Y = m} = P{X = k, X + Y = m}/P{X + Y = m} = P{X = k, Y = m − k}/P{X + Y = m}

= C(n, k) p^k (1 − p)^{n−k} C(n, m − k) p^{m−k} (1 − p)^{n−m+k} / [C(2n, m) p^m (1 − p)^{2n−m}]

= C(n, k) C(n, m − k)/C(2n, m)

17.

P(X = n, Y = m) = Σ_i P(X = n, Y = m | X_2 = i) P(X_2 = i)

= e^{−(λ_1 + λ_2 + λ_3)} Σ_{i=0}^{min(n,m)} [λ_1^{n−i}/(n − i)!] [λ_3^{m−i}/(m − i)!] [λ_2^i/i!]

18.

Starting with

p(i | j) = P(X = i, Y = j)/P(Y = j) and q(j | i) = P(X = i, Y = j)/P(X = i)

we see that

p(i | j) P(Y = j) = q(j | i) P(X = i)

and the result follows.


19.

(a) P{X_1 > X_2 | X_1 > X_3} = P{X_1 = max(X_1, X_2, X_3)}/P{X_1 > X_3} = (1/3)/(1/2) = 2/3

(b) P{X_1 > X_2 | X_1 < X_3} = P{X_3 > X_1 > X_2}/P{X_1 < X_3} = (1/3!)/(1/2) = 1/3

(c) P{X_1 > X_2 | X_2 > X_3} = P{X_1 > X_2 > X_3}/P{X_2 > X_3} = (1/3!)/(1/2) = 1/3

(d) P{X_1 > X_2 | X_2 < X_3} = P{X_2 = min(X_1, X_2, X_3)}/P{X_2 < X_3} = (1/3)/(1/2) = 2/3

20.

P{U > s | U > a} = P{U > s}/P{U > a} = (1 − s)/(1 − a), a < s < 1

P{U < s | U < a} = P{U < s}/P{U < a} = s/a, 0 < s < a

Hence, U | U > a is uniform on (a, 1), whereas U | U < a is uniform over (0, a).

21.

f_{W|N}(w | n) = P{N = n | W = w} f_W(w)/P{N = n} = C e^{−w} (w^n/n!) λe^{−λw} (λw)^{t−1} = C_1 e^{−(λ+1)w} w^{n+t−1}

where C and C_1 do not depend on w. Hence, given N = n, W is gamma with parameters (n + t, λ + 1).


22.


f_{W | X_i, i=1,...,n}(w | x_1, ..., x_n) = f(x_1, ..., x_n | w) f_W(w)/f(x_1, ..., x_n)

= C ∏_{i=1}^{n} [w e^{−w x_i}] λe^{−λw} (λw)^{t−1}

= K e^{−w(λ + Σ_{i=1}^{n} x_i)} w^{n+t−1}

23.

Let X_{ij} denote the element in row i, column j.

P{X_{ij} is a saddlepoint}
= P{min_{k=1,...,m} X_{ik} > max_{k≠i} X_{kj}, X_{ij} = min_k X_{ik}}
= P{min_k X_{ik} > max_{k≠i} X_{kj}} P{X_{ij} = min_k X_{ik}}

where the last equality follows as the event that every element in the ith row is greater than all elements in the jth column excluding X_{ij} is clearly independent of the event that X_{ij} is the smallest element in row i. Now each size ordering of the n + m − 1 elements under consideration is equally likely, so the probability that the m largest are the ones in row i is 1/C(n + m − 1, m). Hence,

P{X_{ij} is a saddlepoint} = 1/[C(n + m − 1, m) m] = (m − 1)!(n − 1)!/(n + m − 1)!

and so

P{there is a saddlepoint} = P(∪_{i,j} {X_{ij} is a saddlepoint}) = Σ_{i,j} P{X_{ij} is a saddlepoint} = m!n!/(n + m − 1)!
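The answer m!n!/(n + m − 1)! can be verified by simulation; a minimal sketch, taking a saddlepoint to be an entry that is the minimum of its row and the maximum of its column, for an n × m matrix of independent uniforms:

import random
from math import factorial

def has_saddlepoint(mat):
    n, m = len(mat), len(mat[0])
    for i in range(n):
        j = min(range(m), key=lambda c: mat[i][c])   # column of row i's minimum
        if mat[i][j] == max(mat[r][j] for r in range(n)):
            return True
    return False

n, m, trials = 3, 4, 100_000
hits = sum(has_saddlepoint([[random.random() for _ in range(m)] for _ in range(n)])
           for _ in range(trials))
print(hits/trials, factorial(m)*factorial(n)/factorial(n + m - 1))   # both ≈ 0.2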

24.

For 0 < x < 1,

P([X] = n, X − [X] < x) = P(n < X < n + x) = e^{−λn} − e^{−λ(n+x)} = e^{−λn}(1 − e^{−λx})

Because the joint distribution factors, [X] and X − [X] are independent. [X] + 1 has a geometric distribution with parameter p = 1 − e^{−λ}, and X − [X] is distributed as an exponential with rate λ conditioned to be less than 1.

25.

Let Y = max(X_1, ..., X_n), Z = min(X_1, ..., X_n).

P{Y ≤ x} = P{X_i ≤ x, i = 1, ..., n} = ∏_{i=1}^{n} P{X_i ≤ x} = F^n(x)

P{Z > x} = P{X_i > x, i = 1, ..., n} = ∏_{i=1}^{n} P{X_i > x} = [1 − F(x)]^n

26.

(a) Let d = D/L. Then the desired probability is

n! ∫_0^{1−(n−1)d} ∫_{x_1+d}^{1−(n−2)d} ⋯ ∫_{x_{n−1}+d}^{1} dx_n dx_{n−1} ⋯ dx_2 dx_1 = [1 − (n − 1)d]^n

(b) 0
27.

F_{X(j)}(x) = Σ_{i=j}^{n} C(n, i) F^i(x)[1 − F(x)]^{n−i}

f_{X(j)}(x) = Σ_{i=j}^{n} C(n, i) i F^{i−1}(x) f(x)[1 − F(x)]^{n−i} − Σ_{i=j}^{n} C(n, i) F^i(x)(n − i)[1 − F(x)]^{n−i−1} f(x)

= Σ_{i=j}^{n} [n!/((n − i)!(i − 1)!)] F^{i−1}(x) f(x)[1 − F(x)]^{n−i} − Σ_{k=j+1}^{n} [n!/((n − k)!(k − 1)!)] F^{k−1}(x) f(x)[1 − F(x)]^{n−k}   (by k = i + 1)

= [n!/((n − j)!(j − 1)!)] F^{j−1}(x) f(x)[1 − F(x)]^{n−j}

28.

f_{X(n+1)}(x) = [(2n + 1)!/(n!n!)] x^n (1 − x)^n

29.


In order for X(i) = x_i, X(j) = x_j, i < j, we must have

(i) i − 1 of the X's less than x_i
(ii) 1 of the X's equal to x_i
(iii) j − i − 1 of the X's between x_i and x_j
(iv) 1 of the X's equal to x_j
(v) n − j of the X's greater than x_j

Hence,

f_{X(i),X(j)}(x_i, x_j) = [n!/((i − 1)!1!(j − i − 1)!1!(n − j)!)] F^{i−1}(x_i) f(x_i)[F(x_j) − F(x_i)]^{j−i−1} f(x_j)[1 − F(x_j)]^{n−j}

31.
(i 1)!1!( j i 1)!1!( n j )!

Let X_1, ..., X_n be n independent uniform random variables over (0, a). We will show by induction on n that

P{X(k) − X(k−1) > t} = ((a − t)/a)^n if t ≤ a, and 0 if t > a

It is immediate when n = 1, so assume it for n − 1. In the n case, consider

P{X(k) − X(k−1) > t | X(n) = s}

Now, given X(n) = s, X(1), ..., X(n−1) are distributed as the order statistics of a set of n − 1 uniform (0, s) random variables. Hence, by the induction hypothesis,

P{X(k) − X(k−1) > t | X(n) = s} = ((s − t)/s)^{n−1} if t ≤ s, and 0 if t > s

and thus, for t < a,

P{X(k) − X(k−1) > t} = ∫_t^a ((s − t)/s)^{n−1} (n s^{n−1}/a^n) ds = ((a − t)/a)^n

which completes the induction. (The above used that f_{X(n)}(s) = n (s/a)^{n−1} (1/a) = n s^{n−1}/a^n.)

32.

(a) P{X > X(n)} = P{X is the largest of the n + 1} = 1/(n + 1)

(b) P{X > X(1)} = P{X is not the smallest of the n + 1} = 1 − 1/(n + 1) = n/(n + 1)

(c) This is the probability that X is either the (i + 1)st or (i + 2)nd or ... or jth smallest of the n + 1 random variables, which is clearly equal to (j − i)/(n + 1).

35.

The Jacobian of the transformation u = x, v = x/y is

J = det[1, 0; 1/y, −x/y^2] = −x/y^2

Hence |J|^{−1} = y^2/x. Therefore, as the solution of the equations u = x, v = x/y is x = u, y = u/v, we see that

f_{U,V}(u, v) = (y^2/x) f_{X,Y}(u, u/v) = (|u|/v^2)(1/(2π)) e^{−(u^2 + u^2/v^2)/2}

Hence,

f_V(v) = (1/(2πv^2)) ∫_{−∞}^{∞} |u| e^{−u^2(1 + 1/v^2)/2} du
= (1/(2πv^2)) ∫_{−∞}^{∞} |u| e^{−u^2/(2σ^2)} du, where σ^2 = v^2/(1 + v^2)
= (1/(πv^2)) ∫_0^∞ u e^{−u^2/(2σ^2)} du
= (σ^2/(πv^2)) ∫_0^∞ e^{−y} dy   (by y = u^2/(2σ^2))
= 1/(π(1 + v^2))
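The density 1/(π(1 + v^2)) is the standard Cauchy, so the ratio of two independent standard normals should satisfy, e.g., P{V ≤ 1} = 1/2 + arctan(1)/π = 3/4; a minimal simulation sketch:

import math, random

n = 200_000
hits = sum(random.gauss(0, 1)/random.gauss(0, 1) <= 1 for _ in range(n))
print(hits/n, 0.5 + math.atan(1)/math.pi)   # both ≈ 0.75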
