Chapter 9

9-1. Since
\[
f(x_i) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left[-\frac{(x_i-\mu)^2}{2\sigma^2}\right],
\]
we have
\begin{align*}
f(x_1, x_2, \ldots, x_5) &= \prod_{i=1}^{5} f(x_i)\\
&= \prod_{i=1}^{5}\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left[-\frac{(x_i-\mu)^2}{2\sigma^2}\right]\\
&= \left(\frac{1}{2\pi\sigma^2}\right)^{5/2}\exp\left[-\frac{1}{2\sigma^2}\sum_{i=1}^{5}(x_i-\mu)^2\right]
\end{align*}
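As a quick numerical sanity check (not part of the original solution), the sketch below compares the product of five normal densities with the closed form above; scipy and the values of $\mu$, $\sigma$, and the sample points are assumptions chosen only for illustration.

```python
# Numerical check of Problem 9-1: the product of five N(mu, sigma^2) densities should
# equal (2*pi*sigma^2)^(-5/2) * exp(-sum((x_i - mu)^2) / (2*sigma^2)).
import numpy as np
from scipy.stats import norm

mu, sigma = 2.0, 1.5                      # illustrative values, not from the text
x = np.array([1.2, 2.7, 0.4, 3.1, 2.0])   # an arbitrary sample of size 5

product_form = np.prod(norm.pdf(x, loc=mu, scale=sigma))
closed_form = (2 * np.pi * sigma**2) ** (-5 / 2) * np.exp(
    -np.sum((x - mu) ** 2) / (2 * sigma**2)
)
print(product_form, closed_form)          # the two values agree to machine precision
```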

9-2. Since
\[
f(x_i) = \lambda e^{-\lambda x_i},
\]
we have
\begin{align*}
f(x_1, x_2, \ldots, x_n) &= \prod_{i=1}^{n} f(x_i)\\
&= \prod_{i=1}^{n}\lambda e^{-\lambda x_i}\\
&= \lambda^{n}\exp\left(-\lambda\sum_{i=1}^{n} x_i\right)
\end{align*}

9-3. Since $f(x_i) = 1$, we have
\[
f(x_1, x_2, x_3, x_4) = \prod_{i=1}^{4} f(x_i) = 1
\]

9-4. The joint probability function for $X_1$ and $X_2$ is
\begin{align*}
p_{X_1,X_2}(0,0) &= \frac{\binom{N-M}{0}\binom{M}{2}}{\binom{N}{2}}\\[4pt]
p_{X_1,X_2}(0,1) &= \frac{\binom{N-M}{1}\binom{M}{1}}{2\binom{N}{2}}\\[4pt]
p_{X_1,X_2}(1,0) &= \frac{\binom{N-M}{1}\binom{M}{1}}{2\binom{N}{2}}\\[4pt]
p_{X_1,X_2}(1,1) &= \frac{\binom{N-M}{2}\binom{M}{0}}{\binom{N}{2}}
\end{align*}
Of course,
\[
p_{X_1}(x_1) = \sum_{x_2=0}^{1} p_{X_1,X_2}(x_1,x_2) \quad\text{and}\quad
p_{X_2}(x_2) = \sum_{x_1=0}^{1} p_{X_1,X_2}(x_1,x_2)
\]
So $p_{X_1}(0) = M/N$, $p_{X_1}(1) = 1 - (M/N)$, $p_{X_2}(0) = M/N$, $p_{X_2}(1) = 1 - (M/N)$.

Thus, $X_1$ and $X_2$ are not independent since
\[
p_{X_1,X_2}(0,0) \neq p_{X_1}(0)\,p_{X_2}(0)
\]
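A small numerical check (not part of the original solution), with assumed illustrative values $N = 10$ and $M = 4$, confirms that the marginals reduce to $M/N$ and that the product of the marginals does not reproduce the joint probability.

```python
# Numerical check of Problem 9-4 with illustrative values N = 10, M = 4 (not from the text).
from math import comb

N, M = 10, 4

def p_joint(x1, x2):
    """Joint pmf of (X1, X2) as reconstructed above."""
    if (x1, x2) == (0, 0):
        return comb(N - M, 0) * comb(M, 2) / comb(N, 2)
    if (x1, x2) == (1, 1):
        return comb(N - M, 2) * comb(M, 0) / comb(N, 2)
    return comb(N - M, 1) * comb(M, 1) / (2 * comb(N, 2))

# Marginals recover M/N, and the pmf sums to 1.
p_x1_0 = p_joint(0, 0) + p_joint(0, 1)
total = sum(p_joint(a, b) for a in (0, 1) for b in (0, 1))
print(p_x1_0, M / N, total)              # 0.4 0.4 1.0
print(p_joint(0, 0), p_x1_0 * p_x1_0)    # 0.1333... != 0.16, so not independent
```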

9-5. $N(\mu, \sigma^2/n) = N(5, 0.00125)$



9-6. $\sigma/\sqrt{n} = 0.1/\sqrt{8} = 0.0353$


9-7. Use the estimated standard error $S/\sqrt{n}$.

9-8. $N(5, 0.22)$

9-9. The standard error of $\bar{X}_1 - \bar{X}_2$ is
\[
\sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}} = \sqrt{\frac{(1.5)^2}{25} + \frac{(2.0)^2}{30}} = 0.473
\]

9-10. $Y = \bar{X}_1 - \bar{X}_2$ is a linear combination of the 55 variables $X_{ij}$, $i = 1$, $j = 1, 2, \ldots, 25$ and $i = 2$, $j = 1, 2, \ldots, 30$. As such, we would expect $Y$ to be very nearly normal with mean $\mu_Y = 0.5$ and variance $(0.473)^2 = 0.223$.
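A Monte Carlo sketch of Problems 9-9 and 9-10 (not part of the original solutions): the population shapes and the individual means below are assumptions; only the standard deviations 1.5 and 2.0, the sample sizes 25 and 30, and a mean difference of 0.5 come from the solutions above. The simulated $Y$ should have mean close to 0.5 and standard deviation close to 0.473.

```python
# Monte Carlo sketch for Problems 9-9 and 9-10 (population shapes/means assumed).
import numpy as np

rng = np.random.default_rng(0)
n1, n2, reps = 25, 30, 200_000

# Exponential populations shifted so that sd1 = 1.5, sd2 = 2.0, and mu1 - mu2 = 0.5.
x1 = rng.exponential(scale=1.5, size=(reps, n1)) + 1.0   # mean 2.5, sd 1.5
x2 = rng.exponential(scale=2.0, size=(reps, n2))         # mean 2.0, sd 2.0

y = x1.mean(axis=1) - x2.mean(axis=1)
print(y.mean(), y.std())   # roughly 0.5 and 0.473
```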

9-11. $N(0, 1)$

9-12. $N(p,\, p(1-p)/n)$
9-13. $se(\hat{p}) = \sqrt{p(1-p)/n}$, $\;\widehat{se}(\hat{p}) = \sqrt{\hat{p}(1-\hat{p})/n}$
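A small numerical illustration of Problem 9-13 (not part of the original solution); the sample size and number of successes are assumed values.

```python
# With n = 100 trials and 37 successes (assumed), phat = 0.37 and the estimated
# standard error of phat is sqrt(phat * (1 - phat) / n).
from math import sqrt

n, successes = 100, 37
phat = successes / n
se_hat = sqrt(phat * (1 - phat) / n)
print(phat, round(se_hat, 4))   # 0.37 0.0483
```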

9-14.
\begin{align*}
M_X(t) = E(e^{tX}) &= \int_0^{\infty} e^{tx}\,\frac{1}{2^{n/2}\Gamma(n/2)}\,x^{(n/2)-1}e^{-x/2}\,dx\\
&= \frac{1}{2^{n/2}\Gamma(n/2)}\int_0^{\infty} x^{(n/2)-1}e^{-x[(1/2)-t]}\,dx
\end{align*}
This integral converges if $1/2 > t$.

Let $u = x[(1/2) - t]$. Then $dx = [(1/2) - t]^{-1}\,du$. Thus,
\begin{align*}
M_X(t) &= \frac{1}{2^{n/2}\Gamma(n/2)}\int_0^{\infty}\frac{u^{(n/2)-1}}{[(1/2)-t]^{(n/2)-1}}\,e^{-u}\,\frac{1}{(1/2)-t}\,du\\
&= \frac{1}{2^{n/2}\Gamma(n/2)\,[(1-2t)/2]^{n/2}}\int_0^{\infty} u^{(n/2)-1}e^{-u}\,du\\
&= \frac{1}{(1-2t)^{n/2}}, \qquad t < 1/2,
\end{align*}
since $\Gamma(n/2) = \int_0^{\infty} u^{(n/2)-1}e^{-u}\,du$.
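A Monte Carlo check of this result (not part of the original solution); the degrees of freedom $n = 6$ and the evaluation point $t = 0.2$ are assumed illustrative values.

```python
# For X ~ chi-square(n), E(exp(t*X)) should equal (1 - 2t)^(-n/2) for t < 1/2.
# The expectation is approximated here by Monte Carlo.
import numpy as np

rng = np.random.default_rng(1)
n, t = 6, 0.2                       # illustrative values, not from the text
x = rng.chisquare(df=n, size=2_000_000)

mgf_mc = np.exp(t * x).mean()
mgf_formula = (1 - 2 * t) ** (-n / 2)
print(mgf_mc, mgf_formula)          # both close to 0.6**(-3) ≈ 4.63
```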

9-15. First of all,
\begin{align*}
M_X'(t) &= n(1-2t)^{-(n/2)-1}\\
M_X''(t) &= n(n+2)(1-2t)^{-(n/2)-2}
\end{align*}
Then
\begin{align*}
E(X) &= M_X'(0) = n\\
E(X^2) &= M_X''(0) = n(n+2)\\
V(X) &= E(X^2) - [E(X)]^2 = 2n
\end{align*}
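A quick check (not part of the original solution) that scipy's chi-square distribution reports mean $n$ and variance $2n$; the degrees of freedom tried are arbitrary.

```python
# Check of Problem 9-15 against scipy: chi-square(n) should have mean n and variance 2n.
from scipy import stats

for n in (2, 5, 10):
    mean, var = stats.chi2.stats(df=n, moments='mv')
    print(n, mean, var)   # prints n, n, 2n
```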

9-16. Let $T = Z/\sqrt{\chi^2_n/n} = Z\sqrt{n/\chi^2_n}$. Now
\[
E(T) = E(Z)\,E\!\left(\sqrt{n/\chi^2_n}\right) = 0, \text{ because } E(Z) = 0.
\]
$V(T) = E(T^2)$, because $E(T) = 0$. Thus,
\[
V(T) = E[Z^2(n/\chi^2_n)] = E(Z^2)\,E(n/\chi^2_n).
\]
Note that $E(Z^2) = V(Z) = 1$, so that
\begin{align*}
V(T) = E(n/\chi^2_n)
&= \int_0^{\infty}\frac{(n/s)}{2^{n/2}\Gamma(n/2)}\,s^{(n/2)-1}e^{-s/2}\,ds\\
&= \frac{n}{2^{n/2}\Gamma(n/2)}\int_0^{\infty} s^{(n/2)-2}e^{-s/2}\,ds\\
&= \frac{n}{2^{(n/2)-1}\Gamma(n/2)}\int_0^{\infty}(2u)^{(n/2)-2}e^{-u}\,du\\
&= \frac{n\,\Gamma(\tfrac{n}{2}-1)}{2\,\Gamma(\tfrac{n}{2})}, \quad\text{if } n > 2\\
&= \frac{n\,\Gamma(\tfrac{n}{2}-1)}{2(\tfrac{n}{2}-1)\Gamma(\tfrac{n}{2}-1)}, \quad\text{if } n > 2\\
&= \frac{n}{n-2}, \quad\text{if } n > 2
\end{align*}
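A quick check (not part of the original solution) against scipy's $t$ distribution, whose variance should equal $n/(n-2)$; the degrees of freedom tried are arbitrary.

```python
# Check of Problem 9-16 against scipy: Var(t_n) should be n/(n-2) for n > 2.
from scipy import stats

for n in (3, 5, 10, 30):
    print(n, stats.t.var(df=n), n / (n - 2))
```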

9-17. $E(F_{m,n}) = E[(\chi^2_m/m)/(\chi^2_n/n)] = E(\chi^2_m/m)\,E(n/\chi^2_n)$.

$E(\chi^2_m/m) = (1/m)E(\chi^2_m) = 1$.

From Problem 9-16, we have $E(n/\chi^2_n) = n/(n-2)$.

Therefore, $E(F_{m,n}) = n/(n-2)$, if $n > 2$.

To find $V(F_{m,n})$, let $X \sim \chi^2_m$ and $Y \sim \chi^2_n$. Then
\[
E(F_{m,n}^2) = (n/m)^2\,E(X^2)\,E(1/Y^2).
\]
Since $E(X^2) = V(X) + [E(X)]^2$ and $X \sim \chi^2_m$, we have $E(X^2) = 2m + m^2$. Now
\begin{align*}
E(1/Y^2) &= \int_0^{\infty}\frac{(1/y^2)}{2^{n/2}\Gamma(n/2)}\,y^{(n/2)-1}e^{-y/2}\,dy\\
&= \frac{1}{2^{n/2}\Gamma(n/2)}\int_0^{\infty} y^{(n/2)-3}e^{-y/2}\,dy\\
&= \frac{1}{2^{n/2}\Gamma(n/2)}\int_0^{\infty} 2(2u)^{(n/2)-3}e^{-u}\,du\\
&= \frac{1}{(n-2)(n-4)}, \quad\text{if } n > 4
\end{align*}
Thus,
\begin{align*}
V(F_{m,n}) &= (n/m)^2\,E(X^2)\,E(1/Y^2) - \left(\frac{n}{n-2}\right)^2\\
&= \frac{n^2(2m+m^2)}{m^2(n-2)(n-4)} - \frac{n^2}{(n-2)^2} = \frac{2n^2(m+n-2)}{m(n-2)^2(n-4)}
\end{align*}
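A quick check (not part of the original solution) against scipy's $F$ distribution; the degrees of freedom $m = 5$, $n = 12$ are assumed illustrative values.

```python
# Check of Problem 9-17 against scipy: the F(m, n) distribution should have mean n/(n-2)
# and variance 2*n^2*(m + n - 2) / (m*(n-2)^2*(n-4)).
from scipy import stats

m, n = 5, 12                       # illustrative degrees of freedom, not from the text
mean, var = stats.f.stats(dfn=m, dfd=n, moments='mv')
print(mean, n / (n - 2))                                         # 1.2 1.2
print(var, 2 * n**2 * (m + n - 2) / (m * (n - 2)**2 * (n - 4)))  # 1.08 1.08
```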

9-18. $X_{(1)}$ is greater than $t$ if and only if every observation is greater than $t$. Then
\begin{align*}
P(X_{(1)} > t) &= P(X_1 > t, X_2 > t, \ldots, X_n > t)\\
&= P(X_1 > t)P(X_2 > t)\cdots P(X_n > t)\\
&= P(X > t)P(X > t)\cdots P(X > t)\\
&= [1 - F(t)]^n
\end{align*}
So $F_{X_{(1)}}(t) = 1 - P(X_{(1)} > t) = 1 - [1 - F(t)]^n$.

If $X$ is continuous, then so is $X_{(1)}$; so
\[
f_{X_{(1)}}(t) = F_{X_{(1)}}'(t) = n[1 - F(t)]^{n-1}f(t)
\]
Similarly,
\begin{align*}
F_{X_{(n)}}(t) &= P(X_{(n)} \le t)\\
&= P(X_1 \le t, X_2 \le t, \ldots, X_n \le t)\\
&= P(X_1 \le t)P(X_2 \le t)\cdots P(X_n \le t)\\
&= P(X \le t)P(X \le t)\cdots P(X \le t)\\
&= [F(t)]^n
\end{align*}
Since $X_{(n)}$ is continuous,
\[
f_{X_{(n)}}(t) = F_{X_{(n)}}'(t) = n[F(t)]^{n-1}f(t)
\]
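A simulation sketch (not part of the original solution) of these two CDF formulas using Uniform(0,1) observations, for which $F(t) = t$; numpy, $n = 5$, and $t = 0.3$ are assumptions chosen for illustration.

```python
# With n i.i.d. Uniform(0,1) observations, P(X_(1) <= t) should be 1 - (1 - t)^n
# and P(X_(n) <= t) should be t^n.
import numpy as np

rng = np.random.default_rng(2)
n, reps, t = 5, 200_000, 0.3        # illustrative values

samples = rng.uniform(size=(reps, n))
print((samples.min(axis=1) <= t).mean(), 1 - (1 - t) ** n)   # ≈ 0.832
print((samples.max(axis=1) <= t).mean(), t ** n)             # ≈ 0.00243
```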

9-19.
\[
F(t) =
\begin{cases}
0 & t < 0\\
1 - p & 0 \le t < 1\\
1 & t \ge 1
\end{cases}
\]
Then
\[
P(X_{(n)} = 1) = F_{X_{(n)}}(1) - F_{X_{(n)}}(0) = [F(1)]^n - [F(0)]^n = 1 - (1-p)^n
\]
\[
P(X_{(1)} = 0) = 1 - [1 - F(0)]^n = 1 - [1 - (1-p)]^n = 1 - p^n
\]
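A simulation sketch (not part of the original solution) of the two probabilities just derived; the values $p = 0.3$ and $n = 4$ are assumptions for illustration.

```python
# For n i.i.d. Bernoulli(p) observations, P(max = 1) should be 1 - (1-p)^n
# and P(min = 0) should be 1 - p^n.
import numpy as np

rng = np.random.default_rng(5)
p, n, reps = 0.3, 4, 200_000        # illustrative values

x = rng.binomial(1, p, size=(reps, n))
print((x.max(axis=1) == 1).mean(), 1 - (1 - p) ** n)   # ≈ 0.7599
print((x.min(axis=1) == 0).mean(), 1 - p ** n)         # ≈ 0.9919
```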

9-20.
\begin{align*}
f_{X_{(1)}}(t) &= n\left[1 - \Phi\!\left(\frac{t-\mu}{\sigma}\right)\right]^{n-1}\frac{1}{\sigma\sqrt{2\pi}}\exp\left[-\frac{(t-\mu)^2}{2\sigma^2}\right]\\
f_{X_{(n)}}(t) &= n\left[\Phi\!\left(\frac{t-\mu}{\sigma}\right)\right]^{n-1}\frac{1}{\sigma\sqrt{2\pi}}\exp\left[-\frac{(t-\mu)^2}{2\sigma^2}\right]
\end{align*}

9-21. $f(t) = \lambda e^{-\lambda t}, \quad t > 0$

$F(t) = 1 - e^{-\lambda t}$

$F_{X_{(1)}}(t) = 1 - [1 - F(t)]^n = 1 - [1 - (1 - e^{-\lambda t})]^n = 1 - e^{-n\lambda t}$

$f_{X_{(1)}}(t) = n\lambda e^{-n\lambda t}, \quad t > 0$

$F_{X_{(n)}}(t) = [F(t)]^n = (1 - e^{-\lambda t})^n$

$f_{X_{(n)}}(t) = n\lambda(1 - e^{-\lambda t})^{n-1}e^{-\lambda t}, \quad t > 0$
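A simulation sketch (not part of the original solution): since $f_{X_{(1)}}(t) = n\lambda e^{-n\lambda t}$, the minimum is again exponential with rate $n\lambda$ and therefore has mean $1/(n\lambda)$; the values $\lambda = 0.5$ and $n = 4$ are assumed for illustration.

```python
# The minimum of n i.i.d. Exponential(lambda) observations should be Exponential(n*lambda),
# so its mean is 1/(n*lambda).
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 0.5, 4, 200_000      # illustrative values

samples = rng.exponential(scale=1 / lam, size=(reps, n))
print(samples.min(axis=1).mean(), 1 / (n * lam))   # both ≈ 0.5
```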

9-22. $f_{X_{(n)}}(x_{(n)}) = n[F(x_{(n)})]^{n-1}f(x_{(n)})$

Treat $F(X_{(n)})$ as a random variable giving the fraction of objects in the population having values of $X \le X_{(n)}$.

Let $Y = F(X_{(n)})$. Then $dy = f(x_{(n)})\,dx_{(n)}$, and thus $f(y) = ny^{n-1}$, $0 \le y \le 1$.

This gives
\[
E(Y) = \int_0^1 ny^n\,dy = \frac{n}{n+1}.
\]
Similarly, $f_{X_{(1)}}(x_{(1)}) = n[1 - F(x_{(1)})]^{n-1}f(x_{(1)})$

Treat $F(X_{(1)})$ as a random variable giving the fraction of objects in the population having values of $X \le X_{(1)}$.

Let $Y = F(X_{(1)})$. Then $dy = f(x_{(1)})\,dx_{(1)}$, and thus $f(y) = n(1-y)^{n-1}$, $0 \le y \le 1$.

This gives
\[
E(Y) = \int_0^1 ny(1-y)^{n-1}\,dy
\]
The family of Beta distributions is defined by p.d.f.'s of the form
\[
g(x) =
\begin{cases}
[\beta(r,s)]^{-1}x^{r-1}(1-x)^{s-1} & 0 < x < 1\\
0 & \text{otherwise}
\end{cases}
\]
where $\beta(r,s) = \Gamma(r)\Gamma(s)/\Gamma(r+s)$.

Thus,
\[
E(Y) = n\int_0^1 y(1-y)^{n-1}\,dy = n\beta(2,n)
= \frac{n\Gamma(2)\Gamma(n)}{\Gamma(n+2)} = \frac{n!\,1!}{(n+1)!} = \frac{1}{n+1}
\]
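A simulation sketch (not part of the original solution): for continuous $X$, $F(X)$ is Uniform(0,1), so $E[F(X_{(n)})]$ should be $n/(n+1)$ and $E[F(X_{(1)})]$ should be $1/(n+1)$; the value $n = 5$ is assumed for illustration.

```python
# Simulation check of Problem 9-22 using Uniform(0,1) variables in place of F(X_1), ..., F(X_n).
import numpy as np

rng = np.random.default_rng(4)
n, reps = 5, 200_000                # illustrative values

u = rng.uniform(size=(reps, n))     # plays the role of F(X_1), ..., F(X_n)
print(u.max(axis=1).mean(), n / (n + 1))   # ≈ 0.8333
print(u.min(axis=1).mean(), 1 / (n + 1))   # ≈ 0.1667
```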

9-23. (a) 2.73
(b) 11.34
(c) 34.17
(d) 20.48

9-24. (a) 2.228
(b) 0.687
(c) 1.813

9-25. (a) 1.63
(b) 2.85
(c) 0.241
(d) 0.588
