
Summer 2006 STAT 602 Homework 4 Solution

1. (a) The joint p.d.f. of X and Y is

f_{X,Y}(x, y) = f_{Y|X=x}(y|x) · f_X(x) = (3y^2/x^3) · [(x^3 e^{-x})/6]
             = y^2 e^{-x}/2,  0 < y < x < ∞
             = 0,             elsewhere
(b) The marginal p.d.f. for Y is

f_Y(y) = ∫_y^∞ (y^2 e^{-x}/2) dx = (y^2/2) · (−e^{-x})|_y^∞
       = y^2 e^{-y}/2,  0 < y < ∞
       = 0,             elsewhere

So the conditional p.d.f. of X given Y = y is

f_{X|Y=y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = (y^2 e^{-x}/2) / (y^2 e^{-y}/2)
              = e^{−(x−y)},  y < x < ∞
              = 0,           elsewhere
(c) The conditional m.g.f. for X given Y = y is

M_{X|Y=y}(t) = E_{X|Y=y}(e^{tX}) = ∫_y^∞ e^{tx} · e^{−(x−y)} dx = e^y ∫_y^∞ e^{−(1−t)x} dx
             = e^y · [−e^{−(1−t)x}/(1 − t)]_y^∞ = e^{ty}/(1 − t),  t < 1
(d)

E(X^2 | Y = y) = (d^2/dt^2) M_{X|Y=y}(t)|_{t=0} = (d^2/dt^2) [e^{ty}/(1 − t)]|_{t=0}
              = (d/dt) [y e^{ty}/(1 − t) + e^{ty}/(1 − t)^2]|_{t=0}
              = [y^2 e^{ty}/(1 − t) + y e^{ty}/(1 − t)^2 + y e^{ty}/(1 − t)^2 + 2 e^{ty}/(1 − t)^3]|_{t=0}
              = y^2 + 2y + 2,  y ∈ (0, ∞)
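Since the conditional density e^{−(x−y)} on (y, ∞) says X | Y = y is y plus a standard exponential, the moment formula is easy to sanity-check by simulation. A minimal sketch (the value y = 1.5 is an arbitrary choice, not from the problem):

```python
import numpy as np

# X | Y=y has density e^{-(x-y)} on (y, inf), i.e. y + Exponential(1).
# Check E(X^2 | Y=y) = y^2 + 2y + 2 by Monte Carlo; y = 1.5 is arbitrary.
rng = np.random.default_rng(0)
y = 1.5
x = y + rng.exponential(scale=1.0, size=1_000_000)

mc = np.mean(x**2)        # Monte Carlo estimate of E(X^2 | Y=y)
exact = y**2 + 2*y + 2    # formula from the m.g.f. derivation: 7.25
print(mc, exact)
```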
2. (a) The marginal probability distribution of Y is

f_Y(y) = Σ_{x=0}^∞ n! y^x (pe^{−1})^y (1 − p)^{n−y} / [y!(n − y)!x!]
       = [n!/(y!(n − y)!)] (pe^{−1})^y (1 − p)^{n−y} Σ_{x=0}^∞ y^x/x!
       = C(n, y) (pe^{−1})^y (1 − p)^{n−y} · e^y
       = C(n, y) p^y (1 − p)^{n−y},  y = 0, 1, 2, . . ., n
       = 0,  elsewhere

Since Y ∼ Binomial(n, p), E(Y) = np and Var(Y) = np(1 − p).

(b) The conditional probability distribution of X given Y = y is


f_{X|Y=y}(x|y) = f_{X,Y}(x, y) / f_Y(y)
              = {n! y^x (pe^{−1})^y (1 − p)^{n−y} / [y!(n − y)!x!]} / [C(n, y) p^y (1 − p)^{n−y}]
              = y^x e^{−y} / x!,  x = 0, 1, 2, . . .
              = 0,  elsewhere

So X|Y = y ∼ Poisson(y).

X and Y are not independent: if they were, f_{X|Y=y}(x|y) would not depend on y. Here f_{X|Y=y}(x|y) does depend on y, so X and Y are dependent.

(c) Since X|Y = y ∼ Poisson(y), E(X|Y = y) = y and Var(X|Y = y) = y, y = 0, 1, 2, . . ., n.

(d) E(X) = E_Y(E(X|Y)) = E_Y(Y) = np, using the result in (a).

Var(X) = E_Y(Var(X|Y)) + Var_Y(E(X|Y)) = E_Y(Y) + Var_Y(Y) = np + np(1 − p) = np(2 − p)
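These moment formulas can be checked by simulating the two-stage model directly. A sketch under assumed parameter values n = 10, p = 0.3 (any choice works):

```python
import numpy as np

# Two-stage simulation: Y ~ Binomial(n, p), then X | Y=y ~ Poisson(y).
# Compare sample moments of X with E(X) = np and Var(X) = np(2 - p).
rng = np.random.default_rng(1)
n, p, reps = 10, 0.3, 1_000_000
y = rng.binomial(n, p, size=reps)
x = rng.poisson(y)                 # Poisson mean varies with each draw of Y

print(x.mean(), n * p)             # sample mean vs. np = 3.0
print(x.var(), n * p * (2 - p))    # sample variance vs. np(2 - p) = 5.1
```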

3. (a) The marginal p.d.f. of X is f_X(x) = Σ_y f_{X,Y}(x, y):

f_X(0) = 2/9,  f_X(1) = 7/18,  f_X(2) = 7/18,  and f_X(x) = 0 elsewhere.

The marginal p.d.f. of Y is f_Y(y) = Σ_x f_{X,Y}(x, y):

f_Y(0) = 11/18,  f_Y(1) = 7/18,  and f_Y(y) = 0 elsewhere.

(b) The conditional probability distribution for X given Y = y is f_{X|Y=y}(x|y) = f_{X,Y}(x, y)/f_Y(y):

f_{X|Y=0}(0) = 1/11,  f_{X|Y=0}(1) = 4/11,  f_{X|Y=0}(2) = 6/11,  and 0 elsewhere.

f_{X|Y=1}(0) = 3/7,  f_{X|Y=1}(1) = 3/7,  f_{X|Y=1}(2) = 1/7,  and 0 elsewhere.


The conditional probability distribution of Y given X = x is f_{Y|X=x}(y) = f_{X,Y}(x, y)/f_X(x):

f_{Y|X=0}(0) = 1/4,  f_{Y|X=0}(1) = 3/4,  and 0 elsewhere.

f_{Y|X=1}(0) = 4/7,  f_{Y|X=1}(1) = 3/7,  and 0 elsewhere.

f_{Y|X=2}(0) = 6/7,  f_{Y|X=2}(1) = 1/7,  and 0 elsewhere.

(c) E(X|Y = 0) = 0(1/11) + 1(4/11) + 2(6/11) = 16/11

E(X|Y = 1) = 0(3/7) + 1(3/7) + 2(1/7) = 5/7

E(Y |X = 0) = 0(1/4) + 1(3/4) = 3/4

E(Y |X = 1) = 0(4/7) + 1(3/7) = 3/7

E(Y |X = 2) = 0(6/7) + 1(1/7) = 1/7
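The joint p.m.f. can be recovered from the marginals and conditionals above (all entries in eighteenths), and the conditional expectations recomputed from it. A sketch:

```python
import numpy as np

# Joint p.m.f. recovered from f_Y and f_{X|Y}: rows index y = 0, 1,
# columns index x = 0, 1, 2; entries are eighteenths.
joint = np.array([[1, 4, 6],
                  [3, 3, 1]]) / 18

fx = joint.sum(axis=0)    # marginal of X: 2/9, 7/18, 7/18
fy = joint.sum(axis=1)    # marginal of Y: 11/18, 7/18
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])

E_X_given_Y0 = (x_vals * joint[0] / fy[0]).sum()     # 16/11
E_X_given_Y1 = (x_vals * joint[1] / fy[1]).sum()     # 5/7
E_Y_given_X0 = (y_vals * joint[:, 0] / fx[0]).sum()  # 3/4
print(E_X_given_Y0, E_X_given_Y1, E_Y_given_X0)
```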

4. (a) The marginal p.d.f. for X is

f_X(x) = ∫_x^1 8xy dy = 4x(y^2)|_x^1 = 4x(1 − x^2),  0 < x < 1
       = 0,  elsewhere

The marginal p.d.f. for Y is

f_Y(y) = ∫_0^y 8xy dx = 4y^3,  0 < y < 1
       = 0,  elsewhere
(b) X and Y are not independent random variables because

f_X(x) f_Y(y) = 4x(1 − x^2) · 4y^3 ≠ 8xy = f_{X,Y}(x, y),  0 < x < y < 1
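A quick numerical check of this answer: both marginals should integrate to 1, and the product of the marginals should disagree with 8xy at an interior point of the support, confirming dependence. The test point (0.2, 0.8) is an arbitrary choice:

```python
from scipy.integrate import quad

# Marginals of f(x, y) = 8xy on 0 < x < y < 1, as derived above.
fX = lambda x: 4 * x * (1 - x**2)
fY = lambda y: 4 * y**3

print(quad(fX, 0, 1)[0])   # should be 1.0
print(quad(fY, 0, 1)[0])   # should be 1.0

x0, y0 = 0.2, 0.8          # arbitrary point with 0 < x0 < y0 < 1
# Product of marginals vs. joint density: unequal, so X, Y are dependent.
print(fX(x0) * fY(y0), 8 * x0 * y0)
```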

5. (a) The marginal p.d.f. for X is

f_X(x) = ∫_0^1 (x + y) dy = x + 1/2,  0 < x < 1
       = 0,  elsewhere

The marginal p.d.f. for Y is

f_Y(y) = ∫_0^1 (x + y) dx = y + 1/2,  0 < y < 1
       = 0,  elsewhere


(b) X and Y are not independent random variables because

f_X(x) f_Y(y) = (1/2 + x)(1/2 + y) ≠ x + y = f_{X,Y}(x, y),  0 < x, y < 1

6. (a) The marginal p.d.f. for X is

f_X(x) = ∫_0^1 4xy dy = 2x(y^2)|_0^1 = 2x,  0 < x < 1
       = 0,  elsewhere

The marginal p.d.f. for Y is

f_Y(y) = ∫_0^1 4xy dx = 2y(x^2)|_0^1 = 2y,  0 < y < 1
       = 0,  elsewhere

(b) X and Y are independent random variables because

f_X(x) f_Y(y) = (2x)(2y) = 4xy = f_{X,Y}(x, y),  0 < x < 1, 0 < y < 1

f_X(x) f_Y(y) = 0 = f_{X,Y}(x, y),  elsewhere

7. (a) The table below gives T = X + Y for each (x, y) pair:

             X
   T       0   1   2
        0  0   1   2
   Y    1  1   2   3
        2  2   3   4

P(T = 0) = P(X = 0 and Y = 0) = P(X = 0) · P(Y = 0) = (0.6)(0.5) = 0.3
P(T = 1) = P(X = 1)P(Y = 0) + P(X = 0)P(Y = 1) = (0.3)(0.5) + (0.6)(0.3) = 0.33
P(T = 2) = P(X = 2)P(Y = 0) + P(X = 1)P(Y = 1) + P(X = 0)P(Y = 2) = (0.1)(0.5) + (0.3)(0.3) + (0.6)(0.2) = 0.26
P(T = 3) = P(X = 2)P(Y = 1) + P(X = 1)P(Y = 2) = (0.1)(0.3) + (0.3)(0.2) = 0.09
P(T = 4) = P(X = 2)P(Y = 2) = (0.1)(0.2) = 0.02



 0.3, t=0



0.33, t = 1






 0.26, t = 2

(b) P (T = t) =



 0.09, t = 3


0.02, t = 4







 0, elsewhere
(c) E(T) = Σ_t t P(T = t) = 1.2

E(T^2) = Σ_t t^2 P(T = t) = 2.5

Var(T) = E(T^2) − (E(T))^2 = 2.5 − 1.44 = 1.06
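Since X and Y are independent, the p.m.f. of T = X + Y is the discrete convolution of the two marginal p.m.f.s, which reproduces parts (a)–(c) in a few lines:

```python
import numpy as np

# Marginal p.m.f.s over the values 0, 1, 2, as given in the problem.
pX = np.array([0.6, 0.3, 0.1])
pY = np.array([0.5, 0.3, 0.2])

pT = np.convolve(pX, pY)            # p.m.f. of T = X + Y over 0, ..., 4
t = np.arange(5)
ET = (t * pT).sum()                 # E(T)
VarT = (t**2 * pT).sum() - ET**2    # Var(T)
print(pT)                           # matches 0.3, 0.33, 0.26, 0.09, 0.02
print(ET, VarT)                     # matches 1.2 and 1.06
```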


(d) Since X and Y are positively correlated, Cov(X, Y) > 0.

E(T) = Σ_t t P(T = t) remains unchanged; however,

Var(T) = Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y) > Var(X) + Var(Y)

So T will have a larger variance when X and Y are positively correlated.

8. X ∼ U(0, 1). Let F and f denote, respectively, the c.d.f. and p.d.f. of X:

F(x) = 0 for x < 0,  F(x) = x for 0 ≤ x < 1,  F(x) = 1 for 1 ≤ x

f(x) = 1 for 0 < x < 1,  f(x) = 0 elsewhere

Y = ln[X/(1 − X)]

Let x ∈ (0, 1); then x/(1 − x) ∈ (0, ∞), so ln[x/(1 − x)] ∈ (−∞, +∞).
Thus the range of Y is (−∞, +∞).

Let G and g denote, respectively, the c.d.f. and p.d.f. of Y. Let y ∈ (−∞, +∞).

G(y) = P(Y ≤ y) = P(ln[X/(1 − X)] ≤ y)
     = P(X/(1 − X) ≤ e^y) = P(X ≤ e^y − e^y X)
     = P((1 + e^y)X ≤ e^y) = P(X ≤ e^y/(1 + e^y))
     = F(e^y/(1 + e^y)) = e^y/(1 + e^y),  −∞ < y < ∞

g(y) = (d/dy)G(y) = (d/dy)[e^y/(1 + e^y)]
     = [(1 + e^y)(e^y) − e^y(e^y)] / (1 + e^y)^2
     = (e^y + e^{2y} − e^{2y}) / (1 + e^y)^2
     = e^y/(1 + e^y)^2,  −∞ < y < ∞
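As a check, simulated draws of Y = ln(X/(1 − X)) with X ∼ U(0, 1) should match the logistic c.d.f. G(y) = e^y/(1 + e^y) at any point. A sketch comparing the empirical c.d.f. to G at a few arbitrary points:

```python
import numpy as np

# Transform uniform draws and compare the empirical c.d.f. of
# Y = ln(X/(1-X)) with the derived logistic c.d.f. G(y) = e^y/(1+e^y).
rng = np.random.default_rng(2)
x = rng.uniform(size=1_000_000)
y = np.log(x / (1 - x))

for y0 in (-1.0, 0.0, 1.5):                 # arbitrary test points
    empirical = np.mean(y <= y0)            # empirical c.d.f. at y0
    exact = np.exp(y0) / (1 + np.exp(y0))   # G(y0)
    print(y0, empirical, exact)
```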

9. X has p.d.f. f_X(x) = λ x^{−(λ+1)} I_{[1,∞)}(x), λ > 0.

Y = ln X has m.g.f.

M_Y(t) = E(e^{tY}) = E(e^{t ln X}) = E(e^{ln X^t})
       = E(X^t) = ∫_1^∞ x^t · λ x^{−(λ+1)} dx
       = λ ∫_1^∞ x^{t−λ−1} dx

For this integral to converge, we need t − λ < 0.

M_Y(t) = λ [x^{t−λ}/(t − λ)]_{x=1}^{x→∞} = λ [0 − 1/(t − λ)]
       = λ/(λ − t) = (1 − t/λ)^{−1},  t < λ

Recalling that the m.g.f. of Γ(α, β) is (1 − βt)^{−α}, t < 1/β, we find that Y ∼ Γ(1, 1/λ).
Equivalently, Y has an exponential distribution with mean 1/λ.
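A simulation check of this conclusion (the value λ = 2 is an arbitrary choice): since F(x) = 1 − x^{−λ} on [1, ∞), X can be drawn by inversion as X = U^{−1/λ}, and the sample mean and variance of Y = ln X should match an exponential with mean 1/λ.

```python
import numpy as np

# Draw X from the Pareto density lam * x^{-(lam+1)} on [1, inf) by
# inversion (U and 1-U are both uniform, so X = U^{-1/lam} works),
# then check Y = ln X against Exponential(mean 1/lam).
rng = np.random.default_rng(3)
lam = 2.0                          # arbitrary choice of lambda
u = rng.uniform(size=1_000_000)
x = u ** (-1.0 / lam)              # inverse-c.d.f. draws from the Pareto
y = np.log(x)

print(y.mean(), 1 / lam)           # sample mean vs. 1/lambda = 0.5
print(y.var(), 1 / lam**2)         # exponential: variance = mean^2 = 0.25
```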

10. (a) X has p.d.f.

f_X(x) = (1/π) I_{[−π/2, π/2]}(x)

Y = u(X) = tan X,  −π/2 < x < π/2, and the inverse transformation is
X = w(Y) = arctan Y,  −∞ < y < ∞.

By the change-of-variable formula,

g_Y(y) = f_X[w(y)] · |w′(y)| = (1/π) · 1/(1 + y^2) = 1/[π(1 + y^2)] I_{(−∞,∞)}(y)

i.e., Y has a standard Cauchy distribution.
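Simulated draws of Y = tan X with X ∼ U(−π/2, π/2) should follow the standard Cauchy c.d.f., arctan(y)/π + 1/2. A sketch comparing the empirical c.d.f. at a few arbitrary points:

```python
import numpy as np

# Transform uniform draws on (-pi/2, pi/2) through tan and compare the
# empirical c.d.f. of Y with the Cauchy c.d.f. arctan(y)/pi + 1/2.
rng = np.random.default_rng(4)
x = rng.uniform(-np.pi / 2, np.pi / 2, size=1_000_000)
y = np.tan(x)

for y0 in (-1.0, 0.0, 2.0):                 # arbitrary test points
    empirical = np.mean(y <= y0)            # empirical c.d.f. at y0
    exact = np.arctan(y0) / np.pi + 0.5     # Cauchy c.d.f. at y0
    print(y0, empirical, exact)
```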
