
1.

Here $X, Y$ are independent $\text{Geometric}(p)$ random variables,
$$P(X = k) = q^k p, \quad k = 0, 1, 2, \dots \qquad \text{and} \qquad P(Y = j) = q^j p, \quad j = 0, 1, 2, \dots$$
where $q = 1 - p$.

For $i \ge 0$,
$$P(X - Y = i) = \sum_{j=0}^{\infty} P(X - Y = i \mid Y = j)\, P(Y = j)$$
$$= \sum_{j=0}^{\infty} P(X = i + j)\, P(Y = j), \quad \text{since } X \text{ and } Y \text{ are independent}$$
$$= \sum_{j=0}^{\infty} q^{i+j} p \cdot q^j p = q^i p^2 \sum_{j=0}^{\infty} q^{2j} = \frac{q^i p^2}{1 - q^2}.$$
For $i < 0$ the terms with $j < -i$ vanish (there $P(X = i + j) = 0$), and the same computation gives $q^{|i|} p^2/(1 - q^2)$. Hence
$$P(X - Y = i) = \frac{q^{|i|}\, p^2}{1 - q^2}, \quad i = 0, \pm 1, \pm 2, \dots$$
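A quick numerical check of this pmf (a minimal Python/numpy Monte Carlo sketch; the value p = 0.3 and the tested values of i are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
p, n = 0.3, 10**6
q = 1 - p
# numpy's geometric counts trials (support 1, 2, ...); subtract 1 to get
# the number-of-failures convention used above (support 0, 1, 2, ...)
x = rng.geometric(p, n) - 1
y = rng.geometric(p, n) - 1
for i in (-2, 0, 3):
    print(i, np.mean(x - y == i), q**abs(i) * p**2 / (1 - q**2))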

2.

$$P(Z \le z) = P(\max\{X, Y\} \le z) = P(X \le z)\, P(Y \le z) = F_X(z)\, F_Y(z)$$
$$P(W \le w) = P(\min\{X, Y\} \le w) = 1 - P(\min\{X, Y\} > w) = 1 - P(X > w)\, P(Y > w) = 1 - (1 - F_X(w))(1 - F_Y(w))$$
both steps using the independence of $X$ and $Y$.
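These CDF identities are easy to check numerically; a minimal Python/numpy sketch, using independent $U(0,1)$ variables as an arbitrary illustration (so $F_X(t) = F_Y(t) = t$ on $[0,1]$):

import numpy as np

rng = np.random.default_rng(0)
x, y = rng.random(10**6), rng.random(10**6)
z = 0.7
print(np.mean(np.maximum(x, y) <= z), z * z)            # F_X(z) F_Y(z)
print(np.mean(np.minimum(x, y) <= z), 1 - (1 - z)**2)   # 1 - (1-F_X(z))(1-F_Y(z))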

3.

$$P(X + Y = k) = \sum_{j=0}^{k} P(X + Y = k \mid Y = j)\, P(Y = j)$$
$$= \sum_{j=0}^{k} P(X = k - j)\, P(Y = j), \quad \text{since } X \text{ and } Y \text{ are independent}$$
$$= \sum_{j=0}^{k} e^{-\lambda_1} \frac{\lambda_1^{k-j}}{(k - j)!}\, e^{-\lambda_2} \frac{\lambda_2^{j}}{j!} = \frac{e^{-(\lambda_1 + \lambda_2)}}{k!} \sum_{j=0}^{k} \frac{k!}{(k - j)!\, j!}\, \lambda_1^{k-j} \lambda_2^{j}$$
$$= \frac{e^{-(\lambda_1 + \lambda_2)}}{k!}\, (\lambda_1 + \lambda_2)^k, \quad k = 0, 1, 2, \dots$$
So $X + Y \sim \text{Poisson}(\lambda_1 + \lambda_2)$.
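A quick Monte Carlo check of this Poisson convolution (a minimal Python/numpy sketch; $\lambda_1 = 1.5$, $\lambda_2 = 2.5$ are arbitrary):

import math

import numpy as np

rng = np.random.default_rng(0)
l1, l2, n = 1.5, 2.5, 10**6
s = rng.poisson(l1, n) + rng.poisson(l2, n)
for k in (0, 4, 8):
    exact = math.exp(-(l1 + l2)) * (l1 + l2)**k / math.factorial(k)
    print(k, np.mean(s == k), exact)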

4. (a) Here $N \sim \text{Geometric}\big(p = \tfrac{1}{3} \cdot \tfrac{2}{5} = \tfrac{2}{15}\big)$

with pmf $P(N = k) = (1 - 2/15)^k \cdot 2/15$, $k = 0, 1, 2, \dots$,
and also $X \mid N = n \sim \text{Binomial}(n, 1/3)$.

$$E(X \mid N) = N/3 \quad \because \text{a binomial distribution with parameters } n \text{ and } p \text{ has mean } np$$

$$E(X + N) = E\big(E(X + N \mid N)\big) = E(N/3 + N) = \tfrac{4}{3}\, E(N) = \tfrac{4}{3} \cdot \tfrac{13}{2} = \tfrac{26}{3} \quad \because E(N) = q/p = 13/2$$

(b)
$$\mathrm{Cov}(X, N) = E(XN) - E(X)\, E(N)$$
Now
$$E(XN) = E\big(E(XN \mid N)\big) = E(N^2/3) = \tfrac{91}{3} \approx 30.33 \quad \big(\because E(N^2) = \mathrm{Var}(N) + (E(N))^2 = \tfrac{195}{4} + \tfrac{169}{4} = 91\big)$$
and $E(X) = E\big(E(X \mid N)\big) = E(N)/3 = 13/6$.
$$\therefore \mathrm{Cov}(X, N) = \tfrac{91}{3} - \tfrac{13}{6} \cdot \tfrac{13}{2} = \tfrac{195}{12} = 16.25$$

(c) Note that $Y = N - X$, so
$$\mathrm{Cov}(X, Y) = \mathrm{Cov}(X, N - X) = \mathrm{Cov}(X, N) - \mathrm{Var}(X) = \tfrac{195}{12} - \tfrac{247}{36} = \tfrac{169}{18} \approx 9.39 \;\text{(check it!)}$$
where $\mathrm{Var}(X) = E\big(\mathrm{Var}(X \mid N)\big) + \mathrm{Var}\big(E(X \mid N)\big) = E(2N/9) + \mathrm{Var}(N/3) = \tfrac{13}{9} + \tfrac{195}{36} = \tfrac{247}{36} \approx 6.86$.
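All three answers can be sanity-checked by simulation; a minimal Python/numpy sketch of the two-stage model above:

import numpy as np

rng = np.random.default_rng(0)
n_sim = 10**6
N = rng.geometric(2 / 15, n_sim) - 1      # Geometric on {0, 1, 2, ...}
X = rng.binomial(N, 1 / 3)                # X | N = n ~ Binomial(n, 1/3)
Y = N - X
print(np.mean(X + N), 26 / 3)             # E(X + N)
print(np.cov(X, N)[0, 1], 195 / 12)       # Cov(X, N)
print(np.cov(X, Y)[0, 1], 169 / 18)       # Cov(X, Y)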

5. Let
$$Q(x, y) = \frac{1}{1 - \rho^2}\left[\left(\frac{x - \mu_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x - \mu_1}{\sigma_1}\right)\left(\frac{y - \mu_2}{\sigma_2}\right) + \left(\frac{y - \mu_2}{\sigma_2}\right)^2\right],$$
so that the bivariate normal density is $f(x, y) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\, e^{-Q(x, y)/2}$.
(a) The marginal density of $X$ is given by
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$$
Completing the square in $y$,
$$(1 - \rho^2)\, Q(x, y) = \left(\frac{y - \mu_2}{\sigma_2} - \rho\, \frac{x - \mu_1}{\sigma_1}\right)^2 + (1 - \rho^2)\left(\frac{x - \mu_1}{\sigma_1}\right)^2$$
$$= \left\{\frac{y - [\mu_2 + \rho(\sigma_2/\sigma_1)(x - \mu_1)]}{\sigma_2}\right\}^2 + (1 - \rho^2)\left(\frac{x - \mu_1}{\sigma_1}\right)^2$$

$$f_X(x) = \frac{1}{\sigma_1 \sqrt{2\pi}}\, e^{-\frac{(x - \mu_1)^2}{2\sigma_1^2}} \int_{-\infty}^{\infty} \frac{1}{\sigma_2 \sqrt{1 - \rho^2}\, \sqrt{2\pi}}\, e^{-\frac{(y - \beta_x)^2}{2\sigma_2^2 (1 - \rho^2)}}\, dy, \quad \text{where } \beta_x = \mu_2 + \rho\, \frac{\sigma_2}{\sigma_1}(x - \mu_1).$$

The integrand is the PDF of an $N(\beta_x, \sigma_2^2(1 - \rho^2))$ RV, so the integral equals 1 and
$$f_X(x) = \frac{1}{\sigma_1 \sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x - \mu_1}{\sigma_1}\right)^2}, \quad -\infty < x < \infty$$
So X ∼ N (µ1 , σ12 ).
Similarly Y ∼ N (µ2 , σ22 ).
(b)
$$f_{Y|X}(y \mid X = x) = \frac{f(x, y)}{f_X(x)} = \frac{1}{\sigma_2 \sqrt{1 - \rho^2}\, \sqrt{2\pi}}\, e^{-\frac{(y - \beta_x)^2}{2\sigma_2^2 (1 - \rho^2)}}, \quad -\infty < y < \infty,$$
which is the $N(\beta_x, \sigma_2^2(1 - \rho^2))$ density.


Hence we have
$$E(Y \mid X = x) = \beta_x = \mu_2 + \rho\, \frac{\sigma_2}{\sigma_1}(x - \mu_1), \qquad \mathrm{Var}(Y \mid X = x) = \sigma_2^2 (1 - \rho^2).$$
(c) We already know that if $X$ and $Y$ are independent, then $\rho = 0$; in general the converse need not hold. To find $\mathrm{Cov}(X, Y)$:
$$E(XY) = E\big(E(XY \mid X)\big) = E\left(X\left[\mu_2 + \rho\, \frac{\sigma_2}{\sigma_1}(X - \mu_1)\right]\right) = \mu_1 \mu_2 + \rho\, \frac{\sigma_2}{\sigma_1}\, \mathrm{Var}(X) = \mu_1 \mu_2 + \rho\, \sigma_1 \sigma_2$$
$$\mathrm{Cov}(X, Y) = E(XY) - E(X)\, E(Y) = \rho\, \sigma_1 \sigma_2$$

So $\rho = 0$ implies $\mathrm{Cov}(X, Y) = 0$; moreover, putting $\rho = 0$ in $f(x, y)$ makes the joint density factor as $f_X(x)\, f_Y(y)$, hence $X$ and $Y$ are independent.
(Important: if $(X, Y)$ has a bivariate normal distribution, then $X$ and $Y$ are independent if and only if $\rho = 0$.)
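The results of (a)-(c) can be checked numerically: sampling $Y$ from the conditional law in (b) should reproduce the marginal $Y \sim N(\mu_2, \sigma_2^2)$ and $\mathrm{Cov}(X, Y) = \rho\sigma_1\sigma_2$. A minimal Python/numpy sketch (all parameter values arbitrary):

import numpy as np

rng = np.random.default_rng(0)
mu1, mu2, s1, s2, rho = 1.0, -2.0, 2.0, 3.0, 0.6
n = 10**6
x = rng.normal(mu1, s1, n)
beta = mu2 + rho * (s2 / s1) * (x - mu1)        # conditional mean beta_x
y = rng.normal(beta, s2 * np.sqrt(1 - rho**2))  # Y | X = x as in part (b)
print(y.mean(), mu2, y.std(), s2)               # marginal Y ~ N(mu2, s2^2)
print(np.cov(x, y)[0, 1], rho * s1 * s2)        # Cov(X, Y) = rho s1 s2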
(d)
$$X + Y \sim N(\mu_1 + \mu_2,\; \sigma_1^2 + \sigma_2^2 + 2\rho\, \sigma_1 \sigma_2)$$
(in particular, $X + Y \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$ when $\rho = 0$).

(e)

6. (a)
$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_{X,Y}(x, z - x)\, dx = \int_0^z f_X(x)\, f_Y(z - x)\, dx, \quad \text{since } X \text{ and } Y \text{ are independent}$$
$$= \int_0^z \lambda e^{-\lambda x}\, \lambda e^{-\lambda(z - x)}\, dx = \lambda^2 z\, e^{-\lambda z}, \quad z \ge 0 \;\text{(check it!!)}$$


$$f_{X-Y}(z) = \int_{-\infty}^{\infty} f_{X,Y}(x, x - z)\, dx = \int_z^{\infty} f_X(x)\, f_Y(x - z)\, dx, \quad \text{since } X \text{ and } Y \text{ are independent}$$
$$= \int_z^{\infty} \lambda e^{-\lambda x}\, \lambda e^{-\lambda(x - z)}\, dx = \frac{\lambda}{2}\, e^{-\lambda z}, \quad z \ge 0,$$
and similarly $f_{X-Y}(z) = \frac{\lambda}{2}\, e^{\lambda z}$ for $z \le 0$.

$$P(\min\{X, Y\} \le z) = 1 - P(X > z)\, P(Y > z), \quad \text{since } X \text{ and } Y \text{ are independent}$$
$$= 1 - e^{-\lambda z}\, e^{-\lambda z}, \quad z \ge 0$$
$$f(z) = \begin{cases} 2\lambda\, e^{-2\lambda z}, & z \ge 0 \\ 0, & z < 0 \end{cases}$$

$$P(\max\{X, Y\} \le z) = P(X \le z)\, P(Y \le z), \quad \text{since } X \text{ and } Y \text{ are independent}$$
$$= (1 - e^{-\lambda z})^2, \quad z \ge 0$$
$$f(z) = \begin{cases} 2\lambda\, (1 - e^{-\lambda z})\, e^{-\lambda z}, & z \ge 0 \\ 0, & z < 0 \end{cases}$$

(b) Here $U = X + Y$, $V = X - Y$, therefore $X = \frac{U + V}{2}$, $Y = \frac{U - V}{2}$, and $\left|\frac{\partial(x, y)}{\partial(u, v)}\right| = \frac{1}{2}$.
The joint distribution of $U, V$ is given by
$$f_{U,V}(u, v) = f_{X,Y}(x, y)\left|\frac{\partial(x, y)}{\partial(u, v)}\right| = \begin{cases} \frac{1}{2}\lambda^2 e^{-\lambda u}, & 0 \le |v| \le u, \\ 0, & \text{otherwise} \end{cases}$$

$$f_{V|U}(v \mid U = u) = \frac{f_{U,V}(u, v)}{f_U(u)} = \frac{1}{2u}, \quad u > 0,\; |v| \le u$$

(c) The joint distribution of $U, Z$ is given by
$$f_{U,Z}(u, z) = f_{X,Y}(x, y)\left|\frac{\partial(x, y)}{\partial(u, z)}\right| = \begin{cases} \lambda^2 u\, e^{-\lambda u}, & u \ge 0,\; 0 \le z \le 1, \\ 0, & \text{otherwise} \end{cases}$$
and also
$$f_{U,Z}(u, z) = f_U(u)\, f_Z(z),$$
with $f_U(u) = \lambda^2 u\, e^{-\lambda u}$ (from part (a)) and $f_Z(z) = 1$ on $[0, 1]$; hence $U$ and $Z$ are independent.
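Parts (a) and (b) are easy to verify by simulation. Since $f_{V|U}(v \mid u) = \frac{1}{2u}$ on $|v| \le u$ is the $\text{Uniform}(-u, u)$ density, $E(V^2 \mid U = u) = u^2/3$, so $E(V^2) = E(U^2)/3$. A minimal Python/numpy sketch ($\lambda = 2$ is arbitrary):

import numpy as np

rng = np.random.default_rng(0)
lam, n = 2.0, 10**6
x = rng.exponential(1 / lam, n)   # numpy parametrizes by scale = 1/lambda
y = rng.exponential(1 / lam, n)
print(np.minimum(x, y).mean(), 1 / (2 * lam))   # min ~ Exp(2 lambda)
print((x + y).mean(), 2 / lam)                  # X + Y has mean 2/lambda
u, v = x + y, x - y
print(np.mean(v**2), np.mean(u**2) / 3)         # E(V^2) = E(U^2)/3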

7. Let $X_i$ be the random variable defined as the number of drawings needed to get the $i$-th distinct object (out of $N$ distinct objects) after $i - 1$ distinct objects have already been drawn. Each $X_i \sim \text{Geometric}\big(p_i = \frac{N - (i - 1)}{N}\big)$ (counting trials, so with mean $1/p_i$), and the $X_i$'s are independent.
By definition, for the first part,
$$X = X_1 + X_2, \qquad E(X) = E(X_1) + E(X_2) = 1 + \frac{N}{N - 1} = \frac{2N - 1}{N - 1}$$
Second part:
$$E(X) = E(X_1) + E(X_2) + \dots + E(X_n) = \sum_{i=1}^{n} \frac{N}{N - (i - 1)}$$
$$\mathrm{Var}(X) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + \dots + \mathrm{Var}(X_n) \quad \because X_i\text{'s are independent}$$
$$= \frac{N}{(N - 1)^2} + \frac{2N}{(N - 2)^2} + \dots + \frac{(n - 1)N}{(N - n + 1)^2}, \quad \text{using } \mathrm{Var}(X_i) = \frac{1 - p_i}{p_i^2} = \frac{(i - 1)N}{(N - i + 1)^2}$$

8. Here $X, Y$ are independent $U(0, 1)$ RVs.
$$P(XY \le z) = P\left(X \le \frac{z}{Y}\right) = \int_0^1 P\left(X \le \frac{z}{y}\,\Big|\, Y = y\right) f_Y(y)\, dy = \int_0^1 \int_0^{\min\{1, z/y\}} 1\, dx\, dy = \int_0^1 \min\{1, z/y\}\, dy$$
For $0 < z < 1$,
$$\min\{1, z/y\} = g(y) = \begin{cases} 1, & 0 < y < z, \\ z/y, & z \le y < 1 \end{cases}$$
$$\therefore P(XY \le z) = \begin{cases} 0, & z \le 0, \\ z - z \ln z, & 0 < z < 1, \\ 1, & z \ge 1 \end{cases}$$
$$\therefore f_Z(z) = \begin{cases} -\ln z, & 0 < z < 1, \\ 0, & \text{otherwise} \end{cases}$$
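A quick Monte Carlo check of this CDF (a minimal Python/numpy sketch; the tested values of z are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
x, y = rng.random(10**6), rng.random(10**6)
for z in (0.2, 0.5, 0.9):
    print(z, np.mean(x * y <= z), z - z * np.log(z))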
9. Here $Z = 2X - Y$ and $W = -X + Y$, therefore $X = Z + W$, $Y = Z + 2W$, and the joint distribution is given by
$$f_{Z,W}(z, w) = f_{X,Y}(x, y)\left|\frac{\partial(x, y)}{\partial(z, w)}\right| = \frac{1}{2\pi}\, e^{-\frac{2z^2 + 5w^2 + 6zw}{2}}, \quad -\infty < z < \infty,\; -\infty < w < \infty,$$
which is a jointly normal distribution. Moreover $\mathrm{Cov}(Z, W) = -3 \ne 0$, hence $Z$ and $W$ are not independent.
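The covariance is immediate to confirm by simulation (a minimal Python/numpy sketch):

import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal((2, 10**6))   # X, Y independent N(0, 1)
z, w = 2 * x - y, -x + y
print(np.cov(z, w)[0, 1])                # close to -3, so not independent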
10. Given that
$$f_{Y|X}(y \mid X = x) = \begin{cases} \frac{1}{2x}, & |y| \le x, \\ 0, & \text{otherwise} \end{cases}$$
The joint distribution of $X, Y$ (with $f_X(x) = 2x$, $0 \le x \le 1$) is given by
$$f_{X,Y}(x, y) = f_{Y|X}(y \mid X = x)\, f_X(x) = \begin{cases} 1, & |y| \le x,\; 0 \le x \le 1, \\ 0, & \text{otherwise} \end{cases}$$
The PDF of $Y$ is given by
$$f_Y(y) = \begin{cases} \int_{|y|}^{1} 1\, dx = 1 - |y|, & |y| \le 1, \\ 0, & \text{otherwise} \end{cases}$$
$$P(|Y| < X^3) = \int_0^1 P(|Y| < x^3 \mid X = x)\, f_X(x)\, dx = \int_0^1 \frac{2x^3}{2x} \cdot 2x\, dx = \int_0^1 2x^3\, dx = \frac{1}{2}$$
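A quick simulation check (a minimal Python/numpy sketch; $X$ is sampled from $f_X(x) = 2x$ by inverse-CDF, since $F_X(x) = x^2$ gives $X = \sqrt{U}$):

import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = np.sqrt(rng.random(n))         # f_X(x) = 2x on [0, 1]
y = rng.uniform(-x, x)             # Y | X = x ~ Uniform(-x, x), density 1/(2x)
print(np.mean(np.abs(y) < x**3))   # close to 1/2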

11. The PDF of $Y$ given $X = x$ is given by
$$f_{Y|X}(y \mid X = x) = \begin{cases} x e^{-xy}, & y \ge 0, \\ 0, & \text{otherwise} \end{cases}$$

$$E(Y \mid X) = 1/X$$


$$E(Y) = E\big(E(Y \mid X)\big) = E(1/X) = \int_1^2 \frac{1}{x}\, dx = \ln 2 \quad \because X \sim U(1, 2)$$
$$E(XY) = E\big(E(XY \mid X)\big) = E\big(X\, E(Y \mid X)\big) = E(X \cdot 1/X) = 1$$
$$\mathrm{Cov}(X, Y) = E(XY) - E(X)\, E(Y) = 1 - \frac{3}{2} \ln 2 \quad \left(\because E(X) = \frac{1 + 2}{2} = \frac{3}{2}\right)$$
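A quick Monte Carlo check of these values (a minimal Python/numpy sketch):

import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.uniform(1, 2, n)                        # X ~ U(1, 2)
y = rng.exponential(1 / x)                      # Y | X = x has density x e^{-xy}
print(y.mean(), np.log(2))                      # E(Y) = ln 2
print(np.cov(x, y)[0, 1], 1 - 1.5 * np.log(2))  # Cov(X, Y)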

12. Important note:

(i) $\mathrm{Cov}(X + a, Y) = \mathrm{Cov}(X, Y)$ for any constant $a$

(ii) $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$

(iii) $\mathrm{Cov}(X + Y, Z) = \mathrm{Cov}(X, Z) + \mathrm{Cov}(Y, Z)$

(iv) $\mathrm{Cov}(aX, Y) = a\, \mathrm{Cov}(X, Y)$

(v) $\mathrm{Cov}(X, Y) = \mathrm{Cov}(Y, X)$

Since $X, Y \sim N(0, 1)$ are independent,
$$E(X) = E(Y) = 0, \qquad E(X^2) = E(Y^2) = 1.$$
$$\mathrm{Cov}(1 + X + XY^2, 1 + X) = \mathrm{Cov}(X + XY^2, X) \quad [\text{by (i)}]$$
$$= \mathrm{Cov}(X, X) + \mathrm{Cov}(XY^2, X) \quad [\text{by (iii)}]$$
$$= \mathrm{Var}(X) + E(X^2 Y^2) - E(XY^2)\, E(X) \quad [\text{by (ii); } X, Y \text{ independent, so } X^2, Y^2 \text{ are independent too}]$$
$$= 1 + E(X^2)\, E(Y^2) - 0 = 2$$
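A one-line Monte Carlo confirmation (a minimal Python/numpy sketch):

import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal((2, 10**6))          # X, Y independent N(0, 1)
print(np.cov(1 + x + x * y**2, 1 + x)[0, 1])    # close to 2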
