
• 1. Clearly the joint distribution of $(X, Y)$ is multinomial. Hence

$$P(X = i, Y = j) = \frac{n!}{i!\,j!\,(n-i-j)!}\Big(\frac{1}{6}\Big)^{i}\Big(\frac{2}{6}\Big)^{j}\Big(\frac{3}{6}\Big)^{n-i-j}.$$

Now the marginals:

$$P(X = i) = \sum_{j=0}^{n-i} \frac{n!}{i!\,j!\,(n-i-j)!}\Big(\frac{1}{6}\Big)^{i}\Big(\frac{2}{6}\Big)^{j}\Big(\frac{3}{6}\Big)^{n-i-j} = \binom{n}{i}\Big(\frac{1}{6}\Big)^{i}\Big(\frac{5}{6}\Big)^{n-i},$$

so $X \sim \mathrm{Bin}(n, 1/6)$. Similarly

$$P(Y = j) = \binom{n}{j}\Big(\frac{2}{6}\Big)^{j}\Big(\frac{4}{6}\Big)^{n-j},$$

so $Y \sim \mathrm{Bin}(n, 1/3)$.

Thus $E(Y) = n/3$ and $\mathrm{Var}(Y) = n \cdot \frac{1}{3} \cdot \frac{2}{3} = 2n/9$.
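As a quick sanity check (a simulation sketch, not part of the original solution; the value $n = 60$ is an arbitrary choice), the sample mean and variance of $Y$ can be compared with $n/3$ and $2n/9$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60  # number of rolls; arbitrary choice for this check

# Counts of the three outcome groups, with cell probabilities 1/6, 2/6, 3/6
counts = rng.multinomial(n, [1/6, 2/6, 3/6], size=200_000)
Y = counts[:, 1]  # the count with success probability 2/6

mean_Y, var_Y = Y.mean(), Y.var()
# theory: E(Y) = n/3 = 20, Var(Y) = 2n/9 = 13.33...
```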

• 2. Let $X$ = number of customers who purchase a drink. Then $X$ is a sum of independent Bernoulli($p$) random variables, so $X \mid N = n \sim \mathrm{Bin}(n, p)$, while $N \sim \mathrm{Poi}(\lambda)$. Hence, with $q = 1 - p$,

$$P(X = k) = \sum_{n=k}^{\infty} P(X = k \mid N = n)\,P_N(n) = \sum_{n=k}^{\infty} \binom{n}{k} p^{k} q^{n-k}\, e^{-\lambda} \frac{\lambda^{n}}{n!} = e^{-\lambda}\frac{(\lambda p)^{k}}{k!} \sum_{m=0}^{\infty} \frac{(\lambda q)^{m}}{m!} = e^{-\lambda p}\,\frac{(\lambda p)^{k}}{k!}.$$

Thus $X \sim \mathrm{Poi}(\lambda p)$ and $E(X) = \lambda p$.
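This Poisson-thinning result can be checked empirically; the values $\lambda = 4$ and $p = 0.3$ below are illustrative assumptions, not from the problem:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p = 4.0, 0.3  # illustrative parameter values (assumptions)

N = rng.poisson(lam, size=300_000)  # number of customers entering
X = rng.binomial(N, p)              # number who purchase a drink

mean_X, var_X = X.mean(), X.var()
# theory: X ~ Poi(lam * p), so mean = variance = lam * p = 1.2
```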

• 3.
Marginal PMF of $X$:

$$P_X(0) = P(X = 0, Y = 0) + P(X = 0, Y = 1) = 3/5,$$

$$P_X(1) = P(X = 1, Y = 0) + P(X = 1, Y = 1) = 2/5.$$

Similarly, the marginal PMF of $Y$: $P_Y(0) = 3/5$, $P_Y(1) = 2/5$.

Conditional PMF:
$$P_{X|Y}(0\mid 0) = \frac{P_{XY}(0,0)}{P_Y(0)} = 1/3 \quad\text{and}\quad P_{X|Y}(1\mid 0) = 1 - P_{X|Y}(0\mid 0) = 2/3.$$

Now the random variable $Z = E(X\mid Y)$ takes the value $E(X\mid Y = 0)$ with probability $P(Y = 0)$ and $E(X\mid Y = 1)$ with probability $P(Y = 1)$. Since $E(X\mid Y = 0) = 1\cdot P_{X|Y}(1\mid 0) + 0 = 2/3$ and $E(X\mid Y = 1) = 0$,

$$Z = E(X\mid Y) = \begin{cases} 2/3, & \text{with probability } P(Y = 0) = 3/5\\ 0, & \text{with probability } P(Y = 1) = 2/5 \end{cases}$$

Hence the PMF of $Z$ is

$$P_Z(z) = \begin{cases} 3/5, & z = 2/3\\ 2/5, & z = 0\\ 0, & \text{otherwise} \end{cases}$$

Now $E(Z) = \frac{2}{3}\cdot\frac{3}{5} + 0\cdot\frac{2}{5} = 2/5$, and $E(X) = E(E(X\mid Y)) = E(Z) = 2/5$.

$$\mathrm{Var}(Z) = E(Z^2) - (E(Z))^2 = \frac{4}{9}\cdot\frac{3}{5} - \Big(\frac{2}{5}\Big)^2 = 8/75.$$

The other parts can be done in the same way; please try them.
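These numbers can be verified exactly. The joint PMF implied by the stated marginals and conditionals is $P(0,0) = 1/5$, $P(0,1) = 2/5$, $P(1,0) = 2/5$, $P(1,1) = 0$ (e.g. $P(0,0) = P_{X|Y}(0\mid 0)\,P_Y(0) = 1/5$); a small exact-arithmetic sketch:

```python
from fractions import Fraction as F

# Joint PMF recovered from the marginals/conditionals in the solution
pmf = {(0, 0): F(1, 5), (0, 1): F(2, 5), (1, 0): F(2, 5), (1, 1): F(0)}

p_y = {y: pmf[(0, y)] + pmf[(1, y)] for y in (0, 1)}     # marginal of Y
e_x_given_y = {y: pmf[(1, y)] / p_y[y] for y in (0, 1)}  # E(X | Y = y)

# Z = E(X|Y) takes value e_x_given_y[y] with probability p_y[y]
e_z = sum(e_x_given_y[y] * p_y[y] for y in (0, 1))
var_z = sum(e_x_given_y[y] ** 2 * p_y[y] for y in (0, 1)) - e_z ** 2
# e_z == 2/5 and var_z == 8/75, matching the solution
```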
• 4.
Given $Y = \sum_{i=1}^{N} X_i$.
Now
$$E(Y) = E[E(Y\mid N)] = E\Big[E\Big(\sum_{i=1}^{N} X_i \,\Big|\, N\Big)\Big] = E[N\,E(X)] \quad \text{(since the $X_i$ are i.i.d. and independent of $N$)} = E(N)\,E(X).$$
Now
$$\mathrm{Var}(Y) = E(\mathrm{Var}(Y\mid N)) + \mathrm{Var}(E(Y\mid N)) = E(\mathrm{Var}(Y\mid N)) + \mathrm{Var}(N\,E(X)) = E(\mathrm{Var}(Y\mid N)) + (E(X))^2\,\mathrm{Var}(N).$$
Now $\mathrm{Var}(Y\mid N) = \sum_{i=1}^{N} \mathrm{Var}(X_i\mid N) = \sum_{i=1}^{N} \mathrm{Var}(X_i) = N\,\mathrm{Var}(X)$.
Therefore $\mathrm{Var}(Y) = E(N)\,\mathrm{Var}(X) + (E(X))^2\,\mathrm{Var}(N)$.
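A simulation sketch of this identity; the choices $N \sim \mathrm{Poi}(3)$ and $X_i \sim U(0,1)$ are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 100_000

# Illustrative choices (not from the problem): N ~ Poi(3), X_i ~ U(0, 1)
N = rng.poisson(3.0, size=trials)
Y = np.array([rng.uniform(0.0, 1.0, size=n).sum() for n in N])

# theory: E(Y) = E(N)E(X) = 3 * 0.5 = 1.5
#         Var(Y) = E(N)Var(X) + (E X)^2 Var(N) = 3/12 + 0.25 * 3 = 1.0
mean_Y, var_Y = Y.mean(), Y.var()
```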

• 5
Given $X \sim \mathrm{Exp}(\lambda)$, with $f_X(x) = \lambda e^{-\lambda x}$, $x > 0$.
PDF of $X$ given $X > 1$: for $x > 1$,
$$f_{X|X>1}(x) = \frac{f_X(x)}{P(X > 1)}.$$
Now $P(X > 1) = \int_1^{\infty} \lambda e^{-\lambda x}\,dx = e^{-\lambda}$.
Hence $f_{X|X>1}(x) = \lambda e^{-\lambda(x-1)}$, $x > 1$.
DF of $X$ given $X > 1$: using $F_X(x) = 1 - e^{-\lambda x}$, for $x > 1$,
$$F_{X|X>1}(x) = \frac{F_X(x) - F_X(1)}{1 - F_X(1)} = \frac{(1 - e^{-\lambda x}) - (1 - e^{-\lambda})}{e^{-\lambda}} = \frac{e^{-\lambda} - e^{-\lambda x}}{e^{-\lambda}} = 1 - e^{-\lambda(x-1)}.$$

$$\therefore\; F_{X|X>1}(x) = \begin{cases} 1 - e^{-\lambda(x-1)}, & x > 1\\ 0, & \text{otherwise} \end{cases}$$

Equivalently, by memorylessness, $X - 1 \mid X > 1 \sim \mathrm{Exp}(\lambda)$.

Now one can easily compute
$$E(X\mid X > 1) = \int_1^{\infty} x\, f_{X|X>1}(x)\,dx, \qquad E(X^2\mid X > 1) = \int_1^{\infty} x^2 f_{X|X>1}(x)\,dx,$$
and $\mathrm{Var}(X\mid X > 1) = E[X^2\mid X > 1] - (E[X\mid X > 1])^2$.
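Memorylessness makes these moments immediate: $E(X\mid X>1) = 1 + 1/\lambda$ and $\mathrm{Var}(X\mid X>1) = 1/\lambda^2$. A simulation sketch (the rate $\lambda = 2$ is an assumed value for the check):

```python
import random
import statistics

random.seed(0)
lam = 2.0  # assumed rate for this check

samples = [random.expovariate(lam) for _ in range(500_000)]
tail = [x for x in samples if x > 1]  # condition on X > 1

mean_tail = statistics.fmean(tail)     # theory: 1 + 1/lam = 1.5
var_tail = statistics.pvariance(tail)  # theory: 1/lam**2 = 0.25
```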

• 6
$$E(Y) = E(E(Y\mid X)) = \int_1^2 E(Y\mid X = x)\, f_X(x)\,dx = \int_1^2 \frac{1}{x}\,dx = \big[\log x\big]_1^2 = \log 2,$$
and
$$\mathrm{Var}(Y) = E(Y^2) - (E(Y))^2 = E[E[Y^2\mid X]] - (\log 2)^2 = E\Big(\frac{1}{X^2}\Big) - (\log 2)^2 = \int_1^2 \frac{dx}{x^2} - (\log 2)^2 = \frac{1}{2} - (\log 2)^2.$$
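A quick midpoint-rule check of the two integrals used above, $\int_1^2 dx/x = \log 2$ and $\int_1^2 dx/x^2 = 1/2$:

```python
import math

def midpoint(f, a, b, n=10_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

i1 = midpoint(lambda x: 1 / x, 1.0, 2.0)       # ~ log 2 ~ 0.6931
i2 = midpoint(lambda x: 1 / x ** 2, 1.0, 2.0)  # ~ 1/2
```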

• 7.
No.
Example: take the joint density
$$f(x, y) = \begin{cases} \dfrac{1 + xy}{4}, & |x| < 1,\ |y| < 1\\ 0, & \text{otherwise} \end{cases}$$
The marginals are $f_X(x) = \int_{-1}^{1} f(x, y)\,dy = 1/2$ on $(-1, 1)$, and likewise $f_Y(y) = 1/2$. Now $f(x, y) \neq f_X(x)\,f_Y(y)$. Hence $X$ and $Y$ are not independent.
However, for $0 < u, v < 1$,
$$P(X^2 \le u, Y^2 \le v) = \int_{-\sqrt{u}}^{\sqrt{u}} \int_{-\sqrt{v}}^{\sqrt{v}} \frac{1 + xy}{4}\,dy\,dx = \sqrt{u}\,\sqrt{v} = P(X^2 \le u)\,P(Y^2 \le v),$$
since the $xy$ term integrates to $0$ by symmetry. Thus $X^2$ and $Y^2$ are independent even though $X$ and $Y$ are not; there is no contradiction, because $X$ and $Y$ are not functions of $X^2$ and $Y^2$ respectively (the sign information is lost).
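An empirical sketch via rejection sampling (on $(-1,1)^2$ the density is bounded by $1/2$): the sample covariance of $(X, Y)$ should be near $E(XY) = 1/9$, while that of $(X^2, Y^2)$ should be near $0$:

```python
import numpy as np

rng = np.random.default_rng(3)

def sample(n):
    """Rejection-sample n points from f(x, y) = (1 + x*y)/4 on (-1, 1)^2."""
    xs, ys = [], []
    while len(xs) < n:
        x = rng.uniform(-1, 1, size=n)
        y = rng.uniform(-1, 1, size=n)
        u = rng.uniform(0, 0.5, size=n)  # envelope height 1/2 >= f
        keep = u < (1 + x * y) / 4
        xs.extend(x[keep]); ys.extend(y[keep])
    return np.array(xs[:n]), np.array(ys[:n])

x, y = sample(200_000)
cov_xy = np.cov(x, y)[0, 1]        # theory: E(XY) = 1/9, so clearly nonzero
cov_sq = np.cov(x**2, y**2)[0, 1]  # theory: 0, since X^2, Y^2 independent
```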

• 8.
If $X$ and $Y$ are independent random variables, then for Borel measurable functions $g$ and $h$, the random variables $g(X)$ and $h(Y)$ are independent. Indeed, for Borel sets $A$ and $B$,
$$P(g(X) \in A,\ h(Y) \in B) = P(X \in g^{-1}(A),\ Y \in h^{-1}(B)) \quad [\text{$g^{-1}(A)$, $h^{-1}(B)$ are Borel sets}]$$
$$= P(X \in g^{-1}(A))\,P(Y \in h^{-1}(B)) \quad [\text{by independence of $X$ and $Y$}]$$
$$= P(g(X) \in A)\,P(h(Y) \in B).$$
Thus $g(X)$ and $h(Y)$ are independent.

Here $X^2$ and $Y^2$ are Borel measurable functions of $X$ and $Y$ respectively. Thus they are independent.

• 9.
Choose $A = 0$, $B = 1$, so that $X$ and $Y$ are independent $U(0, 1)$. The joint distribution of $(X, Y)$ is given by

$$F(x, y) = P(X \le x, Y \le y) = \begin{cases} 0, & x < 0 \text{ or } y < 0\\ xy, & 0 \le x, y < 1\\ x, & 0 \le x < 1,\ y \ge 1\\ y, & x \ge 1,\ 0 \le y < 1\\ 1, & x \ge 1,\ y \ge 1 \end{cases}$$

and the density is given by

$$f(x, y) = \begin{cases} 1, & 0 < x, y < 1\\ 0, & \text{otherwise} \end{cases}$$

Now let $Z = \min\{Y, 1/2\}$. The joint distribution of $(X, Z)$ is

$$F(x, z) = P(X \le x, Z \le z) = \begin{cases} 0, & x < 0 \text{ or } z < 0\\ xz, & 0 \le x < 1,\ 0 \le z < 1/2\\ z, & x \ge 1,\ 0 \le z < 1/2\\ x, & 0 \le x < 1,\ z \ge 1/2\\ 1, & x \ge 1,\ z \ge 1/2 \end{cases}$$

The distribution is not continuous at $z = 1/2$: $Z$ carries a point mass $P(Z = 1/2) = P(Y \ge 1/2) = 1/2$, so $(X, Z)$ has no joint density.
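The jump at $z = 1/2$ is easy to see empirically: with $Y \sim U(0, 1)$, $Z = \min\{Y, 1/2\}$ equals exactly $1/2$ whenever $Y \ge 1/2$:

```python
import random

random.seed(4)
n = 100_000
z = [min(random.random(), 0.5) for _ in range(n)]

# Z has an atom at 1/2: P(Z = 1/2) = P(Y >= 1/2) = 1/2,
# which is exactly the jump of F(x, z) at z = 1/2 (take x >= 1)
p_atom = sum(1 for v in z if v == 0.5) / n
```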

• 10
Both are density functions; one can easily check this. Now the marginals of $f(x, y)$ are $f(x) = \int_{-1}^{+1} f(x, y)\,dy = 1/2$ and $f(y) = 1/2$. Similarly, the marginals of $g$ are $g(x) = 1/2$ and $g(y) = 1/2$. In the second case $g(x, y) = g(x)\,g(y)$, hence $X$ and $Y$ are independent there. We also note that both joint densities have the same marginals, but the joint distributions are different.

• 11
a. Marginals: for $0 < x < 1$,
$$f(x) = \int_0^{1-x} k(1 - x - y)\,dy = \frac{k}{2}(1 - x)^2, \qquad \text{and similarly } f(y) = \frac{k}{2}(1 - y)^2.$$
b. Now the conditional density of $X$ given $Y = y$:
$$f(x\mid y) = \frac{f(x, y)}{f(y)} = \frac{k(1 - x - y)}{\frac{k}{2}(1 - y)^2} = \frac{2(1 - x - y)}{(1 - y)^2}, \qquad 0 < x < 1 - y.$$
Thus the conditional distribution is given by
$$F(x\mid y) = \int_0^x f(t\mid y)\,dt = \frac{2(1 - y)x - x^2}{(1 - y)^2}, \qquad 0 < x < 1 - y.$$
c. $P(X < Y) = \int_0^1 \int_0^y f(x, y)\,dx\,dy$.
d. $P(X > 1/2) = \int_{1/2}^1 f(x)\,dx$.
The other parts can be done similarly.
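Assuming the support is the triangle $\{x > 0,\ y > 0,\ x + y < 1\}$ (as the integration limits indicate), normalization forces $k = 6$, since $\iint (1 - x - y)\,dx\,dy = 1/6$ over that triangle. A small numeric check of the marginal formula $f(x) = \frac{k}{2}(1 - x)^2$:

```python
k = 6.0  # from normalization over the triangle x, y > 0, x + y < 1

def marginal_x(x, n=10_000):
    """Midpoint-rule integral of f(x, y) = k(1 - x - y) over y in (0, 1 - x)."""
    h = (1 - x) / n
    return h * sum(k * (1 - x - (j + 0.5) * h) for j in range(n))

x0 = 0.3  # arbitrary test point
approx = marginal_x(x0)
exact = k / 2 * (1 - x0) ** 2  # closed form: (k/2)(1 - x)^2
```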
