1. True/False/Uncertain. Briefly explain each answer (no explanation, no points). Answer any five.
5 × 7 = 35 points
b. If the joint pdf of (X, Y) is f(x, y) = xy + y for a < x < b and c < y < x, and f(x, y) = 0 otherwise, then X and Y are not independent.
True: Because the support of Y depends on the value of X, the conditional pdf of Y|X cannot equal the marginal pdf of Y, so X and Y are not independent.
c. The only difference between F(x) and F(x|y) is that in the latter case you know the
value of y, but the distribution of X is the same in both cases.
False in general: F(x|y) is based on the conditional distribution of X given Y = y, which coincides with the marginal F(x) only when X and Y are independent.
d. Two events can be independent only if they come from two different experiments.
False: Independence has nothing to do with the "number of experiments." If it did, then
we would always face existential questions like "Is flipping a coin twice one experiment
or two?" Additionally, consider selecting an individual at random from the MSE student
population. There is no reason that the events "the student is male" and "the student is
originally from Chennai" cannot be independent. Clearly, this is only one experiment;
the events can potentially be independent.
e. Assume that events A and B exhaust S. Then the intersection of (A ∪ B) with (A ∩ B) is the empty set.
False: For A and B that exhaust S, A ∪ B = S, so (A ∪ B) ∩ (A ∩ B) = A ∩ B. This equals the empty set only if A and B are disjoint, which is not assumed.
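The set identity can be verified on a small concrete example (the particular sets below are arbitrary choices for illustration):

```python
# Arbitrary example sets with A ∪ B = S and A ∩ B non-empty.
S = {1, 2, 3, 4, 5}
A = {1, 2, 3}
B = {3, 4, 5}

assert A | B == S                  # A and B exhaust S
assert (A | B) & (A & B) == A & B  # the identity used in the answer
print(A & B)                       # {3}: non-empty, so the statement fails
```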
f. Whenever you need to perform a transformation of one or more random variables, you need to compute the Jacobian.
False: The Jacobian is not always needed; for example, for a discrete random variable the pmf of the transformed variable is obtained by direct substitution, with no Jacobian term.
2. Suppose Y = a + bX + e, where e ~ U[−1/2, 1/2] is independent of X, with E(X) = 0 and Var(X) = σ².
a. What is the distribution of Y|X?
b. Calculate E(Y|X) and Var(Y|X).
c. What is E(Y) and Var(Y)?
a)
The distribution of Y|X is the distribution of a + bX + e where, conditional on X, only e is random and a + bX is fixed. Define K = a + bX. The distribution of K + e is the distribution of e shifted by K, i.e., U[K − 1/2, K + 1/2]. Thus Y|X ~ U[a + bX − 1/2, a + bX + 1/2].
b)
For a generic uniform Z ~ U[c, d]:

E(Z) = (c + d)/2

Var(Z) = E(Z²) − [E(Z)]² = (c² + cd + d²)/3 − (d + c)²/4 = (d − c)²/12

Therefore, for Y|X ~ U[a + bX − 1/2, a + bX + 1/2]:

E(Y|X) = ((a + bX − 1/2) + (a + bX + 1/2))/2 = a + bX

Var(Y|X) = ((a + bX + 1/2) − (a + bX − 1/2))²/12 = 1/12
c)
E(Y) = E(a + bX + e) = a + bE(X) + E(e) = a (since E(X) = 0 and E(e) = 0)

Var(Y) = Var(a + bX + e) = b²Var(X) + Var(e) + 2b·Cov(X, e)
= b²σ² + (0.5 − (−0.5))²/12 + 0 = b²σ² + 1/12
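A quick Monte Carlo sanity check of these moments. The particular values a = 1, b = 3, σ = 2 and the choice X ~ N(0, σ²) are my own illustrative assumptions; the derivation only uses E(X) = 0 and Var(X) = σ²:

```python
import random

# Illustrative parameter choices (assumptions, not from the question):
# a = 1, b = 3, X ~ N(0, sigma^2) with sigma = 2, e ~ U[-1/2, 1/2].
a, b, sigma = 1.0, 3.0, 2.0
random.seed(0)

n = 200_000
ys = [a + b * random.gauss(0.0, sigma) + random.uniform(-0.5, 0.5)
      for _ in range(n)]

mean_y = sum(ys) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n

print(mean_y)  # close to E(Y) = a = 1
print(var_y)   # close to Var(Y) = b^2*sigma^2 + 1/12 = 36.0833...
```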
Or
a. Calculate the moment generating function of X. Use the MGF to find the
mean and variance of X.
b. Use the Chebyshev inequality to calculate an upper bound on the probability
that X is outside the interval [0.5, 3.5].
c. Now use the fact that X is distributed uniformly over the interval [0, 4] to
calculate the probability that X is outside the interval [0.5, 3.5]. Is it higher or
lower than the answer you got in part b?
a)
The moment generating function is:

M_X(θ) = E(e^(θX))

Since X ~ U[0, 4]:

M_X(θ) = E(e^(θX)) = ∫₀⁴ (1/4) e^(θx) dx = (1/(4θ)) (e^(4θ) − 1)

E(X) is the first derivative at θ = 0:

E(X) = M_X'(0) = [e^(4θ)/θ − (e^(4θ) − 1)/(4θ²)] |θ=0 = [(4θe^(4θ) − e^(4θ) + 1)/(4θ²)] |θ=0

By L'Hôpital's rule:

= lim θ→0 (16θe^(4θ) + 4e^(4θ) − 4e^(4θ))/(8θ) = lim θ→0 2e^(4θ) = 2

For E(X²), differentiate again and simplify:

E(X²) = M_X''(0) = [(8θ²e^(4θ) − 4θe^(4θ) + e^(4θ) − 1)/(2θ³)] |θ=0

By L'Hôpital's rule:

= lim θ→0 (32θ²e^(4θ) + 16θe^(4θ) − 16θe^(4θ) − 4e^(4θ) + 4e^(4θ))/(6θ²) = lim θ→0 (16/3) e^(4θ) = 16/3

Therefore:

Var(X) = E(X²) − [E(X)]² = 16/3 − 4 = 4/3
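As a numerical cross-check, the mean and variance can be recovered from M_X by finite-difference differentiation at θ = 0 (the step size h below is an arbitrary choice):

```python
import math

def mgf(theta):
    """MGF of X ~ U[0, 4]; at theta = 0 use the limit M(0) = 1."""
    if theta == 0.0:
        return 1.0
    return (math.exp(4 * theta) - 1) / (4 * theta)

h = 1e-3  # arbitrary small step for the finite differences
ex = (mgf(h) - mgf(-h)) / (2 * h)               # central difference ~ M'(0)
ex2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2  # second difference ~ M''(0)
var = ex2 - ex ** 2

print(ex)   # close to E(X) = 2
print(ex2)  # close to E(X^2) = 16/3
print(var)  # close to Var(X) = 4/3
```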
b)
The Chebyshev inequality states that:

Pr(|X − E(X)| ≥ θ) ≤ Var(X)/θ²

Here E(X) = 2 and Var(X) = 4/3, and the interval [0.5, 3.5] is symmetric around E(X) with half-width θ = 3/2, so:

Pr(X ∉ [0.5, 3.5]) = Pr(|X − 2| ≥ 3/2) ≤ (4/3)/(3/2)² = 16/27
c)
Pr(X ∉ [0.5, 3.5]) = ∫₀^0.5 (1/4) dx + ∫_3.5^4 (1/4) dx = [x/4]₀^0.5 + [x/4]_3.5^4 = 1/8 + 1/8 = 1/4

This is lower than the Chebyshev bound of 16/27 from part b, as it must be: Chebyshev gives only an upper bound.
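A short simulation illustrates both parts together: the empirical tail probability for X ~ U[0, 4] sits well below the Chebyshev bound:

```python
import random

random.seed(1)
n = 200_000
outside = sum(1 for _ in range(n)
              if not (0.5 <= random.uniform(0, 4) <= 3.5))
p_hat = outside / n                       # exact value is 1/4
chebyshev_bound = (4 / 3) / (3 / 2) ** 2  # 16/27, from part b

print(p_hat)            # close to 0.25
print(chebyshev_bound)  # 0.5925...
```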
3. Suppose that the joint pdf of two random variables X and Y is f(x, y) = e^(−x−y) for x > 0, y > 0, and f(x, y) = 0 otherwise. (40 marks)
Since the exponential function is always positive, the pdf satisfies the non-negativity requirement. Therefore, we need only check that the pdf integrates to 1 over the entire sample space.
Notice that X and Y are independent: f(x, y) = e^(−x−y) = e^(−x)·e^(−y) = f_X(x)·f_Y(y)
a)
∫₀^∞ ∫₀^∞ e^(−x) e^(−y) dy dx = ∫₀^∞ e^(−x) dx · ∫₀^∞ e^(−y) dy = [−e^(−x)]₀^∞ · [−e^(−y)]₀^∞ = (0 + 1)(0 + 1) = 1
f(x) = f(x|y) = e^(−x)
By independence, the conditional pdf equals the marginal pdf, and from part a) both integrate to 1.
d)
Pr(X − Y > 1) = Pr(Y < X − 1) = ∫₁^∞ ∫₀^(x−1) e^(−x−y) dy dx = ∫₁^∞ e^(−x) (1 − e^(−(x−1))) dx = e^(−1) − e^(−1)/2 = e^(−1)/2 ≈ 0.184

Pr(X + Y > 1, X > Y) = ∫_0.5^1 ∫_(1−x)^x e^(−x−y) dy dx + ∫₁^∞ ∫₀^x e^(−x−y) dy dx = e^(−2)/2 + (e^(−1) − e^(−2)/2) = e^(−1) ≈ 0.368
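Evaluating the two integrals gives e^(−1)/2 ≈ 0.184 and e^(−1) ≈ 0.368; a Monte Carlo sketch with two independent standard exponentials agrees with both values:

```python
import random

random.seed(2)
n = 400_000
hits1 = hits2 = 0
for _ in range(n):
    x = random.expovariate(1.0)  # X ~ Exp(1)
    y = random.expovariate(1.0)  # Y ~ Exp(1), independent of X
    if x - y > 1:
        hits1 += 1
    if x + y > 1 and x > y:
        hits2 += 1

p1 = hits1 / n  # close to exp(-1)/2 ~ 0.184
p2 = hits2 / n  # close to exp(-1)   ~ 0.368
print(p1, p2)
```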
e)
Let us define Z = X/Y.
F(z) = Pr(Z ≤ z) = Pr(X/Y ≤ z) = Pr(X ≤ zY) = ∫₀^∞ ∫_(x/z)^∞ e^(−x−y) dy dx = ∫₀^∞ e^(−x) ∫_(x/z)^∞ e^(−y) dy dx

= ∫₀^∞ e^(−x) e^(−x/z) dx = ∫₀^∞ e^(−x(1+1/z)) dx = [−e^(−x(1+1/z))/(1 + 1/z)]₀^∞ = −(0 − 1/(1 + 1/z)) = z/(z + 1), for z > 0