
MA225 Probability Theory and Random Processes

July - November 2017

Problem Sheet 4 : Answers


1. Let X be the number of calls coming during a particular minute.
Therefore $X \sim \text{Poisson}\!\left(1000 \times \tfrac{1}{800}\right) = \text{Poisson}(1.25)$.
P{a call coming during a particular minute has to wait} $= P\{X > 24\}$
\[
= \sum_{i=25}^{\infty} e^{-1.25}\,\frac{(1.25)^{i}}{i!}.
\]
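
A quick numerical check of this tail probability (a sketch, using the rate 1.25 per minute found above):

```python
# Numerical check of P{X > 24} for X ~ Poisson(1.25) using scipy.
from scipy.stats import poisson

lam = 1.25                  # calls per minute: 1000 * (1/800)
tail = poisson.sf(24, lam)  # survival function: P{X > 24} = P{X >= 25}
print(tail)                 # an extremely small probability
```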

2. To find the value of c, we need $f(x) \ge 0$ for all $x \in \mathbb{R}$ and $\int_{-\infty}^{\infty} f(x)\,dx = 1$.
Here $f(x) \ge 0$ implies $c \ge 0$, and $\int_{1}^{\infty} f(x)\,dx = 1$ implies $c = 2$, so $f(x) = 2/x^{3}$ for $x > 1$.
\[
E(X) = \int_{1}^{\infty} x f(x)\,dx = \int_{1}^{\infty} x\,\frac{2}{x^{3}}\,dx = 2
\]
and
\[
E(X^{2}) = \int_{1}^{\infty} x^{2}\,\frac{2}{x^{3}}\,dx = \int_{1}^{\infty} \frac{2}{x}\,dx,
\]
which is an improper integral that does not converge.
Hence Var(X) does not exist.
Similarly, we can define
\[
f(x) =
\begin{cases}
\dfrac{1}{x^{2}}, & x > 1\\[4pt]
0, & \text{otherwise},
\end{cases}
\]
for which $E(X) = \int_{1}^{\infty} \frac{1}{x}\,dx$ diverges, so E(X) does not exist.
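
A short numerical sanity check of the normalization and of the divergence of the second moment (the cutoffs N below are illustrative only):

```python
# Quick check that c = 2 normalizes f(x) = 2/x^3 on (1, inf), that E(X) = 2,
# and that the second-moment integral grows like 2*log(N) without bound.
import numpy as np
from scipy.integrate import quad

f = lambda x: 2.0 / x**3
mass, _ = quad(f, 1, np.inf)                    # should be 1, so c = 2
mean, _ = quad(lambda x: x * f(x), 1, np.inf)   # should be 2
print(mass, mean)
for N in (1e2, 1e4, 1e6):
    second, _ = quad(lambda x: x**2 * f(x), 1, N)
    print(N, second)                            # ~ 2*log(N), diverging
```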
3. By definition, X is the indicator function of A, given by
\[
X = I_{A}(x) =
\begin{cases}
1, & x \in A\\
0, & x \notin A.
\end{cases}
\]
\[
E(X) = 1 \cdot P(A) + 0 \cdot P(A^{c}) = P(A)
\]
\[
\mathrm{Var}(X) = E(X^{2}) - [E(X)]^{2} = P(A) - [P(A)]^{2} = P(A)[1 - P(A)],
\]
since $X^{2} = X$ implies $E(X^{2}) = P(A)$.
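
A small simulation sketch of these two identities, with the illustrative event A = {U < 0.3} for a uniform U:

```python
# Monte Carlo illustration of E(I_A) = P(A) and Var(I_A) = P(A)[1 - P(A)],
# using the illustrative event A = {U < 0.3} for U uniform on (0, 1).
import numpy as np

rng = np.random.default_rng(0)
p = 0.3
x = (rng.random(1_000_000) < p).astype(float)  # samples of the indicator
print(x.mean(), x.var())                       # ~ 0.3 and ~ 0.21
```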

4. Given that Y = 1/X (equivalently X = 1/Y), where X has pdf $f_{X}(x) = 4x^{3}$ for $0 < x \le 1$.
Then
\[
\frac{dx}{dy} = -\frac{1}{y^{2}}, \qquad y \ge 1,
\]
and since $Y = 1/X = g(X)$ (say) with $g'(x) < 0$ for all $0 < x \le 1$, we have
\[
f_{Y}(y) = \left|\frac{dx}{dy}\right| f_{X}(x) = \left|\frac{1}{y^{2}}\right| \cdot \frac{4}{y^{3}}, \qquad y \ge 1.
\]

Therefore
\[
f_{Y}(y) =
\begin{cases}
\dfrac{1}{y^{2}} \cdot \dfrac{4}{y^{3}} = \dfrac{4}{y^{5}}, & y \ge 1\\[4pt]
0, & \text{otherwise},
\end{cases}
\]
and the distribution function is given by
\[
F_{Y}(y) =
\begin{cases}
0, & y < 1\\[2pt]
\displaystyle\int_{1}^{y} \frac{4}{u^{5}}\,du = 1 - \dfrac{1}{y^{4}}, & y \ge 1.
\end{cases}
\]
\[
E(Y) = \int_{1}^{\infty} y\,\frac{4}{y^{5}}\,dy = \frac{4}{3}
\]
\[
E(Y^{2}) = \int_{1}^{\infty} y^{2}\,\frac{4}{y^{5}}\,dy = 2
\]
Therefore
\[
\mathrm{Var}(Y) = 2 - \left(\frac{4}{3}\right)^{2} = \frac{2}{9}.
\]
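
A simulation sketch of this result; it assumes the pdf $f_X(x) = 4x^3$ on (0, 1] used above, so that X can be sampled by inverting $F_X(x) = x^4$:

```python
# Simulation check for Y = 1/X with f_X(x) = 4x^3 on (0, 1]: then F_X(x) = x^4,
# so X = U^(1/4) for U uniform, and Y should have mean 4/3 and variance 2/9.
import numpy as np

rng = np.random.default_rng(1)
x = rng.random(1_000_000) ** 0.25   # inverse-CDF sampling of X
y = 1.0 / x
print(y.mean(), y.var())            # ~ 1.3333 and ~ 0.2222
```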
5. Here X ∼ N(0, 1), $g(x) = x^{2}$, $g'(x) = 2x$, with
$g'(x) > 0$ for $x > 0$ and $g'(x) < 0$ for $x < 0$;
also note that for the given range of X, $Y = X^{2}$ varies from 0 to ∞.
The roots of $y - x^{2} = 0$ are $x_{1} = \sqrt{y}$ and $x_{2} = -\sqrt{y}$ for any $y \ge 0$,
and there is no solution for $y < 0$.
It follows that
\[
f_{Y}(y) =
\begin{cases}
\dfrac{f_{X}(x_{1})}{|g'(x_{1})|} + \dfrac{f_{X}(x_{2})}{|g'(x_{2})|}, & y > 0\\[4pt]
0, & y \le 0.
\end{cases}
\]
Thus
\[
f_{Y}(y) =
\begin{cases}
\dfrac{1}{2\sqrt{y}}\left[f_{X}(\sqrt{y}) + f_{X}(-\sqrt{y})\right] = \dfrac{1}{\sqrt{2\pi y}}\,e^{-y/2}, & y > 0\\[4pt]
0, & y \le 0.
\end{cases}
\]

Again, from basic principles,
\[
P\{Y \le y\} = P\{-\sqrt{y} \le X \le \sqrt{y}\} = F_{X}(\sqrt{y}) - F_{X}(-\sqrt{y}),
\]
where $F_X$ is the DF of X. Differentiating with respect to y, the pdf of Y is again
\[
f_{Y}(y) =
\begin{cases}
\dfrac{1}{\sqrt{2\pi y}}\,e^{-y/2}, & y > 0\\[4pt]
0, & y \le 0.
\end{cases}
\]
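
The density obtained is that of a chi-square distribution with one degree of freedom; a quick comparison of the formula against scipy's chi2 pdf:

```python
# The derived density equals the chi-square(1) density; compare numerically.
import numpy as np
from scipy.stats import chi2

y = np.linspace(0.1, 5.0, 50)
f_derived = np.exp(-y / 2) / np.sqrt(2 * np.pi * y)
print(np.max(np.abs(f_derived - chi2.pdf(y, df=1))))  # ~ 0
```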

6. The pdf of X is given by
\[
f_{X}(x) =
\begin{cases}
\dfrac{2}{3\pi}, & -\pi/2 < x < \pi\\[4pt]
0, & \text{otherwise}.
\end{cases}
\]

Here $g(x) = \sin x$, $g'(x) = \cos x$, and the pdf of $Y = \sin X$ is supported on $-1 \le y \le 1$.
First consider $0 < y < 1$.
The roots of $y - \sin x = 0$ in $(-\pi/2, \pi)$ are
$x_{1} = \sin^{-1}(y)$ and $x_{2} = \pi - \sin^{-1}(y)$.
Therefore,
\[
f_{Y}(y) = \frac{f_{X}(\sin^{-1}(y))}{\sqrt{1 - y^{2}}} + \frac{f_{X}(\pi - \sin^{-1}(y))}{\sqrt{1 - y^{2}}}
= \frac{2}{3\pi}\cdot\frac{2}{\sqrt{1 - y^{2}}}
= \frac{4}{3\pi}\,\frac{1}{\sqrt{1 - y^{2}}}, \qquad 0 < y < 1.
\]

Similarly, for $-1 < y < 0$ there is only one root in $(-\pi/2, \pi)$, namely $x_{1} = \sin^{-1}(y)$, so
\[
f_{Y}(y) = \frac{f_{X}(x_{1})}{|g'(x_{1})|} = \frac{2}{3\pi}\,\frac{1}{\sqrt{1 - y^{2}}}.
\]
Thus
\[
f_{Y}(y) =
\begin{cases}
\dfrac{4}{3\pi}\,\dfrac{1}{\sqrt{1 - y^{2}}}, & 0 < y < 1\\[6pt]
\dfrac{2}{3\pi}\,\dfrac{1}{\sqrt{1 - y^{2}}}, & -1 < y < 0\\[6pt]
0, & \text{otherwise}.
\end{cases}
\]
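
A simulation sketch, assuming X is uniform on (−π/2, π), which is what the constant density 2/(3π) above corresponds to:

```python
# Simulation check: with X uniform on (-pi/2, pi), the derived pdf of Y = sin X
# gives P{0 < Y < 1/2} = 2/9 and P{-1/2 < Y < 0} = 1/9.
import numpy as np

rng = np.random.default_rng(2)
y = np.sin(rng.uniform(-np.pi / 2, np.pi, 1_000_000))
print(np.mean((y > 0) & (y < 0.5)))    # ~ 2/9 = 0.2222
print(np.mean((y > -0.5) & (y < 0)))   # ~ 1/9 = 0.1111
```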

7. Let X be a nonnegative continuous random variable with pdf f and DF F. Then
\[
E(X) = \int_{0}^{\infty} x f(x)\,dx = \lim_{n \to \infty} \int_{0}^{n} x f(x)\,dx.
\]

On integration by parts, we obtain


\[
\int_{0}^{n} x f(x)\,dx = nF(n) - \int_{0}^{n} F(x)\,dx = -n[1 - F(n)] + \int_{0}^{n} [1 - F(x)]\,dx.
\]
But
\[
n[1 - F(n)] = n \int_{n}^{\infty} f(x)\,dx \le \int_{n}^{\infty} x f(x)\,dx,
\]
and since $E(|X|) < \infty$, $n[1 - F(n)] \to 0$ as $n \to \infty$.

Therefore,
\[
E(X) = \lim_{n \to \infty} \int_{0}^{n} [1 - F(x)]\,dx = \int_{0}^{\infty} [1 - F(x)]\,dx = \int_{0}^{\infty} P\{X \ge x\}\,dx.
\]
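
A numerical illustration of this identity, taking as an example an exponential random variable with rate 2 (so both sides should equal 1/2):

```python
# Check of E(X) = int_0^inf P{X >= x} dx for an exponential with rate 2.
import numpy as np
from scipy.integrate import quad
from scipy.stats import expon

dist = expon(scale=0.5)              # rate 2, so E(X) = 1/2
rhs, _ = quad(dist.sf, 0, np.inf)    # int_0^inf P{X >= x} dx
print(dist.mean(), rhs)              # both ~ 0.5
```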

8. Here X ∼ N(0, 1).

By definition,
\[
\begin{aligned}
E(X^{n+1}) &= \int_{-\infty}^{\infty} x^{n+1}\,\frac{1}{\sqrt{2\pi}}\,e^{-x^{2}/2}\,dx\\
&= \int_{-\infty}^{\infty} x^{n}\cdot\frac{1}{\sqrt{2\pi}}\,x e^{-x^{2}/2}\,dx\\
&= \int_{-\infty}^{\infty} x^{n}\cdot\frac{1}{\sqrt{2\pi}}\left(-e^{-x^{2}/2}\right)'dx\\
&= \frac{1}{\sqrt{2\pi}}\left[\left.x^{n}\left(-e^{-x^{2}/2}\right)\right|_{-\infty}^{\infty} + n\int_{-\infty}^{\infty} x^{n-1} e^{-x^{2}/2}\,dx\right]\\
&= n\,E(X^{n-1}), \qquad \text{for all } n \ge 1.
\end{aligned}
\]
Since $E(X) = 0$ and $E(X^{0}) = 1$, the recursive relation gives
\[
E(X^{2n}) = (2n-1)(2n-3)(2n-5)\cdots 3 \cdot 1, \qquad E(X^{2n+1}) = 0.
\]
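
A quick scipy check of the first few moments predicted by this recursion:

```python
# Even moments of N(0,1): E(X^2) = 1, E(X^4) = 3, E(X^6) = 15,
# matching (2n-1)(2n-3)...3.1; odd moments are 0.
from scipy.stats import norm

print(norm.moment(2), norm.moment(4), norm.moment(6))  # 1, 3, 15
print(norm.moment(3), norm.moment(5))                  # ~ 0, ~ 0
```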

9. Given that X ∼ N(−5, 4), i.e. mean −5 and variance 4.

Let $Z = \dfrac{X+5}{2}$, so Z ∼ N(0, 1).
\[
P\{X < 0\} = P\{2Z - 5 < 0\} = P\left\{Z < \frac{5}{2}\right\} = \Phi\!\left(\frac{5}{2}\right)
\]
\[
P\{-7 < X < -3\} = P\{-7 < 2Z - 5 < -3\} = P\{-1 < Z < 1\} = \Phi(1) - \Phi(-1) = \Phi(1) - [1 - \Phi(1)] = 2\Phi(1) - 1
\]
\[
P\{X > -3 \mid X > -5\} = \frac{P\{X > -3\}}{P\{X > -5\}} = \frac{P\{Z > 1\}}{P\{Z > 0\}} = \frac{1 - \Phi(1)}{1 - \Phi(0)} = 2[1 - \Phi(1)].
\]
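
Numerical values of these probabilities, reading N(−5, 4) as mean −5 and variance 4 (standard deviation 2, consistent with Z = (X + 5)/2 above):

```python
# Numerical values for Problem 9 with X ~ N(-5, 4) (sd = 2).
from scipy.stats import norm

X = norm(loc=-5, scale=2)
print(X.cdf(0))                  # Phi(5/2)        ~ 0.9938
print(X.cdf(-3) - X.cdf(-7))     # 2*Phi(1) - 1    ~ 0.6827
print(X.sf(-3) / X.sf(-5))       # 2*(1 - Phi(1))  ~ 0.3173
```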

10. The pdf of the Gamma distribution is given by
\[
f(x; \lambda, \alpha) =
\begin{cases}
\dfrac{\lambda^{\alpha} x^{\alpha-1} e^{-\lambda x}}{\Gamma(\alpha)}, & x > 0\\[4pt]
0, & \text{otherwise}.
\end{cases}
\]
\[
E(X) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_{0}^{\infty} x \cdot x^{\alpha-1} e^{-\lambda x}\,dx
= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_{0}^{\infty} x^{\alpha} e^{-\lambda x}\,dx
= \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\,\frac{\Gamma(\alpha+1)}{\lambda^{\alpha+1}}
= \frac{\alpha}{\lambda}
\]
\[
E(X^{2}) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_{0}^{\infty} x^{2} \cdot x^{\alpha-1} e^{-\lambda x}\,dx
= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_{0}^{\infty} x^{\alpha+1} e^{-\lambda x}\,dx
= \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\,\frac{\Gamma(\alpha+2)}{\lambda^{\alpha+2}}
= \frac{(\alpha+1)\alpha}{\lambda^{2}}
\]
\[
\mathrm{Var}(X) = \frac{(\alpha+1)\alpha}{\lambda^{2}} - \left(\frac{\alpha}{\lambda}\right)^{2} = \frac{\alpha}{\lambda^{2}}.
\]
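
A quick scipy check of these formulas for one illustrative choice of parameters (α = 3, λ = 2); note that scipy parametrizes the Gamma by scale = 1/λ:

```python
# Check of the Gamma(alpha, lambda) mean and variance for alpha = 3, lambda = 2.
from scipy.stats import gamma

alpha, lam = 3.0, 2.0
dist = gamma(a=alpha, scale=1 / lam)
print(dist.mean(), alpha / lam)      # both 1.5
print(dist.var(), alpha / lam**2)    # both 0.75
```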
11. Here X has pdf $f_{X}(x) = 2x$ for $0 < x < 1$ (as used below), and
\[
Y = \min(X, 1/2) =
\begin{cases}
X, & \text{if } X < 1/2\\
1/2, & \text{if } X \ge 1/2.
\end{cases}
\]
\[
F_{Y}(y) = P\{Y \le y\} =
\begin{cases}
0, & \text{if } y < 0\\[2pt]
\displaystyle\int_{0}^{y} 2x\,dx = y^{2}, & \text{if } 0 \le y < 1/2\\[4pt]
1, & \text{if } y \ge 1/2.
\end{cases}
\]
\[
P\left\{\tfrac{1}{4} \le Y \le \tfrac{3}{8}\right\} = F_{Y}\!\left(\tfrac{3}{8}\right) - F_{Y}\!\left(\tfrac{1}{4}\right) + P\left\{Y = \tfrac{1}{4}\right\} = \frac{5}{64}
\]
\[
P\left\{Y \ge \tfrac{1}{4}\right\} = 1 - F_{Y}\!\left(\tfrac{1}{4}\right) + P\left\{Y = \tfrac{1}{4}\right\} = \frac{15}{16}
\]
Any distribution function F can be decomposed into two parts as
\[
F(x) = aF_{d}(x) + (1-a)F_{c}(x), \qquad 0 \le a \le 1,
\]
where $F_d$ is the DF of a discrete RV and $F_c$ is the DF of a continuous RV.

Here
\[
F_{Y}(y) = \frac{3}{4} F_{Y_{d}}(y) + \frac{1}{4} F_{Y_{c}}(y),
\]
where the DF of the discrete part is given by
\[
F_{Y_{d}}(y) =
\begin{cases}
0, & y < \tfrac{1}{2}\\
1, & y \ge \tfrac{1}{2},
\end{cases}
\]
so $P\{Y_{d} = \tfrac{1}{2}\} = 1$, meaning $E(Y_{d}) = \tfrac{1}{2}$,
and the DF of the continuous part is given by
\[
F_{Y_{c}}(y) =
\begin{cases}
0, & y < 0\\
4y^{2}, & 0 \le y < \tfrac{1}{2}\\
1, & y \ge \tfrac{1}{2},
\end{cases}
\]
with pdf
\[
f_{Y_{c}}(y) =
\begin{cases}
8y, & 0 \le y \le \tfrac{1}{2}\\
0, & \text{otherwise},
\end{cases}
\]
and mean $E(Y_{c}) = \tfrac{1}{3}$.
Hence
\[
E(Y) = \frac{3}{4}\cdot\frac{1}{2} + \frac{1}{4}\cdot\frac{1}{3} = \frac{11}{24}.
\]
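
A simulation sketch of this problem; it assumes $f_X(x) = 2x$ on (0, 1) as above, so X can be sampled as $\sqrt{U}$ for uniform U:

```python
# Simulation check for Y = min(X, 1/2) with f_X(x) = 2x on (0, 1): F_X(x) = x^2,
# so X = sqrt(U); the mean of Y should be 11/24.
import numpy as np

rng = np.random.default_rng(3)
y = np.minimum(np.sqrt(rng.random(1_000_000)), 0.5)
print(np.mean((y >= 0.25) & (y <= 0.375)))  # ~ 5/64  = 0.078125
print(np.mean(y >= 0.25))                   # ~ 15/16 = 0.9375
print(y.mean())                             # ~ 11/24 = 0.4583
```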
12. Proof of Markov's inequality:
Let X be a nonnegative random variable and let a be any positive real number.
Define a random variable Y by
\[
Y =
\begin{cases}
0, & \text{if } X < a\\
a, & \text{if } X \ge a.
\end{cases}
\]
Then
\[
E(Y) = 0 \cdot P\{X < a\} + a \cdot P\{X \ge a\} = a\,P\{X \ge a\}.
\]
Clearly $X \ge Y$, which implies $E(X) \ge E(Y)$; therefore $E(X) \ge a\,P\{X \ge a\}$, or
\[
P\{X \ge a\} \le \frac{E(X)}{a}.
\]

13. Proof of Chebyshev's inequality:
Using Markov's inequality, we have
\[
P\{|X - \mu| \ge a\} = P\{|X - \mu|^{2} \ge a^{2}\} \le \frac{E(|X - \mu|^{2})}{a^{2}} = \frac{\sigma^{2}}{a^{2}}.
\]
The result follows on putting $a = \sigma t$.

14. Consider
\[
P\{X = 1\} = \frac{1}{2t^{2}}, \qquad P\{X = 0\} = 1 - \frac{1}{t^{2}}, \qquad P\{X = -1\} = \frac{1}{2t^{2}}.
\]
Therefore $E(X) = 0$ and $\mathrm{Var}(X) = \dfrac{1}{t^{2}}$, so $\sigma = \dfrac{1}{t}$ and
\[
P\{|X - 0| \ge \sigma t\} = P\{|X| \ge 1\} = \frac{1}{t^{2}},
\]
so the bound in Chebyshev's inequality cannot be reduced further.
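
A small numerical verification that this distribution attains the Chebyshev bound, for a few illustrative values of t:

```python
# The distribution above has mean 0 and sigma = 1/t, so {|X| >= sigma*t} is
# the event {|X| >= 1}, whose probability is exactly 1/t^2 (the Chebyshev bound).
import numpy as np

for t in (2.0, 5.0, 10.0):
    values = np.array([-1.0, 0.0, 1.0])
    probs = np.array([1 / (2 * t**2), 1 - 1 / t**2, 1 / (2 * t**2)])
    var = probs @ values**2 - (probs @ values) ** 2
    print(t, var, 1 / t**2)                              # variance equals 1/t^2
    print(probs[np.abs(values) >= 1.0].sum(), 1 / t**2)  # equality in the bound
```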
