
MA225 - Tutorial 9 - Solutions

1. Here the randomness of Xn comes from the random variable R; in particular, for R = r,

Xn = 1000(1 + r)^n , n = 0, 1, 2, . . .

The sample functions are of the form f(n) = 1000(1 + r)^n, n = 0, 1, 2, . . ., where r ∈ (0.04, 0.05). The random variable X3 is given by

X3 = 1000(1 + R)^3

Let 1 + R = Y ∼ U(1.04, 1.05). The PDF of Y is given by

fY(y) = 100 for 1.04 < y < 1.05, and 0 otherwise.

E(X3) = 1000 E(Y^3) = 1000 ∫_{1.04}^{1.05} y^3 · 100 dy ≈ 1141.2

More generally,

E(Xn) = 1000 ∫_{1.04}^{1.05} y^n · 100 dy = (10^5 / (n+1)) [(1.05)^{n+1} − (1.04)^{n+1}]
Cov(Xn, Xm) = E(Xn Xm) − E(Xn) E(Xm)

E(Xn Xm) = 10^6 E(Y^{n+m}) = 10^6 ∫_{1.04}^{1.05} y^{n+m} · 100 dy
= (10^8 / (n+m+1)) [(1.05)^{n+m+1} − (1.04)^{n+m+1}]

Therefore

Cov(Xn, Xm) = (10^8 / (n+m+1)) [(1.05)^{n+m+1} − (1.04)^{n+m+1}]
− (10^5 / (n+1)) [(1.05)^{n+1} − (1.04)^{n+1}] · (10^5 / (m+1)) [(1.05)^{m+1} − (1.04)^{m+1}]
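The closed-form moments above can be checked numerically; a minimal sketch (function names are illustrative, not from the problem set):

```python
# Numerical check of the closed-form moments derived above for
# X_n = 1000*(1 + R)^n with R ~ Uniform(0.04, 0.05), i.e. Y = 1 + R ~ U(1.04, 1.05).

def E_Xn(n):
    # E[X_n] = 10^5 / (n+1) * [(1.05)^(n+1) - (1.04)^(n+1)]
    return 1e5 / (n + 1) * (1.05 ** (n + 1) - 1.04 ** (n + 1))

def E_XnXm(n, m):
    # E[X_n X_m] = 10^8 / (n+m+1) * [(1.05)^(n+m+1) - (1.04)^(n+m+1)]
    return 1e8 / (n + m + 1) * (1.05 ** (n + m + 1) - 1.04 ** (n + m + 1))

def cov_XnXm(n, m):
    return E_XnXm(n, m) - E_Xn(n) * E_Xn(m)

print(round(E_Xn(3), 1))   # ≈ 1141.2, matching E(X3) above
```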

2. To show {Xt , t ∈ T } is a strictly stationary process we need to show that ∀ t1 , t2 , . . . , tn ∈ R and ∀ h ∈ R,

(Xt1 , Xt2 , . . . , Xtn ) ≡d (Xt1+h , Xt2+h , . . . , Xtn+h ).

Since the process is iid, the distributions of (Xt1 , Xt2 , . . . , Xtn ) and (Xt1+h , Xt2+h , . . . , Xtn+h ) are the same, so the process is strictly stationary. For X to also be a Gaussian process, Xt has to be normal for each t; by independence the joint distributions are then multivariate normal.

3. To show the process is weakly stationary, we need to check

(i) E[X(t)] = µX(t) = µX , ∀ t ∈ R,
(ii) Cov[X(t), X(s)] = f(|t − s|), ∀ t, s ∈ R.

E[X(t)] = E[cos(t + U)] = ∫_{0}^{2π} cos(t + u) (1/2π) du = 0, ∀ t ∈ R

Cov[X(t), X(s)] = E[X(t)X(s)] − E[X(t)]E[X(s)]

E[X(t)X(s)] = E[cos(t + U) cos(s + U)] = (1/2) E[cos(t + s + 2U) + cos(t − s)]
= (1/2) cos(t − s), ∀ t, s ∈ R, since E[cos(t + s + 2U)] = 0.

Hence {X(t), t ∈ R} is a weakly stationary process.
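The mean and covariance computed above can be verified by Monte Carlo; a sketch (sample sizes and the choice of t, s are arbitrary):

```python
import numpy as np

# Monte Carlo check for X(t) = cos(t + U), U ~ Uniform(0, 2*pi):
# the mean should be 0 and E[X(t)X(s)] should equal (1/2) cos(t - s).
rng = np.random.default_rng(0)
U = rng.uniform(0, 2 * np.pi, size=1_000_000)

t, s = 1.3, 0.4
Xt, Xs = np.cos(t + U), np.cos(s + U)

print(Xt.mean())             # ≈ 0
print(np.mean(Xt * Xs))      # ≈ 0.5 * cos(t - s)
print(0.5 * np.cos(t - s))
```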


4. We need to show that ∀ t1 , t2 , . . . , tn ∈ R and ∀ h ∈ R,

(Xt1 , Xt2 , . . . , Xtn ) ≡d (Xt1+h , Xt2+h , . . . , Xtn+h ).

Since the process is Gaussian, it is sufficient to show that the mean vectors and covariance matrices of the two vectors are the same. Since the process is covariance stationary,

E[Xti ] = E[Xti+h ] = µX ∀ i,

Cov[Xti , Xtj ] = f(|ti − tj |) = Cov[Xti+h , Xtj+h ] ∀ i, j.

From the above we conclude that the mean vector and covariance matrix of the two vectors are the same, and hence the process is strictly stationary.
5. Note that for any k ∈ Z, gk(t) = 0 outside the interval (k, k + 1], and for k < t ≤ k + 1, X(t) = Ak . Thus

E[X(t)] = E[Ak ] = 1 ∀ t ∈ R.

If for some k ∈ Z, k < t, s ≤ k + 1, then

Cov[X(t), X(s)] = E[X(t)X(s)] − E[X(t)]E[X(s)],

E[X(t)X(s)] = E[Ak^2] = Var[Ak ] + (E[Ak ])^2 = 1 + 1 = 2,

∴ Cov[X(t), X(s)] = 2 − 1 = 1.

Again, if for some l, k ∈ Z with l ≠ k, l < s ≤ l + 1 and k < t ≤ k + 1, then

E[X(t)X(s)] = E[Ak Al ] = E[Ak ]E[Al ] = 1,

∴ Cov[X(t), X(s)] = 0.
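A quick simulation illustrates the two cases; a sketch that assumes Ak ∼ Normal(1, 1) (any iid family with E[Ak] = 1 and Var[Ak] = 1 would do):

```python
import numpy as np

# X(t) = A_k for k < t <= k+1, with iid A_k; here A_k ~ Normal(mean=1, var=1)
# is an assumed choice matching E[A_k] = 1 and E[A_k^2] = 2.
rng = np.random.default_rng(1)
A = rng.normal(1.0, 1.0, size=(200_000, 2))   # columns: A_0, A_1

X_t = A[:, 0]   # t in (0, 1]
X_s = A[:, 0]   # s in the same interval (0, 1]
X_u = A[:, 1]   # u in a different interval (1, 2]

print(np.cov(X_t, X_s)[0, 1])   # ≈ 1 (same interval)
print(np.cov(X_t, X_u)[0, 1])   # ≈ 0 (different intervals)
```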

2
6.

E[Xn+1 |X1 , X2 , . . . , Xn ] = E[Yn+1 Xn |X1 , X2 , . . . , Xn ]
= Xn E[Yn+1 |X1 , X2 , . . . , Xn ]
= Xn E[Yn+1 ] [∵ Yi ’s are iid]
= Xn [∵ E[Yn+1 ] = 1]

Hence the process is a martingale.
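A martingale has constant mean, which can be checked by simulation; a sketch assuming Yi ∼ Uniform(0.5, 1.5) (an illustrative choice with E[Yi] = 1):

```python
import numpy as np

# X_{n+1} = Y_{n+1} X_n with iid Y_i, E[Y_i] = 1; here Y_i ~ Uniform(0.5, 1.5)
# is an assumed choice.  E[X_n] should stay at X_0 = 1 for every n.
rng = np.random.default_rng(2)
n_paths, n_steps = 500_000, 5
Y = rng.uniform(0.5, 1.5, size=(n_paths, n_steps))
X = np.cumprod(Y, axis=1)       # X_1, ..., X_5 with X_0 = 1

print(X.mean(axis=0))           # each entry ≈ 1
```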

7. Let Zn denote the gain of the gambler in the nth game: the gambler gains 1 unit of money with probability p and loses 1 unit of money with probability q = 1 − p. Let Xn be the gambler’s fortune after the nth game. Therefore,

Xn = Z1 + Z2 + . . . + Zn .

Now E[Zi ] = p − q for all i, which is equal to zero iff p = q; in that case {Xn , n ≥ 0} is a martingale, since

E[Xn+1 |X1 , X2 , . . . , Xn ] = E[Zn+1 + Xn |X1 , X2 , . . . , Xn ]
= Xn + E[Zn+1 |X1 , X2 , . . . , Xn ]
= Xn + E[Zn+1 ] [∵ Zi ’s are iid]
= Xn [∵ E[Zn+1 ] = 0]

Note: A stochastic process {Xn , n ≥ 0} with E[|Xn |] < ∞ is called a sub-martingale if

E[Xn+1 |X1 , X2 , . . . , Xn ] ≥ Xn

and a super-martingale if

E[Xn+1 |X1 , X2 , . . . , Xn ] ≤ Xn .

When p < 1/2, the gambler’s fortune {Xn , n ≥ 0} is a super-martingale. When p > 1/2, it is a sub-martingale (check it).
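The fair and unfair cases can be compared empirically; a sketch (the helper name and parameters are illustrative):

```python
import numpy as np

# Gambler's fortune X_n = Z_1 + ... + Z_n, where Z_i = +1 w.p. p and -1 w.p. q.
# With p = 1/2 the mean stays at 0 (martingale); with p < 1/2 it drifts down
# by (p - q) per game (super-martingale).
rng = np.random.default_rng(3)

def fortune_mean(p, n_steps=20, n_paths=200_000):
    Z = np.where(rng.random((n_paths, n_steps)) < p, 1, -1)
    return np.cumsum(Z, axis=1).mean(axis=0)

print(fortune_mean(0.5)[-1])    # ≈ 0
print(fortune_mean(0.4)[-1])    # ≈ 20 * (0.4 - 0.6) = -4
```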

8. Here the inter-occurrence times are iid geometric random variables with parameter 1/6. Hence the number of occurrences of the event E in m trials is a renewal process.

9. Let {X(t), t ≥ 0} with X(0) = 0 be a process having independent increments.

Cov[X(s), X(t)] = E[X(t)X(s)] − E[X(t)]E[X(s)]

For t < s,

E[X(t)X(s)] = E[X(t){X(t) + X(s) − X(t)}]
= E[X(t)^2 ] + E[X(t)] E[X(s) − X(t)]
(since the process has independent increments and X(0) = 0)
= E[X(t)^2 ] + E[X(t)](E[X(s)] − E[X(t)])

Cov[X(s), X(t)] = E[X(t)^2 ] + E[X(t)](E[X(s)] − E[X(t)]) − E[X(t)]E[X(s)]
= E[X(t)^2 ] − (E[X(t)])^2 = Var[X(t)]

Therefore

Cov[X(s), X(t)] = Var[X(min{s, t})],

which depends on min{s, t} rather than on |t − s|; hence the process is not covariance stationary.
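The identity Cov[X(s), X(t)] = Var[X(min{s, t})] can be checked on a concrete example; a sketch using a Gaussian random walk (an assumed choice of process with independent increments and X(0) = 0):

```python
import numpy as np

# Gaussian random walk: X(n) = sum of n iid N(0, 1) steps, so X(0) = 0,
# independent increments, and Var[X(n)] = n.
rng = np.random.default_rng(4)
steps = rng.normal(0.0, 1.0, size=(300_000, 10))
X = np.cumsum(steps, axis=1)           # columns: X(1), ..., X(10)

s_idx, t_idx = 2, 7                    # X(3) and X(8), so min{s, t} = 3
cov = np.cov(X[:, s_idx], X[:, t_idx])[0, 1]
print(cov)                             # ≈ Var[X(3)] = 3
```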

10. Let U(t) = aW1(t) + bW2(t). We need to check that U(t) satisfies all the properties of Brownian motion.

(i) U(t) has continuous sample paths a.s., since W1(t) and W2(t) have continuous sample paths a.s.
(ii) U(0) = aW1(0) + bW2(0) = 0, since W1(0) = W2(0) = 0.
(iii) U(t) has independent and stationary increments, since W1(t) and W2(t) do and they are independent of each other.
(iv) Since W1(t) − W1(s) and W2(t) − W2(s) follow normal distributions with mean 0 and variance t − s, U(t) − U(s) also follows a normal distribution, with

E[U(t) − U(s)] = aE[W1(t) − W1(s)] + bE[W2(t) − W2(s)] = 0,

Var[U(t) − U(s)] = a^2 Var[W1(t) − W1(s)] + b^2 Var[W2(t) − W2(s)] = (a^2 + b^2)(t − s).

So, for U(t) to satisfy the properties of Brownian motion (increments with variance t − s), we must have a^2 + b^2 = 1.
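The variance condition can be illustrated by simulating a single time point; a sketch with the assumed choice a = 0.6, b = 0.8 (so a^2 + b^2 = 1):

```python
import numpy as np

# U(t) = a*W1(t) + b*W2(t) for independent standard Brownian motions W1, W2.
# At a fixed time t, W1(t) and W2(t) are independent N(0, t), so with
# a^2 + b^2 = 1 we should observe Var[U(t)] = t.
rng = np.random.default_rng(5)
a, b = 0.6, 0.8                       # a^2 + b^2 = 1 (assumed illustrative values)
t = 2.0
n = 500_000
W1_t = rng.normal(0.0, np.sqrt(t), size=n)
W2_t = rng.normal(0.0, np.sqrt(t), size=n)
U_t = a * W1_t + b * W2_t

print(U_t.mean())                     # ≈ 0
print(U_t.var())                      # ≈ t = 2
```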
