Spring 2016
Problem Set 4
Author: Xiaobin Gao
1. Problem 1
Proof. Let $X$ be a random vector such that
$$X = \begin{bmatrix} X_1 \\ X_2 \\ X_3 \\ X_4 \end{bmatrix}$$
Denote by $\Phi_X(u)$ the characteristic function of $X$, where $u = [u_1\ u_2\ u_3\ u_4]^T$. Then,
$$\Phi_X(u) = E[\exp(ju^T X)] = E[\exp(j(u_1 X_1 + u_2 X_2 + u_3 X_3 + u_4 X_4))] = \sum_{k=0}^{\infty} \frac{j^k}{k!}\, E\big[(u_1 X_1 + u_2 X_2 + u_3 X_3 + u_4 X_4)^k\big] \tag{1}$$
On the other hand, since $X$ has a jointly Gaussian distribution with zero mean,
$$\Phi_X(u) = \exp\Big(-\frac{1}{2} u^T K u\Big)$$
where $K$ is the covariance matrix such that the $ij$-th element of $K$, denoted by $K_{ij}$, satisfies $K_{ij} = \mathrm{Cov}(X_i, X_j)$. Furthermore, $K_{ij} = K_{ji}$, and
$$\mathrm{Cov}(X_i, X_j) = E[X_i X_j] - E[X_i]E[X_j] = E[X_i X_j]$$
Hence,
$$\frac{\partial^4 \Phi_X(u)}{\partial u_1\,\partial u_2\,\partial u_3\,\partial u_4}\bigg|_{u_1,u_2,u_3,u_4 = 0} = \frac{\partial^4}{\partial u_1\,\partial u_2\,\partial u_3\,\partial u_4} \exp\Big(-\frac{1}{2} u^T K u\Big)\bigg|_{u_1,u_2,u_3,u_4 = 0} \tag{2}$$
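Carrying out the differentiation is a supplementary step not spelled out here, but it yields the standard fourth-moment identity for zero-mean jointly Gaussian vectors:

```latex
E[X_1 X_2 X_3 X_4] = K_{12}K_{34} + K_{13}K_{24} + K_{14}K_{23}
```

Indeed, by (1) the mixed fourth partial derivative of $\Phi_X$ at $u = 0$ equals $j^4\, E[X_1 X_2 X_3 X_4] = E[X_1 X_2 X_3 X_4]$, while expanding $\exp(-\frac{1}{2}u^T K u)$ to second order in $u^T K u$ and collecting the coefficient of $u_1 u_2 u_3 u_4$ gives the three pairings above.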
2. Problem 2
(a) Since $X$ and $Y$ have a jointly Gaussian distribution, $X + Y$ has a Gaussian distribution with mean
$$[1\ 1]\begin{bmatrix}1\\1\end{bmatrix} = 2 \qquad \text{and variance} \qquad [1\ 1]\begin{bmatrix}5 & -1\\ -1 & 1\end{bmatrix}\begin{bmatrix}1\\1\end{bmatrix} = 4.$$
Hence,
$$P(X + Y > 4) = Q\Big(\frac{4 - 2}{\sqrt{4}}\Big) = Q(1)$$
(b) Conditioned on the event that $Y = y$, $X$ has a Gaussian distribution with mean $E[X \mid Y = y]$ and variance $\mathrm{Cov}(e)$, where
$$E[X \mid Y = y] = E[X] + \mathrm{Cov}(X, Y)\,\mathrm{Cov}(Y, Y)^{-1}(y - E[Y]) = 2 - y$$
$$\mathrm{Cov}(e) = \mathrm{Cov}(X, X) - \mathrm{Cov}(X, Y)\,\mathrm{Cov}(Y, Y)^{-1}\,\mathrm{Cov}(Y, X) = 4$$
Hence,
$$f_{X|Y}(x \mid y) = \frac{1}{\sqrt{8\pi}} \exp\Big(-\frac{(x + y - 2)^2}{8}\Big)$$
(c)
$$E[X^2 \mid Y] = \mathrm{Var}(X \mid Y) + (E[X \mid Y])^2 = 4 + (2 - Y)^2$$
(d)
$$P(X \ge 2 \mid Y = 2) = Q\Big(\frac{2 - (2 - y)}{\sqrt{4}}\Big)\bigg|_{y = 2} = Q(1)$$
(e)
$$E[X^2 Y] = E\big[E[X^2 Y \mid Y]\big] = E\big[Y\, E[X^2 \mid Y]\big] = E[4Y + (2 - Y)^2 Y] = E[Y^3] - 4E[Y^2] + 8E[Y] = 4$$
where $E[Y^3]$ can be obtained from the moment generating function of $Y$. Hence,
$$\mathrm{Cov}(X^2, Y) = E[X^2 Y] - E[X^2]E[Y] = 4 - 6 \cdot 1 = -2$$
Then,
$$\hat{E}[X^2 \mid Y] = E[X^2] + \mathrm{Cov}(X^2, Y)\,\mathrm{Cov}(Y, Y)^{-1}(Y - E[Y]) = 8 - 2Y$$
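The third moment used above can be made explicit (a supplementary computation, assuming $Y \sim N(1, 1)$ as in the earlier parts):

```latex
M_Y(s) = \exp\!\Big(s + \frac{s^2}{2}\Big), \qquad
M_Y'''(s) = \big[3(1+s) + (1+s)^3\big]\, M_Y(s), \qquad
E[Y^3] = M_Y'''(0) = 3 + 1 = 4
```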
3. Problem 3
$$R_X(s, t) = \begin{cases} \dfrac{2\sin(10(s - t))}{5(s - t)}, & \text{if } s \neq t \\[4pt] 4, & \text{if } s = t \end{cases}$$
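A quick numerical sanity check (a sketch, taking the formula exactly as written above): the value 4 assigned at $s = t$ is precisely the limit of the $s \neq t$ expression, so the autocorrelation is continuous at lag zero.

```python
import math

def R_X(tau: float) -> float:
    """Autocorrelation from Problem 3: 2*sin(10*tau)/(5*tau) for tau != 0, else 4."""
    if tau == 0.0:
        return 4.0
    return 2.0 * math.sin(10.0 * tau) / (5.0 * tau)

# As tau -> 0, 2*sin(10*tau)/(5*tau) -> 2*10/5 = 4, matching the s = t value.
print(R_X(0.0))   # 4.0
print(R_X(1e-6))  # approximately 4.0
```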
[Figure: plot of $X_t$ versus $t$ for $-10 \le t \le 10$.]
4. Problem 4
and
$$E\Big[\frac{1}{S}\Big] = \int_0^\infty \frac{1}{s} \cdot \frac{s}{\sigma^2}\, e^{-s^2/(2\sigma^2)}\, ds = \sqrt{\frac{\pi}{2\sigma^2}}$$
Hence,
$$E[A] = R\,\sqrt{\frac{\pi}{2\sigma^2}}$$
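The integral above can be spot-checked numerically (a sketch, assuming $S$ has the Rayleigh density $f_S(s) = (s/\sigma^2)e^{-s^2/(2\sigma^2)}$, which is what the integrand suggests):

```python
import math

def expected_reciprocal(sigma: float, n: int = 200000, upper: float = 50.0) -> float:
    """Midpoint-rule approximation of E[1/S] = int_0^inf (1/s)*(s/sigma^2)*exp(-s^2/(2 sigma^2)) ds.
    The 1/s and s factors cancel, so the integrand is smooth at 0."""
    h = upper / n
    total = 0.0
    for i in range(n):
        s = (i + 0.5) * h
        total += (1.0 / sigma**2) * math.exp(-s * s / (2.0 * sigma**2)) * h
    return total

sigma = 1.5  # arbitrary scale parameter for the check
closed_form = math.sqrt(math.pi / (2.0 * sigma**2))
print(abs(expected_reciprocal(sigma) - closed_form) < 1e-6)  # True
```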
5. Problem 5
(a)
$$P\Big(W_3 \ge \frac{W_2 + W_4}{2} + 1\Big) = P\Big(\frac{W_3 - W_2}{2} - \frac{W_4 - W_3}{2} \ge 1\Big) = Q\Big(\frac{1}{\sqrt{1/2}}\Big) = Q(\sqrt{2})$$
since $W_3 - W_2$ and $W_4 - W_3$ are independent $N(0, 1)$ increments, so $\frac{W_3 - W_2}{2} - \frac{W_4 - W_3}{2} \sim N(0, 1/2)$.
(b) Since $W_t/\sqrt{t} \sim N(0, 1)$ for every $t > 0$, $W_t^2/t \sim \chi^2(1)$ for every $t > 0$. Hence,
$$\frac{W_t^2}{t} \xrightarrow{d} \chi^2(1).$$
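Part (a) can be spot-checked by Monte Carlo (a sketch; the simulated event frequency should be close to $Q(\sqrt{2}) \approx 0.0786$):

```python
import math
import random

def Q(x: float) -> float:
    """Standard normal tail probability Q(x) = P(Z > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

random.seed(0)
n = 200000
hits = 0
for _ in range(n):
    # Independent standard Brownian increments over [0,2], [2,3], [3,4].
    w2 = random.gauss(0.0, math.sqrt(2.0))
    w3 = w2 + random.gauss(0.0, 1.0)
    w4 = w3 + random.gauss(0.0, 1.0)
    if w3 >= (w2 + w4) / 2.0 + 1.0:
        hits += 1

estimate = hits / n
print(round(estimate, 3), round(Q(math.sqrt(2.0)), 3))
```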
6. Problem 6
$$\begin{aligned}
P(N_1 \ge 1, N_2 = 2) &= P(N_1 = 1, N_2 = 2) + P(N_1 = 2, N_2 = 2) \\
&= P(N_1 - N_0 = 1)\,P(N_2 - N_1 = 1) + P(N_1 - N_0 = 2)\,P(N_2 - N_1 = 0) \\
&= \lambda e^{-\lambda} \cdot \lambda e^{-\lambda} + \frac{\lambda^2}{2} e^{-\lambda} \cdot e^{-\lambda} = \frac{3\lambda^2}{2} e^{-2\lambda}
\end{aligned}$$
Furthermore,
$$P(N_1 \ge 1) = 1 - P(N_1 = 0) = 1 - P(N_1 - N_0 = 0) = 1 - e^{-\lambda}$$
and
$$P(N_2 = 2) = P(N_2 - N_0 = 2) = \frac{(2\lambda)^2}{2!} e^{-2\lambda} = 2\lambda^2 e^{-2\lambda}$$
(a)
$$P(N_1 \ge 1 \mid N_2 = 2) = \frac{P(N_1 \ge 1, N_2 = 2)}{P(N_2 = 2)} = \frac{3}{4}$$
(b)
$$P(N_2 = 2 \mid N_1 \ge 1) = \frac{P(N_1 \ge 1, N_2 = 2)}{P(N_1 \ge 1)} = \frac{3\lambda^2 e^{-2\lambda}}{2(1 - e^{-\lambda})}$$
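The algebra above can be verified numerically for a concrete rate (a sketch; $\lambda = 0.7$ is an arbitrary choice):

```python
import math

lam = 0.7  # arbitrary Poisson rate, chosen only to check the algebra

def poisson_pmf(k: int, rate: float) -> float:
    """P(Poisson(rate) = k)."""
    return rate**k * math.exp(-rate) / math.factorial(k)

# P(N1 >= 1, N2 = 2) via independent increments on (0,1] and (1,2].
joint = (poisson_pmf(1, lam) * poisson_pmf(1, lam)
         + poisson_pmf(2, lam) * poisson_pmf(0, lam))

p_n1_ge_1 = 1.0 - poisson_pmf(0, lam)
p_n2_eq_2 = poisson_pmf(2, 2.0 * lam)

print(abs(joint - 1.5 * lam**2 * math.exp(-2.0 * lam)) < 1e-12)  # True
print(abs(joint / p_n2_eq_2 - 0.75) < 1e-12)                     # True
```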
7. Problem 7
(a) False. Since
$$X_2 - X_{1.5} = 2\cos(\Theta) = X_1 - X_{0.5},$$
$X_2 - X_{1.5}$ and $X_1 - X_{0.5}$ are not independent.
(b) True. Since $(X_t)$ and $(Y_t)$ are independent,
$$\mu_Z(t) = E[Z_t] = E[X_t]E[Y_t] = \mu_X(t)\mu_Y(t)$$
and
$$R_Z(s, t) = E[Z_s Z_t] = E[X_s X_t]E[Y_s Y_t] = R_X(s, t)R_Y(s, t)$$
Furthermore, since $(X_t)$ and $(Y_t)$ are WSS, $\mu_X(t)$ and $\mu_Y(t)$ are constant and $R_X(s, t)$, $R_Y(s, t)$ depend on $s$ and $t$ only through $s - t$. Hence, $\mu_Z(t)$ is constant and $R_Z(s, t)$ depends on $s$ and $t$ only through $s - t$, which implies that $(Z_t)$ is also WSS.
(c) False. A standard Brownian motion is a martingale (since $P(X_0 = 0) = 1$ and $(X_t)$ has independent increments). However, a standard Brownian motion is not WSS: $R_X(s, t) = \sigma^2 \min\{s, t\} \neq \sigma^2 \min\{s + k, t + k\} = R_X(s + k, t + k)$.
8. Problem 8
(a) The characteristic function of $Y_k$, denoted by $\Phi_{Y_k}(u)$, can be computed as
$$\Phi_{Y_k}(u) = E[e^{juY_k}] = E[e^{ju(X_k + X_{k+1})}] = E[e^{juX_k}]E[e^{juX_{k+1}}] = \exp(\lambda(e^{ju} - 1)) \exp(\lambda(e^{ju} - 1)) = \exp(2\lambda(e^{ju} - 1))$$
which is the characteristic function of a Poisson distribution with parameter $2\lambda$.
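Part (a) can also be checked directly on the PMFs (a sketch; the convolution of two independent Poisson($\lambda$) PMFs should match the Poisson($2\lambda$) PMF term by term):

```python
import math

def poisson_pmf(k: int, rate: float) -> float:
    """P(Poisson(rate) = k)."""
    return rate**k * math.exp(-rate) / math.factorial(k)

lam = 1.3  # arbitrary rate for the check

# P(Y_k = m) = sum_i P(X_k = i) P(X_{k+1} = m - i), by independence of X_k and X_{k+1}.
for m in range(10):
    conv = sum(poisson_pmf(i, lam) * poisson_pmf(m - i, lam) for i in range(m + 1))
    assert abs(conv - poisson_pmf(m, 2.0 * lam)) < 1e-12

print("convolution matches Poisson(2*lambda) for m = 0..9")
```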
(b) Proof. Given any $t_1, t_2, \ldots, t_n$ and $s \in \mathbb{Z}$,
$$F_{X,n}(x_1, t_1; x_2, t_2; \ldots; x_n, t_n) = F_{X_{t_1}}(x_1) F_{X_{t_2}}(x_2) \cdots F_{X_{t_n}}(x_n) = F_{X_{t_1+s}}(x_1) F_{X_{t_2+s}}(x_2) \cdots F_{X_{t_n+s}}(x_n) = F_{X,n}(x_1, t_1 + s; x_2, t_2 + s; \ldots; x_n, t_n + s)$$
where the first and the third equalities hold since $\{X_k\}$ are independent, and the second equality holds since $\{X_k\}$ are identically distributed.
(c) Yes.
Proof. Given any $t_1, t_2, \ldots, t_n$ and $s \in \mathbb{Z}$, there exists a function $G$ such that
$$(Y_{t_1}, \ldots, Y_{t_n}) = G(X_{t_1}, X_{t_1+1}, \ldots, X_{t_n}, X_{t_n+1})$$
and
$$(Y_{t_1+s}, \ldots, Y_{t_n+s}) = G(X_{t_1+s}, X_{t_1+s+1}, \ldots, X_{t_n+s}, X_{t_n+s+1})$$
Furthermore, since $X$ is stationary, $(X_{t_1}, X_{t_1+1}, \ldots, X_{t_n}, X_{t_n+1})$ and $(X_{t_1+s}, X_{t_1+s+1}, \ldots, X_{t_n+s}, X_{t_n+s+1})$ have the same distribution. Hence, $(Y_{t_1}, \ldots, Y_{t_n})$ and $(Y_{t_1+s}, \ldots, Y_{t_n+s})$ have the same distribution, which implies that $Y$ is also stationary.