
ECE 534: Random Processes

Spring 2016

Problem Set 4
Author: Xiaobin Gao

Due Date: 3/28

1. Problem 1
Proof. Let X be the random vector
\[
X = \begin{bmatrix} X_1 \\ X_2 \\ X_3 \\ X_4 \end{bmatrix}.
\]
Denote by \Phi_X(u) the characteristic function of X, where u = [u_1\ u_2\ u_3\ u_4]^T. Then,
\[
\Phi_X(u) = E[\exp(ju^T X)]
= E[\exp(j(u_1 X_1 + u_2 X_2 + u_3 X_3 + u_4 X_4))]
= \sum_{k=0}^{\infty} \frac{j^k}{k!}\, E\big[(u_1 X_1 + u_2 X_2 + u_3 X_3 + u_4 X_4)^k\big].
\]

It can be checked that
\[
\frac{\partial^4 \Phi_X(u)}{\partial u_1\, \partial u_2\, \partial u_3\, \partial u_4}\bigg|_{u_1,u_2,u_3,u_4=0} = E[X_1 X_2 X_3 X_4]. \tag{1}
\]

On the other hand, since X has a jointly Gaussian distribution with zero mean,
\[
\Phi_X(u) = \exp\Big({-\frac{1}{2}} u^T K u\Big),
\]
where K is the covariance matrix whose ij-th element, denoted by K_{ij}, satisfies K_{ij} = \mathrm{Cov}(X_i, X_j). Furthermore, K_{ij} = K_{ji}, and
\[
\mathrm{Cov}(X_i, X_j) = E[X_i X_j] - E[X_i]E[X_j] = E[X_i X_j].
\]
Hence,
\[
\frac{\partial^4 \Phi_X(u)}{\partial u_1\, \partial u_2\, \partial u_3\, \partial u_4}\bigg|_{u_1,u_2,u_3,u_4=0}
= \frac{\partial^4 \exp(-\frac{1}{2}u^T K u)}{\partial u_1\, \partial u_2\, \partial u_3\, \partial u_4}\bigg|_{u_1,u_2,u_3,u_4=0}
= K_{12}K_{34} + K_{13}K_{24} + K_{14}K_{23}
= E[X_1X_2]E[X_3X_4] + E[X_1X_3]E[X_2X_4] + E[X_1X_4]E[X_2X_3]. \tag{2}
\]
Combining (1) and (2), we have
\[
E[X_1X_2X_3X_4] = E[X_1X_2]E[X_3X_4] + E[X_1X_3]E[X_2X_4] + E[X_1X_4]E[X_2X_3].
\]

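This identity can be sanity-checked numerically (a minimal sketch, not part of the original solution): draw samples from an arbitrary zero-mean jointly Gaussian vector and compare both sides. The covariance matrix below is an arbitrary illustrative choice.

    import numpy as np

    rng = np.random.default_rng(0)
    # Arbitrary symmetric positive-definite covariance matrix (illustrative choice)
    A = rng.standard_normal((4, 4))
    K = A @ A.T + np.eye(4)
    X = rng.multivariate_normal(np.zeros(4), K, size=2_000_000)

    lhs = np.mean(X[:, 0] * X[:, 1] * X[:, 2] * X[:, 3])
    rhs = K[0, 1] * K[2, 3] + K[0, 2] * K[1, 3] + K[0, 3] * K[1, 2]
    print(lhs, rhs)  # the two values should agree up to Monte Carlo error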

2. Problem 2
(a) Since X and Y have a jointly Gaussian distribution, X + Y has a Gaussian distribution with
\[
\text{mean } [1\ 1]\begin{bmatrix}1\\1\end{bmatrix} = 2
\quad\text{and variance}\quad
[1\ 1]\begin{bmatrix}5 & -1\\ -1 & 1\end{bmatrix}\begin{bmatrix}1\\1\end{bmatrix} = 4.
\]
Hence,
\[
P(X + Y > 4) = Q\Big(\frac{4 - 2}{\sqrt{4}}\Big) = Q(1).
\]
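A quick Monte Carlo check of part (a), under the mean vector [1, 1]^T and covariance matrix [[5, -1], [-1, 1]] used above:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    mean = np.array([1.0, 1.0])
    K = np.array([[5.0, -1.0], [-1.0, 1.0]])
    XY = rng.multivariate_normal(mean, K, size=1_000_000)

    est = np.mean(XY.sum(axis=1) > 4)  # empirical P(X + Y > 4)
    print(est, norm.sf(1.0))           # Q(1) ~ 0.1587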

(b) Conditioned on the event Y = y, X has a Gaussian distribution with mean E[X|Y = y] and variance \mathrm{Cov}(e), where
\[
E[X|Y = y] = E[X] + \mathrm{Cov}(X, Y)\,\mathrm{Cov}(Y, Y)^{-1}(y - E[Y]) = 2 - y,
\]
\[
\mathrm{Cov}(e) = \mathrm{Cov}(X, X) - \mathrm{Cov}(X, Y)\,\mathrm{Cov}(Y, Y)^{-1}\,\mathrm{Cov}(Y, X) = 4.
\]
Hence,
\[
f_{X|Y}(x|y) = \frac{1}{\sqrt{8\pi}} \exp\Big({-\frac{(x + y - 2)^2}{8}}\Big).
\]
(c)
\[
E[X^2|Y] = \mathrm{Var}(X|Y) + (E[X|Y])^2 = 4 + (2 - Y)^2
\]
(d)
\[
P(X \ge 2 \mid Y = 2) = Q\Big(\frac{2 - (2 - y)}{\sqrt{4}}\Big)\bigg|_{y=2} = Q(1)
\]

(e)
\[
E[X^2 Y] = E\big[E[X^2 Y|Y]\big] = E\big[Y\, E[X^2|Y]\big] = E[4Y + (2 - Y)^2 Y]
= E[Y^3] - 4E[Y^2] + 8E[Y] = 4,
\]
where E[Y^3] can be obtained from the moment generating function of Y. Hence,
\[
\mathrm{Cov}(X^2, Y) = E[X^2 Y] - E[X^2]E[Y] = -2.
\]
Then,
\[
\hat{E}[X^2|Y] = E[X^2] + \mathrm{Cov}(X^2, Y)\,\mathrm{Cov}(Y, Y)^{-1}(Y - E[Y]) = 8 - 2Y.
\]
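The moments used in part (e) can likewise be spot-checked by simulation (a rough sketch, using the same assumed mean and covariance as in part (a)):

    import numpy as np

    rng = np.random.default_rng(0)
    XY = rng.multivariate_normal([1.0, 1.0],
                                 [[5.0, -1.0], [-1.0, 1.0]], size=2_000_000)
    X, Y = XY[:, 0], XY[:, 1]

    print(np.mean(X**2 * Y))                                # should be close to 4
    print(np.mean(X**2 * Y) - np.mean(X**2) * np.mean(Y))   # Cov(X^2, Y) ~ -2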
3. Problem 3
\[
\mu_X(t) = E[X_t] = E[A\cos(2\pi V t + \Theta)]
= E[A]\Big(E[\cos(2\pi V t)]E[\cos\Theta] - E[\sin(2\pi V t)]E[\sin\Theta]\Big) = 0
\]
\[
R_X(s, t) = E[X_s X_t] = E[A^2 \cos(2\pi V s + \Theta)\cos(2\pi V t + \Theta)]
= \frac{1}{2} E[A^2]\Big(E[\cos(2\pi V (s + t) + 2\Theta)] + E[\cos(2\pi V (s - t))]\Big)
= \frac{1}{2} E[A^2]\, E[\cos(2\pi V (s - t))]
= \begin{cases} \dfrac{2\sin(10\pi(s - t))}{5\pi(s - t)}, & \text{if } s \neq t, \\[4pt] 4, & \text{if } s = t. \end{cases}
\]
[Figure 1: Three sample paths of X.]

Since \mu_X(t) does not depend on t and R_X(s, t) depends on s and t only through s - t, X is WSS.
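The closed form above is consistent with, for example, \Theta uniform on [0, 2\pi], V uniform on [0, 5], and E[A^2] = 8, all mutually independent; those specific distributions are an assumption here, not stated in the solution. Under that assumption, a short simulation of R_X(s, t):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2_000_000
    A = np.sqrt(8.0) * np.ones(n)   # any A with E[A^2] = 8 works; constant is simplest
    V = rng.uniform(0.0, 5.0, n)    # assumed Uniform[0, 5]
    Th = rng.uniform(0.0, 2 * np.pi, n)

    def X(t):
        return A * np.cos(2 * np.pi * V * t + Th)

    s, t = 0.3, 0.72
    est = np.mean(X(s) * X(t))
    tau = s - t
    exact = 2 * np.sin(10 * np.pi * tau) / (5 * np.pi * tau)
    print(est, exact)   # both should be about 0.178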
4. Problem 4
(a) Since R and S have Rayleigh distributions, R \ge 0 and S \ge 0. Furthermore, X_t = R - St; hence the set of all possible sample paths of X is the set of all lines with non-negative y-intercept and non-positive slope. Fig. 1 depicts three sample paths of X.
(c) No. Since
\[
X_t - X_0 = -St = X_{2t} - X_t,
\]
X_t - X_0 and X_{2t} - X_t are not independent.
(d) Since P(S = 0) = 0, we can ignore the event that S = 0. Then the y-intercept of the line equals R and the x-intercept of the line equals R/S. Hence,
\[
E[A] = E\Big[\frac{1}{2}\,\frac{R^2}{S}\Big] = \frac{1}{2}\, E[R^2]\, E\Big[\frac{1}{S}\Big],
\]
where the second equality is due to the independence of R and S. Furthermore,
\[
E[R^2] = \mathrm{Var}(R) + (E[R])^2 = 2\sigma_R^2
\]
and
\[
E\Big[\frac{1}{S}\Big] = \int_0^{\infty} \frac{1}{s} \cdot \frac{s}{\sigma_S^2}\, e^{-s^2/(2\sigma_S^2)}\, ds = \sqrt{\frac{\pi}{2\sigma_S^2}}.
\]
Hence,
\[
E[A] = \sigma_R^2 \sqrt{\frac{\pi}{2\sigma_S^2}}.
\]

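As a numerical sanity check of part (d), take illustrative values \sigma_R = 1 and \sigma_S = 2 (the actual parameters are not restated in this section) and simulate the triangle area A = R^2/(2S):

    import numpy as np

    rng = np.random.default_rng(0)
    sigma_r, sigma_s = 1.0, 2.0   # assumed values for illustration only
    n = 2_000_000
    R = rng.rayleigh(scale=sigma_r, size=n)
    S = rng.rayleigh(scale=sigma_s, size=n)

    est = np.mean(R**2 / (2 * S))
    exact = sigma_r**2 * np.sqrt(np.pi / (2 * sigma_s**2))
    # Note: Var(1/S) is infinite, so the estimate converges slowly.
    print(est, exact)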
5. Problem 5
(a)
\[
P\Big(W_3 \ge \frac{W_2 + W_4}{2} + 1\Big) = P\Big(\frac{W_3 - W_2}{2} - \frac{W_4 - W_3}{2} \ge 1\Big)
\]


Since (W_t) is a standard Brownian motion, W_3 - W_2 and W_4 - W_3 are independent Gaussian random variables with mean 0 and variance 1. Hence, \frac{W_3 - W_2}{2} - \frac{W_4 - W_3}{2} has a Gaussian distribution with mean 0 and variance \frac{1}{2}. Then,
\[
P\Big(\frac{W_3 - W_2}{2} - \frac{W_4 - W_3}{2} \ge 1\Big) = Q\Big(\frac{1}{\sqrt{1/2}}\Big) = Q(\sqrt{2})
\]

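A brief simulation of the two independent increments confirms the value Q(\sqrt{2}) \approx 0.0786:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 1_000_000
    inc32 = rng.standard_normal(n)   # W3 - W2 ~ N(0, 1)
    inc43 = rng.standard_normal(n)   # W4 - W3 ~ N(0, 1), independent

    est = np.mean(inc32 / 2 - inc43 / 2 >= 1)
    print(est, norm.sf(np.sqrt(2)))  # Q(sqrt(2)) ~ 0.0786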
(b) Since W_t has Gaussian distribution N(0, t), W_t/\sqrt{t} has the standard normal distribution N(0, 1) for all t \ge 1. Then W_t^2/t has the chi-squared distribution with 1 degree of freedom, denoted \chi^2(1), for all t \ge 1. Hence,
\[
\frac{W_t^2}{t} \xrightarrow{d.} \chi^2(1).
\]

6. Problem 6
\[
\begin{aligned}
P(N_1 \ge 1, N_2 = 2) &= P(N_1 = 1, N_2 = 2) + P(N_1 = 2, N_2 = 2) \\
&= P(N_1 - N_0 = 1)P(N_2 - N_1 = 1) + P(N_1 - N_0 = 2)P(N_2 - N_1 = 0) \\
&= \lambda e^{-\lambda} \cdot \lambda e^{-\lambda} + \frac{\lambda^2}{2} e^{-\lambda} \cdot e^{-\lambda} \\
&= \frac{3\lambda^2}{2} e^{-2\lambda}
\end{aligned}
\]

Furthermore,
\[
P(N_1 \ge 1) = 1 - P(N_1 = 0) = 1 - P(N_1 - N_0 = 0) = 1 - e^{-\lambda}
\]
and
\[
P(N_2 = 2) = P(N_2 - N_0 = 2) = \frac{(2\lambda)^2}{2!} e^{-2\lambda} = 2\lambda^2 e^{-2\lambda}.
\]

(a)
\[
P(N_1 \ge 1 \mid N_2 = 2) = \frac{P(N_1 \ge 1, N_2 = 2)}{P(N_2 = 2)} = \frac{3}{4}
\]
(b)
\[
P(N_2 = 2 \mid N_1 \ge 1) = \frac{P(N_1 \ge 1, N_2 = 2)}{P(N_1 \ge 1)} = \frac{3\lambda^2 e^{-2\lambda}}{2(1 - e^{-\lambda})}
\]
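Both conditional probabilities are easy to spot-check by simulating the Poisson counts; the rate \lambda = 1.5 below is an arbitrary illustrative choice:

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 1.5                        # arbitrary rate for illustration
    n = 2_000_000
    N1 = rng.poisson(lam, n)         # N1 - N0
    N2 = N1 + rng.poisson(lam, n)    # N2 = N1 + (N2 - N1), independent increments

    print(np.mean(N1[N2 == 2] >= 1))   # should be close to 3/4 for any lambda
    print(np.mean(N2[N1 >= 1] == 2),
          3 * lam**2 * np.exp(-2 * lam) / (2 * (1 - np.exp(-lam))))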

7. Problem 7
(a) False. Since
\[
X_2 - X_{1.5} = 2\cos(\Theta) = X_1 - X_{0.5},
\]
X_2 - X_{1.5} and X_1 - X_{0.5} are not independent.
(b) True. Since (X_t) and (Y_t) are independent,
\[
\mu_Z(t) = E[Z_t] = E[X_t]E[Y_t] = \mu_X(t)\mu_Y(t)
\]
and
\[
R_Z(s, t) = E[Z_s Z_t] = E[X_s X_t]E[Y_s Y_t] = R_X(s, t)R_Y(s, t).
\]
Furthermore, since (X_t) and (Y_t) are WSS, \mu_X(t) and \mu_Y(t) are constant, and R_X(s, t), R_Y(s, t) depend on s and t only through s - t. Hence, \mu_Z(t) is constant and R_Z(s, t) depends on s and t only through s - t, which implies that (Z_t) is also WSS.
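To illustrate part (b) concretely (a hypothetical example, not from the original solution), take X_t and Y_t to be independent random-phase sinusoids, each WSS with R(\tau) = \frac{1}{2}\cos(\omega\tau), and compare R_Z with the product R_X R_Y:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2_000_000
    # Two independent random-phase sinusoids (each WSS)
    th1 = rng.uniform(0, 2 * np.pi, n)
    th2 = rng.uniform(0, 2 * np.pi, n)
    X = lambda t: np.cos(1.0 * t + th1)
    Y = lambda t: np.cos(2.5 * t + th2)

    s, t = 0.4, 1.1
    RZ = np.mean(X(s) * Y(s) * X(t) * Y(t))   # empirical R_Z(s, t)
    RX = 0.5 * np.cos(1.0 * (s - t))
    RY = 0.5 * np.cos(2.5 * (s - t))
    print(RZ, RX * RY)                        # R_Z = R_X * R_Y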


(c) False. A standard Brownian motion is a martingale (since P(X_0 = 0) = 1 and (X_t) has independent, mean-zero increments). However, a standard Brownian motion is not WSS:
\[
R_X(s, t) = \sigma^2 \min\{s, t\} \ne \sigma^2 \min\{s + k, t + k\} = R_X(s + k, t + k).
\]
8. Problem 8
(a) The characteristic function of Y_k, denoted by \Phi_{Y_k}(u), can be computed as
\[
\Phi_{Y_k}(u) = E[e^{juY_k}] = E[e^{ju(X_k + X_{k+1})}] = E[e^{juX_k}]\,E[e^{juX_{k+1}}]
= \exp(\lambda(e^{ju} - 1)) \exp(\lambda(e^{ju} - 1)) = \exp(2\lambda(e^{ju} - 1)),
\]
which is the characteristic function of the Poisson distribution with parameter 2\lambda.
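Equivalently, one can verify numerically that X_k + X_{k+1} has the Poisson(2\lambda) pmf; the rate \lambda = 0.7 below is an arbitrary illustrative choice:

    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(0)
    lam = 0.7                    # arbitrary rate for illustration
    n = 1_000_000
    Y = rng.poisson(lam, n) + rng.poisson(lam, n)   # X_k + X_{k+1}, independent

    for y in range(5):
        print(y, np.mean(Y == y), poisson.pmf(y, 2 * lam))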
(b) Proof. Given any t_1, t_2, \ldots, t_n and s \in \mathbb{Z},
\[
\begin{aligned}
F_{X,n}(x_1, t_1; x_2, t_2; \ldots; x_n, t_n)
&= F_{X_{t_1}}(x_1) F_{X_{t_2}}(x_2) \cdots F_{X_{t_n}}(x_n) \\
&= F_{X_{t_1+s}}(x_1) F_{X_{t_2+s}}(x_2) \cdots F_{X_{t_n+s}}(x_n) \\
&= F_{X,n}(x_1, t_1 + s; x_2, t_2 + s; \ldots; x_n, t_n + s),
\end{aligned}
\]
where the first and third equalities hold since the \{X_k\} are independent, and the second equality holds since the \{X_k\} are identically distributed.
(c) Yes.
Proof. Given any t_1, t_2, \ldots, t_n and s \in \mathbb{Z}, there exists a function G such that
\[
(Y_{t_1}, \ldots, Y_{t_n}) = G(X_{t_1}, X_{t_1+1}, \ldots, X_{t_n}, X_{t_n+1})
\]
and
\[
(Y_{t_1+s}, \ldots, Y_{t_n+s}) = G(X_{t_1+s}, X_{t_1+s+1}, \ldots, X_{t_n+s}, X_{t_n+s+1}).
\]
Furthermore, since X is stationary, (X_{t_1}, X_{t_1+1}, \ldots, X_{t_n}, X_{t_n+1}) and (X_{t_1+s}, X_{t_1+s+1}, \ldots, X_{t_n+s}, X_{t_n+s+1}) have the same distribution. Hence, (Y_{t_1}, \ldots, Y_{t_n}) and (Y_{t_1+s}, \ldots, Y_{t_n+s}) have the same distribution, which implies that Y is also stationary.
