
EE585 Random Processes in Physical Systems Winter 2009

Muhammad Tahir Feb 13, 2010


Homework 3 - Solution
Problem 1:
Due to the independence of $N$ and the $X_i$ we have $E[Y] = E[N]E[X] = \frac{1}{\lambda} \cdot \frac{1}{p}$. For the variance we have
\begin{align*}
\operatorname{var}(Y) &= E[\operatorname{var}(Y|N)] + \operatorname{var}(E[Y|N]) \\
&= E[N]\operatorname{var}(X) + (E[X])^2 \operatorname{var}(N) \\
&= \frac{1}{\lambda^2} \cdot \frac{1}{p} + \frac{1}{\lambda^2} \cdot \frac{1-p}{p^2} \\
&= \frac{1}{\lambda^2} \cdot \frac{1}{p^2}.
\end{align*}
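As a sanity check, these moments can be verified by Monte Carlo simulation. The sketch below assumes $X_i \sim \mathrm{Exp}(\lambda)$ and $N \sim \mathrm{Geometric}(p)$ on $\{1, 2, \ldots\}$, independent, consistent with the moment formulas above; the particular values of $\lambda$ and $p$ are arbitrary illustrative choices.

```python
import random

# Illustrative parameters (not from the problem statement)
lam, p = 2.0, 0.4
random.seed(0)

def sample_Y():
    """One draw of Y = X_1 + ... + X_N with N ~ Geometric(p) on {1,2,...}
    and X_i ~ Exponential(rate lam), independent of N."""
    n = 1
    while random.random() > p:
        n += 1
    return sum(random.expovariate(lam) for _ in range(n))

samples = [sample_Y() for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((y - mean) ** 2 for y in samples) / len(samples)

print(mean, 1 / (lam * p))      # empirical vs E[Y] = 1/(lam p)
print(var, 1 / (lam * p) ** 2)  # empirical vs var(Y) = 1/(lam p)^2
```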
Problem 2: (a) $\Phi_{X_{i,n}}(u) = E[e^{juX_{i,n}}] = 1 + \frac{\lambda}{n}(e^{ju} - 1)$, so
\[
\Phi_{Y_n}(u) = \left( 1 + \frac{\lambda}{n}(e^{ju} - 1) \right)^n.
\]
(b) Since $\lim_{n\to\infty} \left( 1 + \frac{x}{n} \right)^n = e^x$, it follows that $\lim_{n\to\infty} \Phi_{Y_n}(u) = e^{\lambda(e^{ju} - 1)}$. This limit, as a function of $u$, is the
characteristic function of a random variable $Y$ with the Poisson distribution with mean $\lambda$.
(c) Thus, $(Y_n)$ converges in distribution, and the limiting distribution is the Poisson distribution with mean $\lambda$.
There is not enough information given in the problem to determine whether $Y_n$ converges in any of the stronger
senses (p., m.s., or a.s.), because the given information only describes the distribution of $Y_n$ for each $n$ but
gives nothing about the joint distribution of the $Y_n$'s. Note that $Y_n$ has a binomial distribution for each $n$.
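The convergence in (b) is easy to see numerically: the characteristic function of $Y_n \sim \mathrm{Binomial}(n, \lambda/n)$ approaches the Poisson characteristic function as $n$ grows. A minimal sketch, with arbitrary illustrative values of $\lambda$ and $u$:

```python
import cmath

lam = 3.0  # arbitrary illustrative value of the Poisson mean

def phi_Yn(u, n):
    """Characteristic function of Y_n: (1 + (lam/n)(e^{ju} - 1))^n."""
    return (1 + (lam / n) * (cmath.exp(1j * u) - 1)) ** n

def phi_poisson(u):
    """Characteristic function of Poisson(lam): exp(lam (e^{ju} - 1))."""
    return cmath.exp(lam * (cmath.exp(1j * u) - 1))

for u in (0.5, 1.0, 2.0):
    for n in (10, 100, 10_000):
        print(f"u={u}, n={n}, |phi_Yn - phi_poisson| = "
              f"{abs(phi_Yn(u, n) - phi_poisson(u)):.2e}")
```

The gap shrinks roughly like $1/n$, consistent with the second-order term in $n \log(1 + x/n)$.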
Problem 4: Clearly $Z(t)$ is zero-mean since both $X(t)$ and $Y(t)$ are zero-mean. The correlation function is
given by
\begin{align*}
R_{ZZ}(t + \tau, t) &= 2E\Big[ \big( X(t+\tau)\cos(2\pi f_0 (t+\tau)) - Y(t+\tau)\sin(2\pi f_0 (t+\tau)) \big) \\
&\qquad\quad \times \big( X(t)\cos(2\pi f_0 t) - Y(t)\sin(2\pi f_0 t) \big) \Big] \\
&= E\Big[ X(t+\tau)X(t)\, 2\cos(2\pi f_0 (t+\tau))\cos(2\pi f_0 t) \\
&\qquad - X(t+\tau)Y(t)\, 2\cos(2\pi f_0 (t+\tau))\sin(2\pi f_0 t) \\
&\qquad - Y(t+\tau)X(t)\, 2\sin(2\pi f_0 (t+\tau))\cos(2\pi f_0 t) \\
&\qquad + Y(t+\tau)Y(t)\, 2\sin(2\pi f_0 (t+\tau))\sin(2\pi f_0 t) \Big] \\
&= R_{XX}(\tau)\big[ \cos(4\pi f_0 t + 2\pi f_0 \tau) + \cos(2\pi f_0 \tau) \big] \\
&\quad - R_{XY}(\tau)\big[ \sin(4\pi f_0 t + 2\pi f_0 \tau) - \sin(2\pi f_0 \tau) \big] \\
&\quad - R_{YX}(\tau)\big[ \sin(4\pi f_0 t + 2\pi f_0 \tau) + \sin(2\pi f_0 \tau) \big] \\
&\quad + R_{YY}(\tau)\big[ \cos(2\pi f_0 \tau) - \cos(4\pi f_0 t + 2\pi f_0 \tau) \big].
\end{align*}
To make $R_{ZZ}(t + \tau, t)$ independent of $t$ we need $R_{XY}(\tau) = -R_{YX}(\tau)$ and $R_{XX}(\tau) = R_{YY}(\tau)$: under these conditions the $\cos(4\pi f_0 t + 2\pi f_0 \tau)$ and $\sin(4\pi f_0 t + 2\pi f_0 \tau)$ terms cancel, leading to
\[
R_{ZZ}(\tau) = 2R_{XX}(\tau)\cos(2\pi f_0 \tau) + 2R_{XY}(\tau)\sin(2\pi f_0 \tau).
\]
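The $t$-independence under these conditions can be checked numerically by plugging sample correlation functions into the four-term expansion above. The functions chosen below are arbitrary illustrations satisfying $R_{XX} = R_{YY}$ and $R_{XY}(\tau) = -R_{YX}(\tau)$, not quantities given in the problem:

```python
import math

f0 = 1.5  # arbitrary carrier frequency

# Sample correlation functions satisfying the two required conditions
R_XX = R_YY = lambda tau: math.exp(-abs(tau))
R_XY = lambda tau: 0.3 * math.sin(tau) * math.exp(-abs(tau))
R_YX = lambda tau: -R_XY(tau)

def R_ZZ_full(t, tau):
    """The four-term expansion of R_ZZ(t + tau, t) before simplification."""
    a = 2 * math.pi * f0 * (t + tau)  # phase at time t + tau
    b = 2 * math.pi * f0 * t          # phase at time t
    return (R_XX(tau) * (math.cos(a + b) + math.cos(a - b))
            - R_XY(tau) * (math.sin(a + b) - math.sin(a - b))
            - R_YX(tau) * (math.sin(a + b) + math.sin(a - b))
            + R_YY(tau) * (math.cos(a - b) - math.cos(a + b)))

tau = 0.7
vals = [R_ZZ_full(t, tau) for t in (0.0, 0.3, 1.1, 5.2)]
simplified = (2 * R_XX(tau) * math.cos(2 * math.pi * f0 * tau)
              + 2 * R_XY(tau) * math.sin(2 * math.pi * f0 * tau))
print(vals)        # same value at every t: no dependence on t
print(simplified)  # matches the simplified formula
```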
Problem 3: (a) Since $A$ and $B$ are i.i.d. Gaussian, they are jointly Gaussian.
(b) For every $n$ and all $(t_1, \ldots, t_n)$ we construct the random vector $\mathbf{X}$ as
\[
\mathbf{X} = \begin{bmatrix} X(t_1) \\ X(t_2) \\ \vdots \\ X(t_n) \end{bmatrix}
= \begin{bmatrix}
\cos(2\pi f_0 t_1) & \sin(2\pi f_0 t_1) \\
\cos(2\pi f_0 t_2) & \sin(2\pi f_0 t_2) \\
\vdots & \vdots \\
\cos(2\pi f_0 t_n) & \sin(2\pi f_0 t_n)
\end{bmatrix}
\begin{bmatrix} A \\ B \end{bmatrix}.
\]
Since $\mathbf{X}$ is a linear transformation of the jointly Gaussian pair $[A\ B]^T$, the random vector $[X(t_1)\ X(t_2)\ \cdots\ X(t_n)]^T$ is jointly Gaussian for every $n$ and every choice of times, and hence $X(t)$ is a Gaussian process.
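As a numerical check of the covariance structure used in part (c), the sketch below assumes $A, B$ are i.i.d. $N(0, 1)$ (so that $R_N(\tau) = \cos(2\pi f_0 \tau)$, as derived there) and estimates $E[N(t+\tau)N(t)]$ by Monte Carlo; the values of $f_0$, $t$, and $\tau$ are arbitrary:

```python
import math
import random

random.seed(2)
f0, t, tau = 1.2, 0.4, 0.35  # arbitrary illustrative values
w = 2 * math.pi * f0

def N_proc(a, b, s):
    """N(s) = A cos(2 pi f0 s) + B sin(2 pi f0 s) for a given draw (a, b)."""
    return a * math.cos(w * s) + b * math.sin(w * s)

trials = 200_000
acc = 0.0
for _ in range(trials):
    a, b = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    acc += N_proc(a, b, t + tau) * N_proc(a, b, t)

est = acc / trials
print(est, math.cos(w * tau))  # empirical vs cos(2 pi f0 tau)
```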
(c) $V_1$ is a linear functional of a Gaussian process, hence it is Gaussian, and
\[
E[V_1] = \int_0^\infty e^{-t} E[N(t)]\,dt = 0,
\]
\begin{align*}
\operatorname{var}(V_1) = E[V_1^2] &= \int_0^\infty \int_0^\infty e^{-s} e^{-t} E[N(t)N(s)]\,dt\,ds \\
&= \int_0^\infty \int_0^\infty e^{-(t+s)} R_N(t-s)\,dt\,ds.
\end{align*}
Defining new variables $u = t - s$ and $v = t + s$, we have $2\,dt\,ds = du\,dv$ and
\begin{align*}
\operatorname{var}(V_1) &= \frac{1}{2} \int_{u=-\infty}^{\infty} \int_{v=|u|}^{\infty} e^{-v} R_N(u)\,dv\,du \\
&= \frac{1}{2} \int_{u=-\infty}^{\infty} e^{-|u|} R_N(u)\,du \\
&= \int_{u=0}^{\infty} e^{-u} R_N(u)\,du,
\end{align*}
where the last step uses the fact that $R_N$ is even.
To evaluate $R_N(u)$ we proceed as follows:
\[
R_N(\tau) = E[N(t+\tau)N(t)] = \cos(2\pi f_0 \tau).
\]
The above expression is obtained using the fact that $A$ and $B$ are independent with $E[A] = E[B] = 0$ and $E[A^2] = E[B^2] = 1$. Using the expression for $R_N(\tau)$ we
have
\begin{align*}
\operatorname{var}(V_1) &= \frac{1}{2} \int_{u=-\infty}^{\infty} e^{-|u|} R_N(u)\,du \\
&= \int_{u=-\infty}^{\infty} \frac{1}{2} e^{-|u|} \cos(2\pi f_0 u)\,du \\
&= \int_{f=-\infty}^{\infty} \frac{1}{1 + (2\pi f)^2} \cdot \frac{1}{2}\big(\delta(f - f_0) + \delta(f + f_0)\big)\,df \\
&= \frac{1}{1 + (2\pi f_0)^2},
\end{align*}
where the third equality is Parseval's relation, with $\frac{1}{2}e^{-|u|} \leftrightarrow \frac{1}{1+(2\pi f)^2}$ and $\cos(2\pi f_0 u) \leftrightarrow \frac{1}{2}\big(\delta(f - f_0) + \delta(f + f_0)\big)$ as Fourier transform pairs.
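A quick numerical cross-check of the single-integral form $\operatorname{var}(V_1) = \int_0^\infty e^{-u}\cos(2\pi f_0 u)\,du$ obtained earlier against this closed form, using a midpoint Riemann sum; the value of $f_0$ is an arbitrary illustration:

```python
import math

f0 = 0.8  # arbitrary illustrative value
a = 2 * math.pi * f0

# Midpoint Riemann sum on [0, T]; the tail beyond T contributes O(e^{-T})
h, T = 1e-3, 40.0
n = int(T / h)
integral = h * sum(math.exp(-(k + 0.5) * h) * math.cos(a * (k + 0.5) * h)
                   for k in range(n))

closed_form = 1 / (1 + a ** 2)
print(integral, closed_form)  # the two values agree closely
```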
Hence $V_1 \sim N\left(0, \frac{1}{1 + (2\pi f_0)^2}\right)$. Now for $V_2 = U V_1$ we have
\begin{align*}
P\{V_2 \le x\} &= \frac{1}{2} P\{V_1 \le x\} + \frac{1}{2} P\{-V_1 \le x\} \\
&= \frac{1}{2} P\{V_1 \le x\} + \frac{1}{2} P\{V_1 \ge -x\} \\
&= P\{V_1 \le x\},
\end{align*}
where the last equality holds because the distribution of $V_1$ is symmetric about zero.
Hence $V_2$ is also Gaussian, with $V_2 \sim N\left(0, \frac{1}{1 + (2\pi f_0)^2}\right)$. To determine whether $V_1$ and $V_2$ are jointly
Gaussian, consider the linear combination $Y = V_1 + V_2 = V_1(1 + U)$; the distribution of $Y$ is determined
as follows:
\begin{align*}
P\{Y \le y\} &= P\{V_1(1 + U) \le y\} \\
&= \frac{1}{2} P\{0 \le y\} + \frac{1}{2} P\{2V_1 \le y\} \\
&= \frac{1}{2} P\{0 \le y\} + \frac{1}{2} P\{V_1 \le y/2\}.
\end{align*}
Thus $Y$ has an atom of probability $1/2$ at $y = 0$, so $Y$ is not Gaussian. Since any linear combination of jointly Gaussian random variables must be Gaussian, $V_1$ and $V_2$ are not jointly Gaussian.
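The atom of $Y$ at zero is easy to see in simulation. The sketch below assumes $U = \pm 1$ with probability $1/2$ each, independent of $V_1$, and uses unit variance for $V_1$ for simplicity:

```python
import random

random.seed(1)
sigma = 1.0  # illustrative standard deviation for V1
trials = 100_000

ys = []
for _ in range(trials):
    v1 = random.gauss(0.0, sigma)
    u = random.choice((-1, 1))  # U = +/-1 equiprobable
    ys.append(v1 * (1 + u))     # Y = V1 + V2 = V1 (1 + U)

frac_zero = sum(1 for y in ys if y == 0.0) / trials
print(frac_zero)  # near 0.5: P{Y = 0} = 1/2, impossible for a Gaussian law
```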