
ECE 534: Random Processes

Spring 2016

Problem Set 1
Author: Xiaobin Gao

Due Date: 2/3

1. Problem 1
(a) Sample space Ω = {HH, HT, TH, TT}.
(b) Event space

    F = 2^Ω = { ∅, {HH}, {HT}, {TH}, {TT},
                {HH, HT}, {HH, TH}, {HH, TT}, {HT, TH}, {HT, TT}, {TH, TT},
                {HH, HT, TH}, {HH, HT, TT}, {HH, TH, TT}, {HT, TH, TT},
                {HH, HT, TH, TT} }.

(c) Probability measure:

    P(∅) = 0,
    P({HH}) = P(first toss is head) · P(second toss is head) = (3/4)(3/4) = 9/16, etc.
    P({HH, HT}) = P({HH}) + P({HT}) = 9/16 + 3/16 = 3/4, etc.
    P({HH, HT, TH}) = 1 − P({TT}) = 1 − 1/16 = 15/16, etc.
    P({HH, HT, TH, TT}) = 1.
Note that

    P(first toss is head)
      = P(first toss is head | fair coin is picked) · (1/2)
        + P(first toss is head | biased coin is picked) · (1/2)
      = (1/2)(1/2) + 1 · (1/2)
      = 3/4.

(d) Since the outcomes of the first and the second tosses are independent,
    P(first toss is head | second toss is tail) = P(first toss is head) = 3/4.
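As a sanity check, the mixture probability P(head) = 3/4 and the resulting outcome probabilities can be verified by exact arithmetic. This sketch uses the solution's own model: the biased coin lands heads with probability 1 (as the computation above implies), and the two tosses are treated as independent (as part (d) assumes):

```python
from fractions import Fraction as F

# P(head on one toss): total probability over the coin choice —
# fair coin (P(head) = 1/2) or biased coin (P(head) = 1), each picked w.p. 1/2.
p_head = F(1, 2) * F(1, 2) + F(1, 2) * 1

# Treating the two tosses as independent (as part (d) assumes),
# the probabilities of the four outcomes:
p = {
    "HH": p_head * p_head,
    "HT": p_head * (1 - p_head),
    "TH": (1 - p_head) * p_head,
    "TT": (1 - p_head) * (1 - p_head),
}
print(p_head, p["HH"], p["TT"])  # 3/4 9/16 1/16
```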

2. Problem 2
(a) We use a four-digit string, call it abcd, to represent the case that packets 1, 2, 3, 4 are
routed to ports a, b, c, d, respectively, where a, b, c, d ∈ {1, 2, . . . , 8}. Then the sample
space is

    Ω = {abcd | a, b, c, d ∈ {1, 2, . . . , 8}}.

The event space is the power set of the sample space, i.e., F = 2^Ω. For any event A ∈ F,

    P(A) = card(A) / 8^4,

where card(A) is the cardinality of A.


(b)

    P(X1 = 4) = 1/8^4, etc.
    P(X1 = 3, X2 = 1) = 4/8^4, etc.
    P(X1 = 2, X2 = 2) = 6/8^4, etc.
    P(X1 = 2, X2 = 1, X3 = 1) = 12/8^4, etc.
    P(X1 = 1, X2 = 1, X3 = 1, X4 = 1) = 24/8^4, etc.

Note that the joint pmf of X1, . . . , X8 follows a multinomial distribution.

(c) Define Xip, i ∈ {1, 2, . . . , 8}, p ∈ {1, 2, 3, 4}, to be binary random variables such that

    Xip = 1, if packet p is routed to port i,
          0, otherwise.

Then Xi = Σ_{p=1}^4 Xip, i ∈ {1, 2, . . . , 8}. Therefore,

    Cov(X1, X2) = Cov( Σ_{p=1}^4 X1p, Σ_{q=1}^4 X2q )
                = Σ_{p=1}^4 Σ_{q=1}^4 Cov(X1p, X2q).

When p ≠ q, X1p and X2q are independent; hence Cov(X1p, X2q) = 0. When p = q,
X1p and X2p cannot both be 1, i.e., X1p X2p = 0. Hence,

    Cov(X1p, X2p) = E[X1p X2p] − E[X1p] E[X2p] = 0 − (1/8)(1/8) = −1/64.

Then,

    Cov(X1, X2) = 4 · (−1/64) = −1/16.
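The covariance can be confirmed by brute force over all 8^4 equally likely routings (a sketch; Xi counts the packets routed to port i):

```python
from fractions import Fraction as F
from itertools import product

outcomes = list(product(range(1, 9), repeat=4))  # all 8^4 routings of packets 1..4

def mean(values):
    return F(sum(values), len(values))

# X1, X2 = number of packets routed to ports 1 and 2 in each outcome.
x1 = [sum(1 for a in w if a == 1) for w in outcomes]
x2 = [sum(1 for a in w if a == 2) for w in outcomes]

cov = mean([a * b for a, b in zip(x1, x2)]) - mean(x1) * mean(x2)
print(cov)  # -1/16
```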

(d)

    P(Xi ≤ 1 for all i) = (8 · 7 · 6 · 5) / 8^4 = 105/256.

(e)

    P(Xi ≤ 2 for all i) = [ C(8,4) · 4! + C(8,1) C(7,2) C(4,2) · 2! + C(8,2) C(4,2) ] / 8^4
                        = 483/512,

where the three terms count the load patterns (1,1,1,1), (2,1,1), and (2,2), respectively.
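Both counts can be checked by enumerating all routings (a sketch reusing the counting model above):

```python
from fractions import Fraction as F
from itertools import product
from collections import Counter

outcomes = list(product(range(1, 9), repeat=4))  # all 8^4 routings

def prob_max_load(limit):
    # Fraction of routings in which every port receives at most `limit` packets.
    good = sum(1 for w in outcomes if max(Counter(w).values()) <= limit)
    return F(good, len(outcomes))

print(prob_max_load(1), prob_max_load(2))  # 105/256 483/512
```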

3. Problem 3
(a) "All bytes are equally probable" is equivalent to "all bits are Bern(1/2), mutually
independent." Hence,

    P(A) = (1/2)(1/2) = 1/4.

(b) We calculate P(B) in an iterative way.

Initialization: P(b1 is odd) = P(b1 is even) = 1/2.

Iteration: when k > 1,

    P(Σ_{i=1}^k bi is odd)
      = P(Σ_{i=1}^k bi is odd | Σ_{i=1}^{k−1} bi is odd) · P(Σ_{i=1}^{k−1} bi is odd)
        + P(Σ_{i=1}^k bi is odd | Σ_{i=1}^{k−1} bi is even) · P(Σ_{i=1}^{k−1} bi is even)
      = P(bk is even) · P(Σ_{i=1}^{k−1} bi is odd) + P(bk is odd) · P(Σ_{i=1}^{k−1} bi is even)
      = (1/2)(1/2) + (1/2)(1/2)
      = 1/2.

Hence, P(B) = 1/2.
(c)

    P(B|A) = P( Σ_{i=1}^8 bi is odd | b1 = b2 = 1 ) = P( Σ_{i=3}^8 bi is odd ).

Similar to part (b), we have P(B|A) = 1/2.

(d)

    P(A|B) = P(B|A) P(A) / P(B) = (1/2)(1/4) / (1/2) = 1/4.
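All four answers can be verified by enumerating the 256 equally likely bytes (a sketch; b1 and b2 are taken as the first two listed bits, though by symmetry the choice does not matter):

```python
from fractions import Fraction as F
from itertools import product

bytes_ = list(product([0, 1], repeat=8))  # all 2^8 equally likely bytes

A = [b for b in bytes_ if b[0] == 1 and b[1] == 1]  # first two bits are 1
B = [b for b in bytes_ if sum(b) % 2 == 1]          # odd number of ones

p_A = F(len(A), 256)
p_B = F(len(B), 256)
p_A_and_B = F(sum(1 for b in A if sum(b) % 2 == 1), 256)

print(p_A, p_B, p_A_and_B / p_B)  # 1/4 1/2 1/4
```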

4. Problem 4
(a) P(X > 3) = 1 − P(X = 3) = 0.8, but

    P(X > 8 | X > 5) = P(X > 8, X > 5) / P(X > 5) = P(X > 8) / P(X > 5) = 0 ≠ P(X > 3).

Hence, the distribution of X does not have the memoryless property.

(b)

    P(Y > 3) = P(more than three shots are needed for a success)
             = P(the first three shots are missed) = (1 − p)^3.

    P(Y > 8 | Y > 5) = P(Y > 8, Y > 5) / P(Y > 5) = P(Y > 8) / P(Y > 5)
                     = (1 − p)^8 / (1 − p)^5 = (1 − p)^3 = P(Y > 3).

Hence, the distribution of Y, which is the geometric distribution, has the memoryless property.
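A quick numeric check of the memoryless identity using the closed-form tail probabilities (the value p = 0.3 is an arbitrary choice for this sketch):

```python
import math

p = 0.3  # arbitrary success probability, for illustration only

def tail(k):
    # P(Y > k) for geometric Y: the first k shots are all missed.
    return (1 - p) ** k

lhs = tail(8) / tail(5)  # P(Y > 8 | Y > 5)
rhs = tail(3)            # P(Y > 3)
print(math.isclose(lhs, rhs))  # True
```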

5. Problem 5
(a) E[W^2] = (E[W])^2 + Var(W).

(b) E[UVW + 3U^2] = E[U] E[V] E[W] + 3E[U^2],

where the first equality is due to the independence of U, V, W and the linearity of expectation.

(c)

    Cov(2 + U, UV) = Cov(2, UV) + Cov(U, UV)
                   = 0 + E[U^2 V] − E[U] E[UV]
                   = E[U^2] E[V] − (E[U])^2 E[V]
                   = Var(U) E[V] = 1/3.
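Part (c)'s chain gives Cov(2 + U, UV) = E[U^2]E[V] − (E[U])^2 E[V] = Var(U) E[V] for any independent U, V. This identity can be sanity-checked by simulation; the choice U, V ~ Unif[0, 1] below is an assumption of this sketch only (the problem's actual distributions give the value 1/3):

```python
import random

random.seed(0)
n = 200_000
u = [random.random() for _ in range(n)]  # assumed U ~ Unif[0, 1]
v = [random.random() for _ in range(n)]  # assumed V ~ Unif[0, 1]

def mean(xs):
    return sum(xs) / len(xs)

a = [2 + x for x in u]                    # samples of 2 + U
b = [x * y for x, y in zip(u, v)]         # samples of UV
cov = mean([x * y for x, y in zip(a, b)]) - mean(a) * mean(b)

# For Unif[0, 1]: Var(U) E[V] = (1/12)(1/2) = 1/24.
print(abs(cov - 1 / 24) < 0.01)  # True
```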


6. Problem 6
(a) The cdf of X, denoted by FX(x), is given as

    FX(x) = 0,    x < 0,
            0.5,  0 ≤ x < 1,
            x/2,  1 ≤ x < 2,
            1,    x ≥ 2.

Hence, P(X ≤ 0.8) = 0.5.

(b) The pdf of X, denoted by fX(x), can be expressed by

    fX(x) = (1/2) 1_{1<x<2} + (1/2) δ(x),

where 1_{·} is the indicator function and δ(x) is the Dirac delta function. Then,

    E[X] = 0 · (1/2) + ∫_1^2 x · (1/2) dx = 3/4,

    E[X^2] = 0 · (1/2) + ∫_1^2 x^2 · (1/2) dx = 7/6.

(c) Hence, Var(X) = E[X^2] − (E[X])^2 = 7/6 − 9/16 = 29/48.
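The moments follow from the atom at 0 (mass 1/2) plus the density 1/2 on (1, 2); exact arithmetic using the closed-form integral ∫_1^2 x^k dx = (2^{k+1} − 1)/(k + 1) confirms them (a sketch):

```python
from fractions import Fraction as F

def moment(k):
    # E[X^k] = 0^k * (1/2) + (1/2) * ∫_1^2 x^k dx;
    # the atom at 0 contributes nothing for k >= 1.
    integral = (F(2) ** (k + 1) - 1) / (k + 1)
    return F(1, 2) * integral

ex = moment(1)
ex2 = moment(2)
var = ex2 - ex ** 2
print(ex, ex2, var)  # 3/4 7/6 29/48
```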

7. Problem 7
(a) X1 has a Bernoulli distribution with parameter 1/6. Hence, E[X1] = 1/6,
Var(X1) = 5/36.

(b)

    E[X] = E[ Σ_{i=1}^n Xi ] = Σ_{i=1}^n E[Xi] = n/6,

    Var(X) = Cov(X, X) = Σ_{i=1}^n Σ_{j=1}^n Cov(Xi, Xj).

Note that when i ≠ j, Xi and Xj are independent; hence Cov(Xi, Xj) = 0. When i = j,
Cov(Xi, Xj) = Var(Xi) = 5/36. Hence,

    Var(X) = 5n/36.

Note that X has a Binomial distribution with parameters (n, 1/6).

(c) When i ≠ j, Xi and Yj are independent; hence Cov(Xi, Yj) = 0. When i = j,

    Cov(Xi, Yi) = E[Xi Yi] − E[Xi] E[Yi] = 0 − 1/36 = −1/36.

(Note that Xi and Yi cannot both be 1, i.e., Xi Yi = 0.)
(d)

    Cov(X, Y) = Σ_{i=1}^n Σ_{j=1}^n Cov(Xi, Yj) = −n/36,

    ρ_XY = Cov(X, Y) / sqrt(Var(X) Var(Y)) = (−n/36) / (5n/36) = −1/5.
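The single-roll covariance −1/36, and hence the scaled results above, can be checked by enumerating one fair die roll (a sketch; Xi and Yi indicate a 1 and a 2, respectively):

```python
from fractions import Fraction as F

faces = range(1, 7)
# Indicators for a single fair roll: Xi = 1 if a one shows, Yi = 1 if a two shows.
e_x = F(sum(1 for f in faces if f == 1), 6)
e_y = F(sum(1 for f in faces if f == 2), 6)
e_xy = F(sum(1 for f in faces if f == 1 and f == 2), 6)  # impossible: always 0

cov_single = e_xy - e_x * e_y
print(cov_single)  # -1/36

n = 10  # any number of rolls; distinct rolls are independent, so covariances add
cov_xy = n * cov_single
rho = cov_xy / (F(5, 36) * n)  # Var(X) = Var(Y) = 5n/36
print(rho)  # -1/5
```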


(e) Let Zi, Oi, Pi, Qi be binary random variables indicating that a 3, 4, 5, 6, respectively,
shows on the i-th roll. Define Z, O, P, Q as the sums of the Zi, Oi, Pi, Qi, respectively. It
is clear that X + Y + Z + O + P + Q = n; hence

    E[Y + Z + O + P + Q | X = x] = n − x.

By symmetry, the conditional distribution of Y given X = x is the same as the conditional
distribution of Z given X = x, etc. Then,

    E[Y | X = x] = E[Z | X = x] = E[O | X = x] = E[P | X = x] = E[Q | X = x],

which implies that E[Y | X = x] = (n − x)/5.
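For a small n this conditional expectation can be verified exhaustively (a sketch with n = 3, enumerating all 6^3 equally likely rolls):

```python
from fractions import Fraction as F
from itertools import product

n = 3
rolls = list(product(range(1, 7), repeat=n))  # all equally likely outcomes

for x in range(n + 1):
    # Outcomes with exactly x ones; Y counts the twos among them.
    matching = [w for w in rolls if w.count(1) == x]
    e_y_given_x = F(sum(w.count(2) for w in matching), len(matching))
    assert e_y_given_x == F(n - x, 5)
print("E[Y | X = x] = (n - x)/5 verified for n =", n)
```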

8. Problem 8
(a)

    P(X ≤ 1, Y ≤ 2) = FXY(1, 2) = 1 + (1/4)(e^{−7} − e^{−1} − e^{−6}).

(b)

    FX(x) = lim_{y→∞} FXY(x, y) = (1 − (1/4) e^{−x}) 1_{x≥0},

    FY(y) = lim_{x→∞} FXY(x, y) = (1 − (1/4) e^{−3y}) 1_{y≥0}.

(c) FXY(x, y) ≠ FX(x) FY(y). Hence, X and Y are not independent.
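Assuming the joint cdf from the problem statement is FXY(x, y) = 1 − (1/4)e^{−x} − (1/4)e^{−3y} + (1/4)e^{−(x+3y)} for x, y ≥ 0 (an assumption of this sketch, chosen to be consistent with the marginals above), the numbers check out:

```python
import math

def F_xy(x, y):
    # Assumed joint cdf, valid for x, y >= 0.
    return (1 - 0.25 * math.exp(-x) - 0.25 * math.exp(-3 * y)
            + 0.25 * math.exp(-(x + 3 * y)))

def F_x(x):
    return 1 - 0.25 * math.exp(-x)

def F_y(y):
    return 1 - 0.25 * math.exp(-3 * y)

part_a = 1 + 0.25 * (math.exp(-7) - math.exp(-1) - math.exp(-6))
print(math.isclose(F_xy(1, 2), part_a))           # True: matches part (a)
print(math.isclose(F_xy(1, 2), F_x(1) * F_y(2)))  # False: not independent
```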
9. Problem 9
(a) The joint pdf of (X, Y), denoted by fXY(x, y), is described as follows:

    fXY(x, y) = 1/(π a^2),  x^2 + y^2 ≤ a^2,
                0,          otherwise.

When 0 ≤ r < a,

    P(R ≤ r) = P(X^2 + Y^2 ≤ r^2) = ∫∫_{x^2+y^2≤r^2} fXY(x, y) dx dy = r^2/a^2.

Hence,

    FR(r) = 0,        r < 0,
            r^2/a^2,  0 ≤ r < a,
            1,        r ≥ a.
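The cdf FR(r) = r^2/a^2 can be checked by simulation (a sketch; the value a = 1, the test point r = a/2, and rejection sampling of a uniform point in the disk are choices of this example):

```python
import random

random.seed(1)
a = 1.0
n = 100_000

def sample_r():
    # Rejection-sample a uniform point in the disk of radius a; return its radius.
    while True:
        x = random.uniform(-a, a)
        y = random.uniform(-a, a)
        if x * x + y * y <= a * a:
            return (x * x + y * y) ** 0.5

rs = [sample_r() for _ in range(n)]
r0 = 0.5
empirical = sum(1 for r in rs if r <= r0) / n
print(abs(empirical - (r0 / a) ** 2) < 0.01)  # True
```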

(b)

    fR(r) = d/dr FR(r) = 2r/a^2,  0 < r < a,
            0,                    otherwise.

10. Problem 10


(a)

    Φ_{X1,X2,X3}(u1, u2, u3) = exp( j Σ_{k=1}^3 k uk − Σ_{k=1}^3 k^2 uk^2 )
                             = exp(j u1 − u1^2) · exp(2j u2 − 4 u2^2) · exp(3j u3 − 9 u3^2).

Since the joint characteristic function factors, X1, X2, X3 are mutually independent, and
the characteristic function of X2 is given by

    Φ_{X2}(u2) = exp(2j u2 − 4 u2^2),

which is the characteristic function of the Gaussian distribution N(2, 8).
(b) By part (a), E[X2 ] = 2, Var(X2 ) = 8.
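A Monte Carlo check that exp(2ju − 4u^2) is indeed the characteristic function of N(2, 8), i.e. that E[exp(juX)] matches the closed form for X ~ N(2, 8) (a sketch; the test point u = 0.3 and sample size are arbitrary):

```python
import cmath
import random

random.seed(2)
u = 0.3
n = 200_000

# Empirical E[exp(juX)] for X ~ N(2, 8): mean 2, standard deviation sqrt(8).
samples = (random.gauss(2, 8 ** 0.5) for _ in range(n))
empirical = sum(cmath.exp(1j * u * x) for x in samples) / n

closed_form = cmath.exp(2j * u - 4 * u ** 2)
print(abs(empirical - closed_form) < 0.02)  # True
```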