
Assignment 7

Q.1: Let fX(x) be given as

    fX(x) = K e^(−λᵀx) u(x)                                            (1)

where λ = (λ₁, ..., λₙ)ᵀ with λᵢ > 0 for all i, x = (x₁, ..., xₙ)ᵀ, u(x) = 1 if xᵢ ≥ 0 for i = 1, ..., n and zero otherwise, and K is a constant to be determined. What value of K will enable fX(x) to be a pdf? [S & W, 5.1]
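
Because the density factors coordinate-wise and each factor integrates to 1/λᵢ over [0, ∞), the constant should come out to K = λ₁···λₙ. A quick numerical sanity check of that claim (the rates below are illustrative, not part of the problem):

    import numpy as np
    from scipy import integrate

    lam = np.array([1.0, 2.5, 0.5])   # illustrative rates, each lambda_i > 0
    K = np.prod(lam)                  # candidate normalizing constant

    # The pdf factors across coordinates, so the n-dimensional integral
    # over the positive orthant is a product of one-dimensional integrals.
    one_dim = [integrate.quad(lambda x, l=l: np.exp(-l * x), 0, np.inf)[0] for l in lam]
    print(K * np.prod(one_dim))       # ~1.0, confirming K = prod(lambda_i)
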
Q.2: Show that the two random variables X₁ and X₂ with joint pdf

    f_{X₁X₂}(x₁, x₂) = { 1/16   |x₁| < 4, 2 < x₂ < 4                   (2)
                       { 0      otherwise

are independent and orthogonal [S & W, 5.5].
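
Orthogonality requires E[X₁X₂] = 0, and since the support is symmetric in x₁, E[X₁] = 0, so here orthogonality and uncorrelatedness coincide. A numerical check of the cross moment (a sketch, not the requested proof):

    from scipy import integrate

    # E[X1*X2]: integrate x1*x2*(1/16) over |x1| < 4, 2 < x2 < 4.
    # dblquad integrates func(y, x), with y as the inner variable.
    m, _ = integrate.dblquad(lambda x2, x1: x1 * x2 / 16.0,
                             -4, 4, lambda x1: 2, lambda x1: 4)
    print(m)  # ~0.0
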


Q.3: Let Xᵢ, i = 1, ..., n be n mutually uncorrelated random vectors with means μᵢ = E[Xᵢ]. Show that [S & W, 5.7]

    E[ ‖Σᵢ₌₁ⁿ (Xᵢ − μᵢ)‖² ] = Σᵢ₌₁ⁿ E[ ‖Xᵢ − μᵢ‖² ]                    (3)
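
Expanding the squared norm turns the left side into a double sum whose cross terms E[(Xᵢ − μᵢ)ᵀ(Xⱼ − μⱼ)], i ≠ j, vanish by mutual uncorrelatedness. A Monte Carlo illustration with independent (hence uncorrelated) zero-mean vectors of arbitrarily chosen sizes:

    import numpy as np

    rng = np.random.default_rng(1)
    trials, n, d = 200_000, 4, 3               # illustrative sizes
    scales = np.array([1.0, 2.0, 0.5, 3.0])    # different spread per vector
    X = rng.normal(size=(trials, n, d)) * scales[None, :, None]

    lhs = (X.sum(axis=1) ** 2).sum(axis=1).mean()   # E[ ||sum_i X_i||^2 ]
    rhs = sum((X[:, i, :] ** 2).sum(axis=1).mean() for i in range(n))
    print(lhs, rhs)                                 # agree to MC accuracy
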

Q.4: Let Xᵢ, i = 1, ..., n be n mutually uncorrelated random vectors with E[Xᵢ] = μᵢ, i = 1, ..., n. Show that

    E[ (Σᵢ₌₁ⁿ (Xᵢ − μᵢ)) (Σⱼ₌₁ⁿ (Xⱼ − μⱼ))ᵀ ] = Σᵢ₌₁ⁿ Kᵢ               (4)

where Kᵢ = E[(Xᵢ − μᵢ)(Xᵢ − μᵢ)ᵀ]. [S & W, 5.8]
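
This is the matrix analogue of Q.3: when all cross-covariances vanish, the covariance of the sum is the sum of the covariances. A numpy sketch comparing the empirical covariance of a sum of independent zero-mean vectors against Σᵢ Kᵢ (the Kᵢ below are arbitrary valid covariances):

    import numpy as np

    rng = np.random.default_rng(2)
    d, trials = 3, 500_000
    Bs = [rng.normal(size=(d, d)) for _ in range(3)]
    Ks = [B @ B.T for B in Bs]                  # K_i = B_i B_i^T is PSD
    S = sum(rng.normal(size=(trials, d)) @ B.T for B in Bs)   # sum_i X_i

    emp = S.T @ S / trials                      # empirical E[S S^T]; means are zero
    print(np.round(emp - sum(Ks), 2))           # ~ zero matrix
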

Q.5: Explain why none of the following matrices can be covariance matrices associated with real random vectors [S & W, 5.9]

    (a) | 2  4  0 |   (b) | 4  0  0 |   (c) | 6    1+j  2 |   (d) | 4   6   2 |
        | 4  3  1 |       | 0  6  0 |       | 1−j  5    1 |       | 6   9   3 |
        | 0  1  2 |       | 0  0 −2 |       | 2    1    6 |       | 9  12  16 |
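
A covariance matrix of a real random vector must be real, symmetric, and positive semidefinite. A small checker sketch that reports which property fails for each matrix (entries as reconstructed above):

    import numpy as np

    def objection(K):
        """Return the first covariance-matrix property that K violates."""
        K = np.asarray(K)
        if np.iscomplexobj(K):
            return "complex entries (real vectors have real covariances)"
        if not np.allclose(K, K.T):
            return "not symmetric"
        w = np.linalg.eigvalsh(K)      # eigenvalues of the symmetric matrix
        if w.min() < 0:
            return "not positive semidefinite (min eigenvalue %.2f)" % w.min()
        return "no objection"

    mats = {"a": [[2, 4, 0], [4, 3, 1], [0, 1, 2]],
            "b": [[4, 0, 0], [0, 6, 0], [0, 0, -2]],
            "c": [[6, 1 + 1j, 2], [1 - 1j, 5, 1], [2, 1, 6]],
            "d": [[4, 6, 2], [6, 9, 3], [9, 12, 16]]}
    for name, K in mats.items():
        print("(%s):" % name, objection(K))
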
Q.6: Let X = (X₁, X₂, X₃)ᵀ be a random vector with mean μ = E[X] = (5, 5, 6)ᵀ and covariance given by

        | 5  2  1 |
    K = | 2  5  0 |                                                    (5)
        | 1  0  4 |

Calculate the mean and variance of Y = AᵀX + B, where A = (2, 1, 2)ᵀ and B = 5 [S & W, 5.15].
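
For a linear form Y = AᵀX + B, the identities of Q.7 below give E[Y] = Aᵀμ + B and Var[Y] = AᵀKA; a quick numpy check using the values as printed above:

    import numpy as np

    mu = np.array([5, 5, 6])
    K = np.array([[5, 2, 1], [2, 5, 0], [1, 0, 4]])
    A = np.array([2, 1, 2])
    B = 5

    print("E[Y]   =", A @ mu + B)   # 32
    print("Var[Y] =", A @ K @ A)    # 57
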
Q.7: Show that if X = (X₁, ..., Xₙ)ᵀ has mean μ = (μ₁, ..., μₙ)ᵀ and covariance K = {Kᵢⱼ}, an n × n matrix, then the scalar random variable Y, given by Y = p₁X₁ + ... + pₙXₙ, has mean and variance [S & W, 5.17]

    E[Y] = Σᵢ₌₁ⁿ pᵢμᵢ                                                  (6)

    σ_Y² = Σᵢ₌₁ⁿ Σⱼ₌₁ⁿ pᵢpⱼKᵢⱼ                                         (7)
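
A Monte Carlo cross-check of (6) and (7) for an arbitrary weight vector p, mean μ, and covariance K (all values illustrative):

    import numpy as np

    rng = np.random.default_rng(6)
    mu = np.array([1.0, -2.0, 0.5])
    B = rng.normal(size=(3, 3))
    K = B @ B.T                          # a valid covariance matrix
    p = np.array([0.3, -1.0, 2.0])

    X = rng.multivariate_normal(mu, K, size=500_000)
    Y = X @ p
    print(Y.mean(), p @ mu)              # E[Y]   vs eq. (6)
    print(Y.var(), p @ K @ p)            # Var[Y] vs eq. (7)
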

Q.8: Let X = (X₁, ..., X₄)ᵀ be a Gaussian random vector with E[X] = 0. Show that [S & W, 5.20]

    E[X₁X₂X₃X₄] = K₁₂K₃₄ + K₁₃K₂₄ + K₁₄K₂₃                             (8)
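
Equation (8) is the zero-mean Gaussian fourth-moment (Isserlis) identity. A Monte Carlo sketch with an arbitrary valid covariance (built as K = BBᵀ so it is automatically positive semidefinite):

    import numpy as np

    rng = np.random.default_rng(3)
    B = rng.normal(size=(4, 4))
    K = B @ B.T
    X = rng.multivariate_normal(np.zeros(4), K, size=1_000_000)

    lhs = (X[:, 0] * X[:, 1] * X[:, 2] * X[:, 3]).mean()
    rhs = K[0, 1] * K[2, 3] + K[0, 2] * K[1, 3] + K[0, 3] * K[1, 2]
    print(lhs, rhs)    # agree to MC accuracy
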

Q.9: Let the joint pdf of X₁, X₂, X₃ be given by fX = (2/3)(x₁ + x₂ + x₃) over the region S = {(x₁, x₂, x₃) : 0 < xᵢ ≤ 1, i = 1, 2, 3} and zero elsewhere. Compute the covariance matrix and show that the random variables X₁, X₂, X₃, although not independent, are essentially uncorrelated. [S & W, 5.21]
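
Every entry of the covariance matrix is a polynomial integral over the unit cube, so the whole matrix can be computed symbolically. A sympy sketch (the tiny off-diagonal entries it prints are what "essentially uncorrelated" refers to):

    import sympy as sp

    x1, x2, x3 = sp.symbols('x1 x2 x3', positive=True)
    v = [x1, x2, x3]
    f = sp.Rational(2, 3) * (x1 + x2 + x3)
    cube = [(s, 0, 1) for s in v]

    def E(g):                       # expectation under f over the unit cube
        return sp.integrate(g * f, *cube)

    m = [E(s) for s in v]
    C = sp.Matrix(3, 3, lambda i, j: E(v[i] * v[j]) - m[i] * m[j])
    print(C)                        # off-diagonal entries are -1/324
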
Q.10: The random variables X and Y have the joint pdf

    f_{X,Y}(x, y) = { 2(y + x)   0 ≤ x ≤ y ≤ 1                         (9)
                    { 0          otherwise

(a) What is f_{X|Y}(x|y), the conditional PDF of X given Y = y? (b) What is x̂_M(y), the minimum mean square error estimate of X given Y = y? (c) What is f_{Y|X}(y|x), the conditional PDF of Y given X = x? (d) What is ŷ_M(x), the minimum mean square error estimate of Y given X = x? [Y & G, Q-9.1]
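
For parts (a) and (b), the minimum mean square error estimate given Y = y is the conditional mean E[X | Y = y]. A sympy sketch of that computation (a check on the algebra, not the requested derivation):

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f = 2 * (y + x)                     # joint pdf on 0 <= x <= y <= 1

    fY = sp.integrate(f, (x, 0, y))     # marginal of Y: 3*y**2
    f_cond = f / fY                     # f_{X|Y}(x|y) for 0 <= x <= y
    x_hat = sp.simplify(sp.integrate(x * f_cond, (x, 0, y)))
    print(x_hat)                        # E[X | Y = y] for 0 < y <= 1
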
Q.11: A telemetry signal T, transmitted from a temperature sensor on a communications satellite, is a Gaussian random variable with E[T] = 0 and Var[T] = 9. The receiver at mission control receives R = T + X, where X is a noise voltage independent of T with PDF

    fX(x) = { 1/6   −3 ≤ x ≤ 3                                         (10)
            { 0     otherwise

The receiver uses R to calculate a linear estimate of the telemetry voltage: t̂_L(r) = ar + b. (a) What is E[R], the expected value of the received voltage? (b) What is Var[R], the variance of the received voltage? (c) What is Cov[T, R], the covariance of the transmitted voltage and the received voltage? (d) What is the correlation coefficient ρ_{T,R} of T and R? (e) What are a* and b*, the optimum mean square values of a and b in the linear estimator? (f) What is e*_L, the minimum mean square error of the linear estimate? [Y & G, Q-9.2]
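
The optimal linear estimator satisfies a* = Cov[T, R]/Var[R] and b* = E[T] − a*·E[R], with minimum error e*_L = Var[T] − a*·Cov[T, R]. A Monte Carlo sketch comparing those formulas with an empirical fit (noise drawn uniform on [−3, 3], matching (10)):

    import numpy as np

    rng = np.random.default_rng(4)
    N = 1_000_000
    T = rng.normal(0, 3, N)             # Var[T] = 9
    R = T + rng.uniform(-3, 3, N)       # received voltage

    a = np.cov(T, R)[0, 1] / R.var()    # a* = Cov[T,R]/Var[R]
    b = T.mean() - a * R.mean()         # b* (~0 here, since E[T] = E[R] = 0)
    mse = ((T - (a * R + b)) ** 2).mean()
    print(a, b, mse)
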
Q.12: A receiver at a radial distance R from a radio beacon measures the beacon power to be X = Y − 40 − 40 log₁₀ R dB, where Y, called the shadow fading factor, is a Gaussian (0, 8) random variable that is independent of R. When the receiver is equally likely to be at any point within a 1000 m radius circle around the beacon, the distance R has PDF

    fR(r) = { 2r/10⁶   0 ≤ r ≤ 1000                                    (11)
            { 0        otherwise

Find the maximum likelihood (ML) and maximum a posteriori (MAP) estimates of R given the observation X = x. [Y & G, Q-9.3]
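
The ML estimate maximizes the likelihood f_{X|R}(x|r) alone, while the MAP estimate maximizes f_{X|R}(x|r)·fR(r); given R = r, X is Gaussian with mean −40 − 40 log₁₀ r and standard deviation 8 (reading "Gaussian (0, 8)" as Y & G's (mean, standard deviation) convention). A grid-search sketch for one illustrative observation:

    import numpy as np

    x_obs = -120.0                                  # illustrative observed power
    r = np.linspace(1.0, 1000.0, 100_000)

    mean = -40 - 40 * np.log10(r)                   # E[X | R = r]
    log_like = -0.5 * ((x_obs - mean) / 8.0) ** 2   # Gaussian log-likelihood + const
    log_prior = np.log(2 * r / 1e6)                 # log f_R(r) on (0, 1000]

    print("ML :", r[np.argmax(log_like)])           # ~ 10**(-(x_obs + 40)/40)
    print("MAP:", r[np.argmax(log_like + log_prior)])
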
Q.13: X and Y have the joint PDF

    f_{X,Y}(x, y) = { 6(y − x)   0 ≤ x ≤ y ≤ 1                         (12)
                    { 0          otherwise

(a) What is fX(x)? (b) What is the blind estimate x̂_B? (c) What is the minimum mean square error estimate of X given X < 0.5? (d) What is fY(y)? (e) What is the blind estimate ŷ_B? (f) What is the minimum mean square error estimate of Y given Y > 0.5? [Y & G, 9.1.2]
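
Parts (b) and (c) condition on an event rather than on an observed value: x̂_B = E[X], and the estimate given X < 0.5 is E[X | X < 0.5] = (∫₀^0.5 x fX(x) dx) / P[X < 0.5]. A sympy sketch using the marginal implied by (12):

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f = 6 * (y - x)                             # joint pdf on 0 <= x <= y <= 1

    fX = sp.integrate(f, (y, x, 1))             # marginal of X: 3*(1 - x)**2
    x_blind = sp.integrate(x * fX, (x, 0, 1))   # blind estimate E[X]
    h = sp.Rational(1, 2)
    x_cond = sp.integrate(x * fX, (x, 0, h)) / sp.integrate(fX, (x, 0, h))
    print(x_blind, x_cond)                      # 1/4 and 11/56
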


Q.14: X and Y have the joint PDF

    f_{X,Y}(x, y) = { 6(y − x)   0 ≤ x ≤ y ≤ 1                         (13)
                    { 0          otherwise

(a) What is f_{X|Y}(x|y)? (b) What is x̂_M(y), the minimum mean square error estimate of X given Y = y? (c) What is f_{Y|X}(y|x)? (d) What is ŷ_M(x), the minimum mean square error estimate of Y given X = x? [Y & G, 9.1.4]
Q.15: X and Y have the joint PDF

    f_{X,Y}(x, y) = { 2   0 ≤ x ≤ y ≤ 1                                (14)
                    { 0   otherwise

(a) What is f_{X|Y}(x|y)? (b) What is x̂_M(y), the minimum mean square error estimate of X given Y = y? (c) What is ê(0.5) = E[(X − x̂_M(0.5))² | Y = 0.5], the mean square error of this estimate given Y = 0.5? [Y & G, 9.1.5]
Q.16: The random variables X and Y have the joint PDF

    f_{X,Y}(x, y) = { 2(y + x)   0 ≤ x ≤ y ≤ 1                         (15)
                    { 0          otherwise

(a) What is X̂_L(y), the linear minimum mean square error estimate of X given Y? [Y & G, 9.2.5]
Q.17: The following table (Table 1) gives P_{X,Y}(x, y), the joint probability mass function (PMF) of random variables X and Y.

    Table 1: Joint PMF P_{X,Y}(x, y)

    P_{X,Y}(x, y) | y = −3 | y = −1 | y = 1 | y = 3
    x = −1        |  1/6   |  1/8   |  1/24 |  0
    x = 0         |  1/12  |  1/12  |  1/12 |  1/12
    x = 1         |  0     |  1/24  |  1/8  |  1/6

(a) Find the marginal PMFs PX(x) and PY(y). (b) Are X and Y independent? (c) Find E[X], Var[X], E[Y], Var[Y], and Cov[X, Y]. (d) Let X̂(Y) = aY + b be a linear estimator of X. Find a* and b*, the values of a and b that minimize the mean square error e_L. (e) What is e*_L, the minimum mean square error of the optimum linear estimate? (f) Find P_{X|Y}(x|−3), the conditional PMF of X given Y = −3. (g) Find x̂_M(−3), the optimum (nonlinear) minimum mean square error estimate of X given Y = −3. (h) What is

    ê(−3) = E[(X − x̂_M(−3))² | Y = −3]                                 (16)

the mean square error of this estimate? [Y & G, 9.2.1]
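
Parts (a)–(e) are finite sums over the table. A numpy sketch that computes the marginals, the moments, and the optimal linear coefficients a* = Cov[X, Y]/Var[Y], b* = E[X] − a*·E[Y] from the PMF as reconstructed above:

    import numpy as np

    xs = np.array([-1, 0, 1])
    ys = np.array([-3, -1, 1, 3])
    P = np.array([[1/6,  1/8,  1/24, 0],     # rows x = -1, 0, 1
                  [1/12, 1/12, 1/12, 1/12],
                  [0,    1/24, 1/8,  1/6]])

    PX, PY = P.sum(axis=1), P.sum(axis=0)    # marginal PMFs
    EX, EY = xs @ PX, ys @ PY
    cov = xs @ P @ ys - EX * EY              # E[XY] - E[X]E[Y]
    a = cov / (ys**2 @ PY - EY**2)
    b = EX - a * EY
    print(PX, PY, EX, EY, cov, a, b)
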


Q.18: Let R be an exponential random variable with expected value 1/μ. If R = r, then over an interval of length T the number of phone calls N that arrive at a telephone switch has a Poisson PMF with expected value rT. (a) Find the MMSE estimate of R given N. (b) Find the MAP estimate of R given N. (c) Find the ML estimate of R given N. [Y & G, 9.3.3]
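
Under this model the posterior of R given N = n is proportional to rⁿ e^(−(μ+T)r), so the MMSE estimate is the posterior mean and the MAP estimate is the posterior mode. A sympy sketch for one concrete count (n = 4 here purely for illustration):

    import sympy as sp

    r, mu, T = sp.symbols('r mu T', positive=True)
    n_obs = 4
    post = r**n_obs * sp.exp(-(mu + T) * r)    # unnormalized posterior in r

    Z = sp.integrate(post, (r, 0, sp.oo))
    mmse = sp.simplify(sp.integrate(r * post, (r, 0, sp.oo)) / Z)
    map_est = sp.solve(sp.diff(sp.log(post), r), r)[0]
    print(mmse, map_est)                       # 5/(mu + T) and 4/(mu + T)
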
Q.19: For a certain coin, Q is a uniform (0, 1) random variable. Given Q = q, each flip is heads with probability q, independent of any other flip. Suppose this coin is flipped n times. Let K denote the number of heads in n flips. (a) What is the maximum likelihood estimator of Q given K? (b) What is the PMF of K? (c) What is the conditional PDF f_{Q|K}(q|k)? (d) Find the MMSE estimator of Q given K = k. [Y & G, 9.3.4]
Q.20: Elevators arrive randomly at the ground floor of an office building. Because of a large crowd, a person will wait for time W in order to board the third arriving elevator. Let X₁ denote the time (in seconds) until the first elevator arrives and let Xᵢ denote the time between the arrivals of elevators i − 1 and i. Suppose X₁, X₂, X₃ are independent uniform (0, 30) random variables. Find upper bounds to the probability that W exceeds 75 seconds using (a) the Markov inequality and (b) the Chebyshev inequality. [Y & G, Q 7.2]
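
Here W = X₁ + X₂ + X₃ has E[W] = 45 and Var[W] = 3·(30²/12) = 225, which is all the two inequalities need. A Monte Carlo sketch showing how conservative both bounds are in this case:

    import numpy as np

    rng = np.random.default_rng(5)
    W = rng.uniform(0, 30, size=(1_000_000, 3)).sum(axis=1)

    print("P[W > 75] ~", (W > 75).mean())          # exact value is 1/48
    print("Markov bound   :", 45 / 75)             # E[W]/75 = 0.6
    print("Chebyshev bound:", 225 / (75 - 45)**2)  # 0.25
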
