
ENEE621: HW Assignment #1

Spring 2013

1. (Exercise 4 from Poor) Suppose that the probability density function (PDF) of observation $Y$ under hypothesis $H_0$ is given by
$$p_0(y) = \begin{cases} e^{-y}, & y \ge 0,\\ 0, & y < 0,\end{cases}$$
and the PDF under hypothesis $H_1$ is
$$p_1(y) = \begin{cases} \sqrt{2/\pi}\, e^{-y^2/2}, & y \ge 0,\\ 0, & y < 0.\end{cases}$$

(a) Find a Bayes rule and the minimum Bayes risk for testing $H_0$ versus $H_1$ with uniform costs and equal priors.
(b) Find a minimax rule and the minimax risk for uniform costs.
(c) Find a Neyman-Pearson rule and the corresponding detection probability for false-alarm probability $\alpha \in (0,1)$.

Ans: (a) Bayes rule: with uniform costs and equal priors, we decide $H_1$ when
$$\frac{p_1(y)}{p_0(y)} = \sqrt{\frac{2}{\pi}}\, e^{-y^2/2 + y} \ge 1.$$
Taking logarithms, this holds when $y^2 - 2y + \log(\pi/2) \le 0$. Therefore, we decide $H_1$ when
$$A = 1 - \sqrt{1 - \log(\pi/2)} \;\le\; y \;\le\; 1 + \sqrt{1 - \log(\pi/2)} = B.$$
The Bayes risk can be calculated as
$$r(\delta) = \tfrac{1}{2}\left(P_0(\Gamma_1) + P_1(\Gamma_0)\right),$$
where
$$P_0(\Gamma_1) = \int_A^B e^{-y}\,dy = e^{-A} - e^{-B}$$
and
$$P_1(\Gamma_0) = \int_0^A \sqrt{\tfrac{2}{\pi}}\, e^{-y^2/2}\,dy + \int_B^\infty \sqrt{\tfrac{2}{\pi}}\, e^{-y^2/2}\,dy = 1 - 2Q(A) + 2Q(B).$$
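As a numerical sanity check (not part of the original solution; the variable names below are mine), the thresholds $A$, $B$, both conditional error probabilities, and the Bayes risk can be evaluated with the standard library, using $Q(x) = \tfrac12\,\mathrm{erfc}(x/\sqrt{2})$:

```python
import math

# Standard normal tail probability Q(x) = P(Z > x)
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

# Decision-region endpoints of the Bayes rule (equal priors, uniform costs)
A = 1 - math.sqrt(1 - math.log(math.pi / 2))
B = 1 + math.sqrt(1 - math.log(math.pi / 2))

P0_gamma1 = math.exp(-A) - math.exp(-B)   # false-alarm probability P0(Gamma_1)
P1_gamma0 = 1 - 2 * Q(A) + 2 * Q(B)       # miss probability P1(Gamma_0)
bayes_risk = 0.5 * (P0_gamma1 + P1_gamma0)

# Cross-check the miss probability by direct numerical integration of p1
def p1(y):
    return math.sqrt(2 / math.pi) * math.exp(-y * y / 2)

n, hi = 200000, 12.0
h = hi / n
miss_numeric = sum(p1(i * h) * h for i in range(n + 1)
                   if i * h < A or i * h > B)

print(A, B, P0_gamma1, P1_gamma0, bayes_risk)
```

The closed-form miss probability and the brute-force integral should agree to within the quadrature error.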

(b) Minimax rule: the Bayes rule for a general prior $\pi_0$ decides $H_1$ when
$$\frac{p_1(y)}{p_0(y)} = \sqrt{\frac{2}{\pi}}\, e^{-y^2/2+y} \ge \frac{\pi_0}{1-\pi_0}.$$
If $1 - 2\log\left(\sqrt{\pi/2}\,\frac{\pi_0}{1-\pi_0}\right) \ge 0$, i.e., $\pi_0 \le \frac{\sqrt{2e/\pi}}{1+\sqrt{2e/\pi}}$, we decide $H_1$ when
$$A' = 1 - \sqrt{1 - 2\log\left(\sqrt{\tfrac{\pi}{2}}\,\frac{\pi_0}{1-\pi_0}\right)} \;\le\; y \;\le\; 1 + \sqrt{1 - 2\log\left(\sqrt{\tfrac{\pi}{2}}\,\frac{\pi_0}{1-\pi_0}\right)} = B'.$$
Therefore, referring to the results of part (a), the Bayes risk for prior $\pi_0$ is
$$V(\pi_0) = \pi_0\left(e^{-A'} - e^{-B'}\right) + (1-\pi_0)\left(1 - 2Q(A') + 2Q(B')\right).$$
If $\pi_0 > \frac{\sqrt{2e/\pi}}{1+\sqrt{2e/\pi}}$, we always decide $H_0$, and then $V(\pi_0) = 1 - \pi_0$.

$V(\pi_0)$ attains its maximum at the prior $\pi_0$ satisfying the equalizer condition
$$e^{-A'} - e^{-B'} = 1 - 2Q(A') + 2Q(B').$$
Let $\pi_L$ denote this maximizing point of $V(\pi_0)$. Then the minimax risk equals $e^{-A'} - e^{-B'}$ evaluated at $\pi_0 = \pi_L$.

(c) NP rule: we decide $H_1$ when
$$\frac{p_1(y)}{p_0(y)} = \sqrt{\frac{2}{\pi}}\, e^{-y^2/2+y} \ge \eta.$$
The likelihood ratio attains its maximum value $\sqrt{2e/\pi}$ at $y = 1$, so if $\eta > \sqrt{2e/\pi}$ we never decide $H_1$ and $P_F = 0$. If $\eta \le \sqrt{2e/\pi}$, i.e., $1 - 2\log\left(\sqrt{\pi/2}\,\eta\right) \ge 0$, we decide $H_1$ when
$$A^* = 1 - \sqrt{1 - 2\log\left(\sqrt{\tfrac{\pi}{2}}\,\eta\right)} \;\le\; y \;\le\; 1 + \sqrt{1 - 2\log\left(\sqrt{\tfrac{\pi}{2}}\,\eta\right)} = B^*.$$
Therefore, the false-alarm probability is $P_F = \alpha = e^{-A^*} - e^{-B^*}$. Writing $A^* = 1-s$ and $B^* = 1+s$, this reads $\alpha = e^{-1}(e^{s} - e^{-s})$; solving the resulting quadratic in $e^{s}$ gives
$$s = \log\frac{e\alpha + \sqrt{e^2\alpha^2+4}}{2},$$
and hence
$$\eta = \sqrt{\frac{2}{\pi}}\,\exp\!\left(\frac{1}{2}\left[1 - \left(\log\frac{e\alpha+\sqrt{e^2\alpha^2+4}}{2}\right)^2\right]\right).$$
And the detection probability is
$$P_D = 2Q(A^*) - 2Q(B^*),$$
where
$$A^* = 1 - \log\frac{e\alpha+\sqrt{e^2\alpha^2+4}}{2} \quad\text{and}\quad B^* = 1 + \log\frac{e\alpha+\sqrt{e^2\alpha^2+4}}{2}.$$
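As a quick check of the closed-form solution for $s$ (the level $\alpha = 0.1$ below is an arbitrary example, not from the original), the resulting region $[A^*, B^*]$ should reproduce the requested false-alarm probability exactly, and the detection probability of a Neyman-Pearson test must be at least $\alpha$:

```python
import math

alpha = 0.1  # example false-alarm level; any value in (0, 1) works

# s solves alpha = e^{-(1-s)} - e^{-(1+s)} = e^{-1} (e^s - e^{-s})
s = math.log((math.e * alpha + math.sqrt(math.e**2 * alpha**2 + 4)) / 2)
A, B = 1 - s, 1 + s

# False-alarm probability of the region [A, B] under p0(y) = e^{-y}
PF = math.exp(-A) - math.exp(-B)

# Detection probability of [A, B] under the half-normal p1
Q = lambda x: 0.5 * math.erfc(x / math.sqrt(2))
PD = 2 * Q(A) - 2 * Q(B)

print(PF, PD)
```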

2. (Exercise 9 from Poor) Suppose that we have a real observation $Y$ and binary hypotheses described by the following pair of PDFs:
$$p_0(y) = \begin{cases} 1 - |y|, & |y| \le 1,\\ 0, & |y| > 1,\end{cases} \qquad p_1(y) = \begin{cases} \dfrac{2-|y|}{4}, & |y| \le 2,\\ 0, & |y| > 2.\end{cases}$$

(a) Assume that the costs are given by $C_{01} = 2C_{10} > 0$ and $C_{00} = C_{11} = 0$. Find a minimax test of $H_0$ versus $H_1$ and the corresponding minimax risk.

(b) Find a Neyman-Pearson test of $H_0$ versus $H_1$ with false-alarm probability $\alpha$ and the corresponding detection probability.

Ans: (a) Bayes rule for a prior $\pi_0$: first, when $|y| > 1$, we decide $H_1$, since $p_0(y) = 0$ while $p_1(y) > 0$ there. When $|y| \le 1$, we decide $H_1$ if
$$\frac{p_1(y)}{p_0(y)} = \frac{2-|y|}{4(1-|y|)} \ge \frac{\pi_0 C_{10}}{(1-\pi_0)C_{01}} = \frac{\pi_0}{2(1-\pi_0)},$$
equivalently if $(3\pi_0 - 1)|y| \ge 2(2\pi_0 - 1)$. This separates into the following three cases.

First, $\pi_0 \in (\tfrac12, 1]$: decide $H_1$ when $|y| \ge \dfrac{4\pi_0-2}{3\pi_0-1} = a_1$. Then
$$P_0(\Gamma_1) = 2\int_{a_1}^1 (1-y)\,dy = \left(\frac{1-\pi_0}{3\pi_0-1}\right)^2,$$
$$P_1(\Gamma_0) = 2\int_0^{a_1} \frac{2-y}{4}\,dy = \frac{(4\pi_0-1)(2\pi_0-1)}{(3\pi_0-1)^2},$$
$$V(\pi_0) = \pi_0 C_{10} P_0(\Gamma_1) + (1-\pi_0)C_{01}P_1(\Gamma_0) = C_{10}\left[\pi_0\left(\frac{1-\pi_0}{3\pi_0-1}\right)^2 + 2(1-\pi_0)\,\frac{(4\pi_0-1)(2\pi_0-1)}{(3\pi_0-1)^2}\right].$$

Secondly, $\pi_0 \in [0, \tfrac13)$: since $3\pi_0 - 1 < 0$, the inequality flips and the rule decides $H_1$ when $|y| \le \dfrac{4\pi_0-2}{3\pi_0-1}$. Here $\dfrac{4\pi_0-2}{3\pi_0-1} \ge 1$ for this range of $\pi_0$, which means that for all $|y| \le 1$ we decide $H_1$. Hence
$$P_0(\Gamma_1) = 1,\qquad P_1(\Gamma_0) = 0,\qquad V(\pi_0) = \pi_0 C_{10} P_0(\Gamma_1) + (1-\pi_0)C_{01}P_1(\Gamma_0) = \pi_0 C_{10}.$$

Thirdly, $\pi_0 \in [\tfrac13, \tfrac12)$: the left side of $(3\pi_0-1)|y| \ge 2(2\pi_0-1)$ is nonnegative while the right side is negative, so we always decide $H_1$, and again
$$P_0(\Gamma_1) = 1,\qquad P_1(\Gamma_0) = 0,\qquad V(\pi_0) = \pi_0 C_{10}.$$

Finally, plotting $V(\pi_0)$ shows that its maximum is achieved at $\pi_0 = \dfrac{5+\sqrt{10}}{15}$ (which satisfies $P_0(\Gamma_1) = 2P_1(\Gamma_0)$ in the first case). Given this $\pi_0$, the minimax risk is $\left(\dfrac{\sqrt{10}-1}{3}\right)^2 C_{10} = \dfrac{11-2\sqrt{10}}{9}\,C_{10}$.
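The least-favorable prior and the minimax risk above can be verified numerically (an illustrative check, with $C_{10}$ normalized to 1 for convenience; the equalizer prior is the root of $15\pi_0^2 - 10\pi_0 + 1 = 0$ lying in $(\tfrac12, 1]$):

```python
import math

# Least-favorable prior: root of 15*pi0^2 - 10*pi0 + 1 = 0 in (1/2, 1]
pi0 = (5 + math.sqrt(10)) / 15

# Threshold of the corresponding Bayes rule: decide H1 when |y| >= a1
a1 = (4 * pi0 - 2) / (3 * pi0 - 1)

# Closed-form conditional error probabilities from the solution
P0_gamma1 = ((1 - pi0) / (3 * pi0 - 1)) ** 2                      # false alarm
P1_gamma0 = (4 * pi0 - 1) * (2 * pi0 - 1) / (3 * pi0 - 1) ** 2    # miss

# The same quantities evaluated directly as integrals of p0 and p1
P0_direct = (1 - a1) ** 2         # 2 * int_{a1}^{1} (1 - y) dy
P1_direct = a1 * (4 - a1) / 4     # 2 * int_{0}^{a1} (2 - y)/4 dy

# Minimax risk in units of C10
risk = ((math.sqrt(10) - 1) / 3) ** 2

print(pi0, P0_gamma1, P1_gamma0, risk)
```

The equalizer condition for $C_{01} = 2C_{10}$ is $C_{10}P_0(\Gamma_1) = C_{01}P_1(\Gamma_0)$, i.e., $P_0(\Gamma_1) = 2P_1(\Gamma_0)$, and the minimax risk equals the common value $C_{10}P_0(\Gamma_1)$.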

(b) Let us define the threshold $\tau$ so that we decide $H_0$ when $|y| < \tau$ and $H_1$ when $|y| \ge \tau$ (the likelihood ratio is increasing in $|y|$). Then the false-alarm probability is $P_F = (1-\tau)^2 = \alpha$, which gives us $\tau = 1 - \sqrt{\alpha}$. And the corresponding detection probability is
$$P_D = \frac{(2-\tau)^2}{4} = \frac{(1+\sqrt{\alpha})^2}{4}.$$

3. (Exercise 12 from Poor) Consider a simple binary hypothesis testing problem. For a decision rule $\delta$, denote the false-alarm and miss probabilities by $P_F(\delta)$ and $P_M(\delta)$, respectively. Consider the performance measure
$$\rho(\delta) = (P_F(\delta))^2 + (P_M(\delta))^2,$$
and let $\delta_o$ denote a decision rule minimizing $\rho(\delta)$ over all randomized decision rules $\delta$.

(a) Show that $\delta_o$ must be a likelihood-ratio test.

(b) For $\pi_0 \in [0,1]$, define the function $V$ by
$$V(\pi_0) = \min_{\delta}\left[\pi_0 P_F(\delta) + (1-\pi_0)P_M(\delta)\right].$$

Suppose that $V(\pi_0)$ achieves its maximum on $[0,1]$ at the point $\pi_0 = 0.5$. Show that $\delta_o$ is a Bayes rule for prior $\pi_0 = 0.5$.

Ans: (a) The performance measure can be rewritten as $\rho(\delta) = (P_F(\delta))^2 + (1 - P_D(\delta))^2$. Suppose the minimizing rule $\delta_o$ were not a likelihood-ratio test, and let $P_F(\delta_o) = \alpha$. Then, by the Neyman-Pearson lemma, we can always find a likelihood-ratio test $\delta'$ satisfying $P_D(\delta') \ge P_D(\delta_o)$ with $P_F(\delta') = \alpha$. It follows that $\rho(\delta') \le \rho(\delta_o)$, so the minimum of $\rho$ is always attained by a likelihood-ratio test; hence $\delta_o$ may be taken to be one.

(b) (with uniform costs and differentiable $V(\pi_0)$) Let $\delta_o$ be a Bayes rule for prior $\pi_0 = 0.5$; we show that $\delta_o = \arg\min_\delta \rho(\delta)$. Since $\delta_o$ is a Bayes rule for $\pi_0 = 0.5$,
$$\delta_o = \arg\min_\delta \left[\tfrac12 P_F(\delta) + \tfrac12 P_M(\delta)\right] = \arg\min_\delta \left(P_F(\delta) + P_M(\delta)\right)^2.$$

From minimax test theory, since $V(\pi_0)$ attains its maximum at $\pi_0 = 0.5$, the Bayes rule for that prior is an equalizer rule: $P_F(\delta_o) = P_M(\delta_o)$. Because $(P_F(\delta) - P_M(\delta))^2 \ge 0$ for every $\delta$, with equality at $\delta_o$, we get
$$\delta_o = \arg\min_\delta \left[(P_F(\delta)+P_M(\delta))^2 + (P_F(\delta)-P_M(\delta))^2\right] = \arg\min_\delta 2\left[(P_F(\delta))^2 + (P_M(\delta))^2\right] = \arg\min_\delta \rho(\delta).$$

4. (Exercise 2.1 from Levy) In the binary communication system shown in Figure 1, the message values $X = 0$ and $X = 1$ occur with a priori probabilities 1/4 and 3/4, respectively. The random variable $V$ takes the values $-1$, $0$, and $1$ with probabilities 1/8, 3/4, and 1/8, respectively. The received message is
$$Y = X + V.$$
(a) Given the received signal $Y$, the receiver must decide whether the transmitted message was 0 or 1. The estimated message $\hat{X}$ takes the values 0 or 1. Find the receiver that achieves the maximum probability of a correct decision.

Figure 1: Binary communication system model.

(b) Find $P[\hat{X} \neq X]$ for the receiver of part (a).

Ans: On the next pages.
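The written solution is deferred to the later pages. Purely as an illustrative sketch (the enumeration below follows directly from the stated pmfs, not from the author's worked answer), the MAP receiver and its error probability can be computed exhaustively with exact rational arithmetic:

```python
from fractions import Fraction as F

# A priori message probabilities and noise pmf from the problem statement
pX = {0: F(1, 4), 1: F(3, 4)}
pV = {-1: F(1, 8), 0: F(3, 4), 1: F(1, 8)}

# Joint pmf of (X, Y) with Y = X + V
joint = {}
for x, px in pX.items():
    for v, pv in pV.items():
        joint[(x, x + v)] = joint.get((x, x + v), F(0)) + px * pv

# MAP receiver: for each observable y, pick the x maximizing P(X=x, Y=y)
ys = sorted({y for (_, y) in joint})
xhat = {y: max(pX, key=lambda x: joint.get((x, y), F(0))) for y in ys}

# Probability of error: total joint mass where the decision disagrees with x
perr = sum(p for (x, y), p in joint.items() if xhat[y] != x)
print(xhat, perr)
```

Under the stated model, the receiver decides $\hat{X} = 0$ for $y \in \{-1, 0\}$ and $\hat{X} = 1$ for $y \in \{1, 2\}$.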

5. (Exercise 2.2 from Levy) The system shown in Figure 2 is an idealized model for binary communication through a fading channel. The message $X$ that we want to transmit takes one of two values, 0 or 1, with a priori probabilities 1/3 and 2/3, respectively. The channel fading $A$ is an $N(1, 3)$ Gaussian random variable. The channel noise $V$ is an $N(0, 1)$ Gaussian random variable.

Figure 2: Idealized model of a fading communication channel.

(a) Find the minimum probability of error decision rule. Simplify your answer as much as possible.
(b) Sketch the decision regions on the y-axis.

Ans: On the next pages.
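The worked answer again appears on later pages. As a hedged sketch only, assuming Figure 2 depicts $Y = AX + V$ (so that $Y \sim N(0,1)$ under $X = 0$ and $Y \sim N(1,4)$ under $X = 1$, reading $N(1,3)$ as mean 1 and variance 3), the minimum-error rule compares prior-weighted densities, and the comparison simplifies to the quadratic condition $(3y-1)(y+1) \ge 0$:

```python
import math

# Assumed model: Y = A*X + V with A ~ N(1, 3), V ~ N(0, 1), so
# Y ~ N(0, 1) under X = 0 and Y ~ N(1, 4) under X = 1.
def normal_pdf(y, mean, var):
    return math.exp(-(y - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def map_decide(y, p0=1/3, p1=2/3):
    """MAP (minimum probability of error) decision under the assumed model."""
    return 1 if p1 * normal_pdf(y, 1, 4) >= p0 * normal_pdf(y, 0, 1) else 0

# The rule simplifies to (3y - 1)(y + 1) >= 0: decide 1 iff y <= -1 or y >= 1/3
for y in [-3.0, -1.5, -0.5, 0.0, 0.2, 0.5, 2.0]:
    simplified = 1 if (3 * y - 1) * (y + 1) >= 0 else 0
    print(y, map_decide(y), simplified)
```

Under these assumptions the decision regions on the $y$-axis would be $\hat{X} = 1$ for $y \le -1$ or $y \ge 1/3$, and $\hat{X} = 0$ in between.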
