

Random Signals
Assumptions:

• Target signal s[n]: random, zero-mean Gaussian random process with known covariance.

• Noise w[n]: WGN with known variance \sigma^2 and independent of s[n].

Binary detection problem:

H_0: x[n] = w[n]
H_1: x[n] = s[n] + w[n]

The NP detector decides H_1 if

L(x) = \frac{p(x; H_1)}{p(x; H_0)} > \gamma


Random Signals - Example


H_0: x \sim \mathcal{N}(0, \sigma^2 I)
H_1: x \sim \mathcal{N}(0, (\sigma_s^2 + \sigma^2) I)

Thus, we have

L(x) = \frac{\dfrac{1}{[2\pi(\sigma_s^2 + \sigma^2)]^{N/2}} \exp\left[-\dfrac{1}{2(\sigma_s^2 + \sigma^2)} \sum_{n=0}^{N-1} x^2[n]\right]}{\dfrac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\dfrac{1}{2\sigma^2} \sum_{n=0}^{N-1} x^2[n]\right]}.

Calculating the log-likelihood ratio (LLR), we have

l(x) = -\frac{N}{2} \ln\left(\frac{\sigma_s^2 + \sigma^2}{\sigma^2}\right) + \frac{1}{2}\left(\frac{1}{\sigma^2} - \frac{1}{\sigma_s^2 + \sigma^2}\right) \sum_{n=0}^{N-1} x^2[n]
     = -\frac{N}{2} \ln\left(\frac{\sigma_s^2 + \sigma^2}{\sigma^2}\right) + \frac{\sigma_s^2}{2\sigma^2(\sigma_s^2 + \sigma^2)} \sum_{n=0}^{N-1} x^2[n]
Notice: Scalar Wiener filter!
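
Since the coefficient \sigma_s^2 / (2\sigma^2(\sigma_s^2+\sigma^2)) is positive, deciding H_1 reduces to comparing the energy \sum_n x^2[n] to a threshold. A minimal numerical sketch of this energy detector follows (an added illustration, not from the slides; sigma2_s, sigma2, N and the threshold are made-up values):

import numpy as np

# Illustrative values only (assumed, not from the slides)
sigma2_s = 2.0     # signal variance sigma_s^2
sigma2 = 1.0       # noise variance sigma^2
N = 100

rng = np.random.default_rng(0)
x = rng.normal(0.0, np.sqrt(sigma2_s + sigma2), N)   # one realization under H1

# Data-dependent part of l(x): the energy of the received data
T = np.sum(x ** 2)

# Compare against an arbitrary illustrative threshold
gamma = N * (sigma2 + 0.5 * sigma2_s)
print("T =", T, " decide H1:", T > gamma)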

• The signal is assumed to be a zero-mean, white, WSS Gaussian random process with known variance \sigma_s^2

• The noise is assumed to be WGN with known variance \sigma^2 and to be independent of the signal

• The NP detector computes the energy in the received data and compares it to a threshold; hence, it is known as an energy detector

• If the signal is present, the energy of the received data increases

• The test statistic (up to a 1/N scaling) can be thought of as an estimator of the variance

• The statistic is the sum of the squares of N IID Gaussian random variables

• Comparing the statistic to a threshold recognizes that the variance under H_0 is \sigma^2 but under H_1 it increases to \sigma_s^2 + \sigma^2
Chi-Squared (Central)
• The chi-squared PDF arises as the PDF of x = \sum_{i=1}^{\nu} x_i^2 when the x_i \sim \mathcal{N}(0,1)

• PDF for a chi-squared random variable with \nu degrees of freedom:

p(x) = \frac{1}{2^{\nu/2} \Gamma(\nu/2)} x^{\nu/2 - 1} e^{-x/2}, \quad x > 0

• Since the x_i are independent and identically distributed with x_i \sim \mathcal{N}(0,1), for \nu degrees of freedom the mean and variance are:

E(x) = \nu, \quad \mathrm{var}(x) = 2\nu

• The right-tail probability for a chi-squared random variable is defined as Q_{\chi^2_\nu}(x) = \Pr\{\chi^2_\nu > x\}

• It can be shown that Q_{\chi^2_\nu}(x) has a closed form for even \nu and can be expressed via the Gaussian Q function for odd \nu

• For our example we set the degrees of freedom \nu = N, and the IID data are x_i = x[n]
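
As a sketch of how these right-tail probabilities set the energy detector's operating point (an added illustration using SciPy's chi2 routines; N, the variances and P_FA below are assumed values):

from scipy.stats import chi2

# Illustrative values (assumed)
N = 25           # number of samples = degrees of freedom
sigma2 = 1.0     # noise variance
sigma2_s = 0.5   # signal variance
Pfa = 0.01

# Under H0, T/sigma^2 ~ chi-squared with N degrees of freedom,
# so the threshold on the energy T that gives the desired PFA is
gamma = sigma2 * chi2.isf(Pfa, df=N)

# Under H1, T/(sigma_s^2 + sigma^2) ~ chi-squared with N degrees of freedom
Pd = chi2.sf(gamma / (sigma2_s + sigma2), df=N)
print("threshold =", gamma, " Pd =", Pd)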
Generalization to Random Signals with Arbitrary Covariance Matrix

• We assume the signal s[n] is a zero-mean Gaussian random process with covariance matrix C_s

• Noise w[n] is WGN with variance \sigma^2

• Two-hypothesis signal detection problem

• We have N-dimensional observations, where the observations are either WGN only or a correlated signal embedded in WGN

Random Signals - Generalization

T(x) = x^T \left[\frac{1}{\sigma^2} I - (C_s + \sigma^2 I)^{-1}\right] x > 2\gamma'

Using the matrix inversion lemma

(A + BCD)^{-1} = A^{-1} - A^{-1} B (D A^{-1} B + C^{-1})^{-1} D A^{-1}

we can write

(C_s + \sigma^2 I)^{-1}

as

\frac{1}{\sigma^2} I - \frac{1}{\sigma^4} \left(C_s^{-1} + \frac{1}{\sigma^2} I\right)^{-1}

such that

T(x) = x^T \left[\frac{1}{\sigma^4} \left(C_s^{-1} + \frac{1}{\sigma^2} I\right)^{-1}\right] x > 2\gamma'


Random Signals - Generalization

Scaling both sides by \sigma^2 (absorbing it into the threshold) gives

T(x) = x^T \left[\frac{1}{\sigma^2} \left(C_s^{-1} + \frac{1}{\sigma^2} I\right)^{-1}\right] x > 2\sigma^2\gamma'

Now set \hat{s} equal to

\hat{s} = \frac{1}{\sigma^2} \left(C_s^{-1} + \frac{1}{\sigma^2} I\right)^{-1} x

Recognize this as the LMMSE filter!! (\hat{\theta} = C_{\theta x} C_{xx}^{-1} x)

which can be rewritten as

\hat{s} = \left(\sigma^2 C_s^{-1} + I\right)^{-1} x = C_s (C_s + \sigma^2 I)^{-1} x

Hence, we decide for H_1 if

T(x) = x^T \hat{s} > \gamma''
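
A minimal numerical sketch of this estimator-correlator (an added illustration; the exponential covariance C_s, \sigma^2 and N below are made-up values):

import numpy as np

rng = np.random.default_rng(1)
N = 50
sigma2 = 1.0

# Assumed signal covariance for illustration: exponentially correlated process
n = np.arange(N)
Cs = 0.9 ** np.abs(n[:, None] - n[None, :])

# One realization under H1: x = s + w
s = rng.multivariate_normal(np.zeros(N), Cs)
x = s + rng.normal(0.0, np.sqrt(sigma2), N)

# LMMSE (Wiener) estimate of the signal, then correlate it with the data
s_hat = Cs @ np.linalg.solve(Cs + sigma2 * np.eye(N), x)
T = x @ s_hat
print("T(x) =", T)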

Conclusions

• The NP detector correlates the received data with an estimate of the signal and is therefore termed an estimator-correlator

• The test statistic is a quadratic form in the data and thus will not be a Gaussian random variable

• The estimated signal is a Wiener filter estimate of the random signal, i.e., the Minimum Mean Square Error (MMSE) estimate of the signal realization

Random Signals – Estimator Correlator

Hence, we decide for H_1 if

T(x) = x^T \hat{s} > \gamma''

with

\hat{s} = \frac{1}{\sigma^2} \left(C_s^{-1} + \frac{1}{\sigma^2} I\right)^{-1} x = C_s (C_s + \sigma^2 I)^{-1} x

Fig. 5.2 Kay-II.

Summary (1)

Deterministic signals:

T(x) = x^T C^{-1} s

Notice that if C is positive definite, C^{-1} can be written as C^{-1} = D^T D, leading to

T(x) = x^T D^T D s = (Dx)^T (Ds)

Fig. 4.7 Kay-II.
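
A small sketch of this prewhitening view (an added illustration; the covariance C, signal s and sizes are assumptions): with D taken from a Cholesky factor of C^{-1}, the correlator x^T C^{-1} s equals (Dx)^T (Ds).

import numpy as np

rng = np.random.default_rng(2)
N = 20

# Illustrative correlated-noise covariance C and known deterministic signal s
n = np.arange(N)
C = 0.8 ** np.abs(n[:, None] - n[None, :])
s = np.cos(0.2 * np.pi * n)

x = s + rng.multivariate_normal(np.zeros(N), C)   # data under H1

Cinv = np.linalg.inv(C)
D = np.linalg.cholesky(Cinv).T                    # C^{-1} = D^T D
T_direct = x @ Cinv @ s                           # x^T C^{-1} s
T_white = (D @ x) @ (D @ s)                       # prewhitened correlator (Dx)^T (Ds)
print(T_direct, T_white)                          # the two forms agree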

What if the PDFs are not completely known?

• A radar signal that returns from a target will be delayed by the propagation time of the signal through the medium

• Its arrival time is generally unknown

• A communication receiver may not have perfect knowledge of the frequency of a transmitted signal

• The noise characteristics may not be known a priori

• The noise may be modeled as white Gaussian noise but with unknown variance
Detection: chapter 6

Statistical Detection Theory II


Natasha Devroye
devroye@ece.uic.edu
http://www.ece.uic.edu/~devroye

Spring 2010

So far, detection under:

• Neyman-Pearson criteria (max PD s.t. PFA = constant): likelihood ratio test, threshold set by PFA

• minimize Bayesian risk (assign costs to decisions, have priors of the different
hypotheses): likelihood ratio test, threshold set by priors+costs

• minimum probability of error = maximum a posteriori detection

• maximum likelihood detection = minimum probability of error with equal priors

• known deterministic signals in Gaussian noise: correlators

• random signals: estimator-correlators, energy detectors

All assume knowledge of the PDFs p(x; H0) and p(x; H1).


Motivation

• What if we don't know the distribution of x under the two hypotheses?

• What if under hypothesis 0 the distribution lies in some set, and under hypothesis 1 it lies in another set - can we distinguish between the two?

Composite hypothesis testing

Composite hypothesis testing summary



Unknown parameters?

Detection under NP:

Deterministic signals:

T(x) = x^T C^{-1} s > \gamma''

Random signals:

T(x) = x^T \hat{s} > \gamma''

with

\hat{s} = \frac{1}{\sigma^2} \left(C_s^{-1} + \frac{1}{\sigma^2} I\right)^{-1} x = C_s (C_s + \sigma^2 I)^{-1} x

NP requires perfect knowledge of p(x; H_0) and p(x; H_1) (and thus of s and/or C_s and \sigma^2). What if this information is unknown?


Composite Hypothesis Testing

What if the value of A is unknown?

H_0: x[n] = w[n],      n = 0, 1, \ldots, N-1
H_1: x[n] = A + w[n],  n = 0, 1, \ldots, N-1

where A is unknown, but we know A > 0. Further, we know that w[n] is WGN with variance \sigma^2. The NP detector decides H_1 if

L(x) = \frac{p(x; A, H_1)}{p(x; A, H_0)} > \gamma

The PDFs are parameterized by A! This is called composite hypothesis testing.


• P_D increases with increasing A

• We may say that, over all possible detectors that have a given P_{FA}, the one that yields the highest P_D is specified by the uniformly most powerful (UMP) test

• Any other test has poorer performance

Dependence of probability of detection on unknown parameter A



Example – DC level in WGN unknown A

Let us first calculate the clairvoyant detector for the case -\infty < A < \infty:

A > 0: \quad \frac{1}{N} \sum_{n=0}^{N-1} x[n] > \gamma'_+

A < 0: \quad \frac{1}{N} \sum_{n=0}^{N-1} x[n] < \gamma'_-

For A > 0:

P_{FA} = \Pr\{\bar{x} > \gamma'_+; H_0\} = Q\left(\frac{\gamma'_+}{\sqrt{\sigma^2/N}}\right)

For A < 0:

P_{FA} = \Pr\{\bar{x} < \gamma'_-; H_0\} = 1 - Q\left(\frac{\gamma'_-}{\sqrt{\sigma^2/N}}\right) = Q\left(\frac{-\gamma'_-}{\sqrt{\sigma^2/N}}\right)


Example – DC level in WGN unknown A

The detection performance of the clairvoyant detector:

for A > 0:

P_D = \Pr\{\bar{x} > \gamma'_+; H_1\} = Q\left(\frac{\gamma'_+ - A}{\sqrt{\sigma^2/N}}\right) = Q\left(Q^{-1}(P_{FA}) - \sqrt{\frac{N A^2}{\sigma^2}}\right)

for A < 0:

P_D = 1 - Q\left(\frac{\gamma'_- - A}{\sqrt{\sigma^2/N}}\right) = Q\left(\frac{-\gamma'_- + A}{\sqrt{\sigma^2/N}}\right) = Q\left(Q^{-1}(P_{FA}) + \frac{A}{\sqrt{\sigma^2/N}}\right)

Fig. 6.3 Kay-II.

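
A brief numerical check of the clairvoyant performance curve (an added illustration with assumed N, \sigma^2 and P_FA, using SciPy's norm.isf and norm.sf for Q^{-1} and Q):

import numpy as np
from scipy.stats import norm

# Illustrative values (assumed)
N = 10
sigma2 = 1.0
Pfa = 0.1

A = np.linspace(0.01, 2.0, 5)                          # positive amplitudes
# Pd = Q(Q^{-1}(Pfa) - sqrt(N A^2 / sigma^2)) for A > 0
Pd = norm.sf(norm.isf(Pfa) - np.sqrt(N * A ** 2 / sigma2))
print(Pd)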


Approaches for composite Hyp. testing

Two approaches:

1. Bayesian approach: Consider unknown parameters as realizations of random variables and assign a prior pdf.

2. Generalized likelihood ratio: Estimate unknown parameters using MLEs.

1. Bayesian approach:

Assign priors p(\theta_0) and p(\theta_1) to the unknown parameters \theta_0 and \theta_1 under hypotheses H_0 and H_1, respectively:

p(x; H_0) = \int p(x|\theta_0; H_0)\, p(\theta_0)\, d\theta_0

p(x; H_1) = \int p(x|\theta_1; H_1)\, p(\theta_1)\, d\theta_1

• Need to choose a prior pdf.

• Integration can be difficult.

NP detector:

\frac{p(x; H_1)}{p(x; H_0)} = \frac{\int p(x|\theta_1; H_1)\, p(\theta_1)\, d\theta_1}{\int p(x|\theta_0; H_0)\, p(\theta_0)\, d\theta_0} > \gamma
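
A sketch of the Bayesian route for the DC-level example (an added illustration; the Gaussian prior on A and all numerical values are assumptions), integrating the unknown amplitude out numerically:

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

rng = np.random.default_rng(3)
N, sigma2, A_true = 10, 1.0, 0.8                   # illustrative values (assumed)
x = A_true + rng.normal(0.0, np.sqrt(sigma2), N)

def likelihood(A):
    # p(x | A; H1) for a DC level A in WGN
    return np.prod(norm.pdf(x, loc=A, scale=np.sqrt(sigma2)))

def prior(A):
    # assumed prior on A
    return norm.pdf(A, loc=0.0, scale=2.0)

# Marginal likelihood under H1: integrate the unknown parameter out
# (prior mass is essentially contained in [-10, 10])
p1, _ = quad(lambda A: likelihood(A) * prior(A), -10.0, 10.0)
p0 = likelihood(0.0)                               # under H0, A = 0
print("Bayesian likelihood ratio:", p1 / p0)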

Generalized Likelihood Ratio Test

2. GLRT:

• Replace unknown parameters by their MLEs.

• GLRT:

L_G(x) = \frac{p(x; \hat{\theta}_1, H_1)}{p(x; \hat{\theta}_0, H_0)} > \gamma

with \hat{\theta}_i given by

\hat{\theta}_i = \arg\max_{\theta_i} p(x; \theta_i, H_i)



Example: DC in WGN with Unknown Amplitude - GLRT

H_0: x[n] = w[n],      n = 0, 1, \ldots, N-1
H_1: x[n] = A + w[n],  n = 0, 1, \ldots, N-1

where -\infty < A < \infty and w[n] is WGN with variance \sigma^2. The GLRT decides H_1 if

L_G(x) = \frac{p(x; \hat{\theta}_1, H_1)}{p(x; \hat{\theta}_0, H_0)} > \gamma

with \hat{\theta}_i given by

\hat{\theta}_i = \arg\max_{\theta_i} p(x; \theta_i, H_i)


Example: DC in WGN with Unknown Amplitude - GLRT

MLE of A:

p(x; \hat{A}, H_1) = \max_A p(x; A, H_1) = \max_A \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} (x[n] - A)^2\right].

This leads to \hat{A} = \frac{1}{N} \sum_{n=0}^{N-1} x[n] = \bar{x}.

Thus, the GLRT is

L_G(x) = \frac{\dfrac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\dfrac{1}{2\sigma^2} \sum_{n=0}^{N-1} (x[n] - \bar{x})^2\right]}{\dfrac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\dfrac{1}{2\sigma^2} \sum_{n=0}^{N-1} x^2[n]\right]} > \gamma

Taking the logarithm of both sides, we have

\ln L_G(x) = -\frac{1}{2\sigma^2}\left(-2N\bar{x}^2 + N\bar{x}^2\right) = \frac{N\bar{x}^2}{2\sigma^2}
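
A quick sketch of this GLRT on simulated data (an added illustration; N, \sigma^2, the true A and the threshold are assumed values):

import numpy as np

rng = np.random.default_rng(4)
N, sigma2, A_true = 20, 1.0, 0.5        # assumed illustration values
x = A_true + rng.normal(0.0, np.sqrt(sigma2), N)

A_hat = x.mean()                        # MLE of A under H1
lnLG = N * A_hat ** 2 / (2 * sigma2)    # ln L_G(x) = N xbar^2 / (2 sigma^2)

gamma2 = 0.4                            # arbitrary threshold gamma'' on |xbar|
print(A_hat, lnLG, abs(A_hat) > gamma2)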


Example: DC in WGN with Unknown Amplitude - GLRT

\ln L_G(x) = -\frac{1}{2\sigma^2}\left(-2N\bar{x}^2 + N\bar{x}^2\right) = \frac{N\bar{x}^2}{2\sigma^2}

We thus decide H_1 if

|\bar{x}| > \gamma''.

P_{FA} = \Pr\{|\bar{x}| > \gamma''; H_0\} = 2\Pr\{\bar{x} > \gamma''; H_0\} = 2Q\left(\frac{\gamma''}{\sqrt{\sigma^2/N}}\right)

\gamma'' = \sqrt{\frac{\sigma^2}{N}}\, Q^{-1}\left(\frac{P_{FA}}{2}\right)

P_D = \Pr\{|\bar{x}| > \gamma''; H_1\} = Q\left(Q^{-1}(P_{FA}/2) - \sqrt{\frac{N A^2}{\sigma^2}}\right) + Q\left(Q^{-1}(P_{FA}/2) + \sqrt{\frac{N A^2}{\sigma^2}}\right)


Example – DC level in WGN unknown A

P_D = \Pr\{|\bar{x}| > \gamma''; H_1\} = Q\left(Q^{-1}(P_{FA}/2) - \sqrt{\frac{N A^2}{\sigma^2}}\right) + Q\left(Q^{-1}(P_{FA}/2) + \sqrt{\frac{N A^2}{\sigma^2}}\right)

The performance of this realisable detector is thus not optimal, but it is close to that of the optimal clairvoyant detector.

Fig. 6.4 Kay-II.
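
A short numerical comparison of the GLRT's P_D against the clairvoyant detector's (an added illustration mirroring the Fig. 6.4 comparison; N, \sigma^2 and P_FA are assumed values):

import numpy as np
from scipy.stats import norm

N, sigma2, Pfa = 10, 1.0, 0.1                       # illustrative values (assumed)
A = np.linspace(0.0, 2.0, 5)
d = np.sqrt(N * A ** 2 / sigma2)                    # sqrt(N A^2 / sigma^2)

Pd_clairvoyant = norm.sf(norm.isf(Pfa) - d)         # clairvoyant detector (A > 0)
Pd_glrt = norm.sf(norm.isf(Pfa / 2) - d) + norm.sf(norm.isf(Pfa / 2) + d)
for a, pc, pg in zip(A, Pd_clairvoyant, Pd_glrt):
    print(f"A={a:.2f}  clairvoyant={pc:.3f}  GLRT={pg:.3f}")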

