
Power Spectrum Estimation

If $x_a(t)$ is a finite-energy signal, then
$$E = \int_{-\infty}^{\infty} \left| x_a(t) \right|^2 dt < \infty$$

The Fourier transform is
$$X_a(F) = \int_{-\infty}^{\infty} x_a(t)\, e^{-j2\pi F t}\, dt$$

From Parseval's theorem,
$$E = \int_{-\infty}^{\infty} \left| x_a(t) \right|^2 dt = \int_{-\infty}^{\infty} \left| X_a(F) \right|^2 dF$$
where
$$S_{xx}(F) = \left| X_a(F) \right|^2$$
is the energy density spectrum. It can also be obtained as the Fourier transform of the autocorrelation function (ACF):
$$S_{xx}(F) = \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j2\pi F \tau}\, d\tau$$
where
$$R_{xx}(\tau) = \int_{-\infty}^{\infty} x_a^*(t)\, x_a(t+\tau)\, dt$$

Similarly, for a DT signal $x(n)$, $-\infty < n < \infty$,
$$X(f) = \sum_{n=-\infty}^{\infty} x(n)\, e^{-j2\pi f n}$$
$$S_{xx}(f) = \sum_{k=-\infty}^{\infty} r_{xx}(k)\, e^{-j2\pi k f}$$
with
$$r_{xx}(k) = \sum_{n=-\infty}^{\infty} x^*(n)\, x(n+k)$$

Two methods are used to calculate the energy density spectrum:


Direct method:
$$S_{xx}(f) = \left| X(f) \right|^2 = \left| \sum_{n=-\infty}^{\infty} x(n)\, e^{-j2\pi f n} \right|^2$$
Indirect method: first calculate the ACF $r_{xx}(k)$ and then find the spectrum from
$$S_{xx}(f) = \sum_{k=-\infty}^{\infty} r_{xx}(k)\, e^{-j2\pi k f}$$
A numerical comparison of the two methods is sketched below.
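A minimal sketch of the two methods, assuming NumPy and a short hypothetical test sequence; the direct and indirect estimates coincide for a finite-length sequence:

```python
import numpy as np

# Hypothetical finite-length test sequence.
x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
N = len(x)
n = np.arange(N)
f = np.linspace(0.0, 0.5, 256)                  # normalized frequency grid

# Direct method: S_xx(f) = |X(f)|^2
X = np.array([np.sum(x * np.exp(-2j * np.pi * fi * n)) for fi in f])
S_direct = np.abs(X) ** 2

# Indirect method: compute the ACF r_xx(k), then its Fourier transform.
r = np.correlate(x, x, mode="full")             # lags k = -(N-1) ... N-1
k = np.arange(-(N - 1), N)
S_indirect = np.array([np.sum(r * np.exp(-2j * np.pi * fi * k)) for fi in f]).real

print(np.allclose(S_direct, S_indirect))        # True: the two methods agree
```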

Power Spectral Density (PSD): Periodogram


If x(t) is a stationary random process, its ACF can be found by
statistical average as,
$$\gamma_{xx}(\tau) = E\!\left[ x^*(t)\, x(t+\tau) \right]$$
and the power density spectrum is its Fourier transform,
$$\Gamma_{xx}(F) = \int_{-\infty}^{\infty} \gamma_{xx}(\tau)\, e^{-j2\pi F \tau}\, d\tau$$
In practice, the statistical average is replaced by the time-average ACF over an observation interval of length $2T_0$,
$$\gamma_{xx}(\tau) = \lim_{T_0 \to \infty} R_{xx}(\tau) = \lim_{T_0 \to \infty} \frac{1}{2T_0} \int_{-T_0}^{T_0} x^*(t)\, x(t+\tau)\, dt$$
so the power density spectrum estimated from a finite record is
$$P_{xx}(F) = \int R_{xx}(\tau)\, e^{-j2\pi F \tau}\, d\tau = \frac{1}{2T_0} \left| \int_{-T_0}^{T_0} x(t)\, e^{-j2\pi F t}\, dt \right|^2$$
For a DT signal of N samples,
$$P_{xx}(f) = \frac{1}{N} \left| \sum_{n=0}^{N-1} x(n)\, e^{-j2\pi f n} \right|^2$$

This equation is known as the periodogram.
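A minimal periodogram sketch in NumPy; the sampling rate, the noisy sinusoid, and the FFT-based evaluation on the grid f_k = k/N are illustrative assumptions, not part of the notes:

```python
import numpy as np

fs = 1000                                   # assumed sampling rate, Hz
n = np.arange(1024)
x = np.sin(2 * np.pi * 150 * n / fs) + 0.5 * np.random.randn(n.size)

# Periodogram: P_xx(f) = (1/N) |sum_n x(n) e^{-j 2 pi f n}|^2 at f_k = k/N
N = x.size
Pxx = np.abs(np.fft.rfft(x)) ** 2 / N
freqs = np.fft.rfftfreq(N, d=1 / fs)        # frequency axis in Hz

print(freqs[np.argmax(Pxx)])                # peak near the 150 Hz component
```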

Nonparametric Methods
These methods make no assumption about how the data were generated.
Bartlett Method: averaging periodograms. Three steps (a code sketch follows the steps):
1. The N-point data sequence is subdivided into K non-overlapping segments, each of length M. The K data segments are
$$x_i(n) = x(n + iM), \qquad i = 0, 1, 2, \ldots, K-1, \quad n = 0, 1, 2, \ldots, M-1$$

2. Compute the periodogram of each segment:
$$P_{xx}^{(i)}(f) = \frac{1}{M} \left| \sum_{n=0}^{M-1} x_i(n)\, e^{-j2\pi f n} \right|^2, \qquad i = 0, 1, \ldots, K-1$$
3. Average the K periodograms to obtain the PSD estimate:
$$P_{xx}^{B}(f) = \frac{1}{K} \sum_{i=0}^{K-1} P_{xx}^{(i)}(f)$$
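A minimal Bartlett sketch in NumPy; the function name bartlett_psd and the test data are illustrative assumptions:

```python
import numpy as np

def bartlett_psd(x, M):
    """Average the periodograms of K = len(x)//M non-overlapping
    length-M segments (Bartlett estimate, one-sided grid f_k = k/M)."""
    x = np.asarray(x, dtype=float)
    K = len(x) // M
    segments = x[:K * M].reshape(K, M)                    # x_i(n) = x(n + iM)
    P = np.abs(np.fft.rfft(segments, axis=1)) ** 2 / M    # per-segment periodograms
    return P.mean(axis=0)                                 # average over the K segments

# Illustrative usage: noisy sinusoid, segments of length 128.
x = np.sin(2 * np.pi * 0.2 * np.arange(4096)) + np.random.randn(4096)
Pxx_B = bartlett_psd(x, M=128)
```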

Welch Method: averaging modified periodograms. There are two modifications to the Bartlett method (a code sketch follows them):

1. Allow data segments to overlap


$$x_i(n) = x(n + iD), \qquad n = 0, 1, 2, \ldots, M-1, \quad i = 0, 1, 2, \ldots, L-1$$

Here, iD is the starting point of the i-th segment. If D = M, the segments do not overlap and L = K. If D = M/2, there is 50% overlap between successive segments and L = 2K. Alternatively, with 50% overlap we can form K data segments each of length 2M.
2. Modify the periodogram by windowing the data segment
$$\tilde{P}_{xx}^{(i)}(f) = \frac{1}{MU} \left| \sum_{n=0}^{M-1} x_i(n)\, w(n)\, e^{-j2\pi f n} \right|^2, \qquad i = 0, 1, \ldots, L-1$$
where U is the normalization factor of the window,
$$U = \frac{1}{M} \sum_{n=0}^{M-1} w^2(n)$$
The Welch PSD estimate is the average of the L modified periodograms:
$$P_{xx}^{W}(f) = \frac{1}{L} \sum_{i=0}^{L-1} \tilde{P}_{xx}^{(i)}(f)$$
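A Welch sketch following the two modifications above (NumPy only; welch_psd, the Hann window, and the test signal are illustrative assumptions; scipy.signal.welch provides a library implementation):

```python
import numpy as np

def welch_psd(x, M, D, w):
    """Average modified (windowed) periodograms of overlapping segments
    x_i(n) = x(n + iD), normalized by U = (1/M) sum_n w(n)^2."""
    x = np.asarray(x, dtype=float)
    U = np.sum(w ** 2) / M                          # window normalization factor
    L = (len(x) - M) // D + 1                       # number of segments
    P = np.zeros((L, M // 2 + 1))
    for i in range(L):
        seg = x[i * D : i * D + M] * w              # windowed data segment
        P[i] = np.abs(np.fft.rfft(seg)) ** 2 / (M * U)
    return P.mean(axis=0)                           # average the L modified periodograms

# Illustrative usage: Hann window, 50% overlap (D = M/2).
M = 256
x = np.cos(2 * np.pi * 0.1 * np.arange(8192)) + np.random.randn(8192)
Pxx_W = welch_psd(x, M, D=M // 2, w=np.hanning(M))
```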

Blackman-Tukey Method: smoothing the periodogram. The sample autocorrelation sequence is windowed first and then Fourier transformed:
$$P_{xx}^{BT}(f) = \sum_{m=-(M-1)}^{M-1} r_{xx}(m)\, w(m)\, e^{-j2\pi f m}$$
The lag window w(m) has length 2M-1 and is zero for $|m| \ge M$.
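A Blackman-Tukey sketch in NumPy; the Hann lag window, the name blackman_tukey_psd, and the frequency grid are illustrative assumptions:

```python
import numpy as np

def blackman_tukey_psd(x, M, nfft=1024):
    """Window the biased sample ACF to lags |m| < M, then Fourier
    transform it (Blackman-Tukey estimate)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # Biased sample autocorrelation r_xx(m), m = 0 .. M-1
    r = np.array([np.dot(x[:N - m], x[m:]) / N for m in range(M)])
    w = np.hanning(2 * M - 1)                       # lag window, length 2M-1
    r_windowed = np.concatenate((r[:0:-1], r)) * w  # two-sided ACF, m = -(M-1)..M-1
    m = np.arange(-(M - 1), M)
    f = np.arange(nfft) / nfft - 0.5                # normalized frequencies
    # P_xx^BT(f) = sum_m r(m) w(m) e^{-j 2 pi f m}
    P = np.real(np.exp(-2j * np.pi * np.outer(f, m)) @ r_windowed)
    return f, P
```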


Advantages of Nonparametric methods:
simple, well-understood, easy to compute
Limitations:
Long data records required
Spectral leakage
Assume periodicity of the data, which may not always be valid

Parametric Methods

No assumption that the data are periodic (or zero outside the observation window) is required
Extrapolates the autocorrelation values for lags m ≥ N (possible if some a priori information on how the data were generated is available)
A model can be constructed with a number of parameters estimated from the observed data
Eliminates the need for a window function
Better resolution due to the absence of spectral leakage
Common models: ARMA, AR, and MA
The AR model is the most widely used, due to two advantages:
1. Suitable for representing spectra with narrow peaks (resonances)
2. Results in very simple equations for AR parameters
AR Parameter Estimation

Commonly used methods are


Yule-Walker method
Burg method
Covariance method
Modified covariance method
Unconstrained least square method
Sequential estimation method
In the Yule-Walker and unconstrained least squares methods, we simply calculate the autocorrelation from the data and use it to solve for the AR parameters. The sequential estimation method is used where data are available on a continuous basis, so the estimates can be updated as new data points become available. The Burg method, on the other hand, estimates the reflection coefficients from the data and uses the Levinson recursion to obtain the AR parameters.
Yule-Walker Method: Estimate ACF from data and use the following
to solve for AR parameters

$$\begin{bmatrix} \gamma_{xx}(0) & \gamma_{xx}(1) & \cdots & \gamma_{xx}(P-1) \\ \gamma_{xx}(1) & \gamma_{xx}(0) & \cdots & \gamma_{xx}(P-2) \\ \vdots & \vdots & & \vdots \\ \gamma_{xx}(P-1) & \gamma_{xx}(P-2) & \cdots & \gamma_{xx}(0) \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_P \end{bmatrix} = -\begin{bmatrix} \gamma_{xx}(1) \\ \gamma_{xx}(2) \\ \vdots \\ \gamma_{xx}(P) \end{bmatrix}$$
with the biased ACF estimate
$$\hat{\gamma}_{xx}(m) = \frac{1}{N} \sum_{n=0}^{N-1-m} x^*(n)\, x(n+m), \qquad m \ge 0$$
The Yule-Walker power spectrum estimate is
$$P_{xx}^{YW}(f) = \frac{\hat{\sigma}_{wp}^2}{\left| 1 + \sum_{k=1}^{P} \hat{a}_P(k)\, e^{-j2\pi f k} \right|^2}$$
where the noise variance estimate is
$$\hat{\sigma}_{wp}^2 = \hat{\gamma}_{xx}(0) \prod_{k=1}^{P} \left[ 1 - \left| \hat{a}_k(k) \right|^2 \right]$$
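A Yule-Walker sketch in NumPy; yule_walker_ar and ar_psd are assumed names, the Toeplitz system is solved directly rather than via the Levinson-Durbin recursion, and the noise variance is obtained from the augmented normal equation (equivalent to the product form above):

```python
import numpy as np

def yule_walker_ar(x, P):
    """Estimate AR(P) coefficients from the biased sample ACF by
    solving the Yule-Walker normal equations."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    gamma = np.array([np.dot(x[:N - m], x[m:]) / N for m in range(P + 1)])
    R = np.array([[gamma[abs(i - j)] for j in range(P)] for i in range(P)])
    a = np.linalg.solve(R, -gamma[1:])              # a_1 ... a_P
    sigma2 = gamma[0] + np.dot(a, gamma[1:])        # prediction-error variance
    return a, sigma2

def ar_psd(a, sigma2, nfft=512):
    """P_xx(f) = sigma2 / |1 + sum_k a_k e^{-j 2 pi f k}|^2 on f_k = k/nfft."""
    A = np.fft.rfft(np.concatenate(([1.0], a)), nfft)
    return sigma2 / np.abs(A) ** 2
```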

Burg Method: The Burg method may be viewed as an order-recursive least-squares lattice method based on the minimization of the forward and backward errors in linear predictors, with the constraint that the AR parameters satisfy the Levinson recursion. The parameters are calculated from the reflection coefficients $k_1, k_2, \ldots, k_P$. The P-th order linear prediction of x(n) is
$$\hat{x}(n) = -\sum_{i=1}^{P} a_P[i]\, x(n-i)$$

The autocorrelation at 0 lag is,


$$r_{xx}(0) = \frac{1}{N} \sum_{n=0}^{N-1} \left| x(n) \right|^2$$

The first AR parameter is $a_1[1] = k_1$. For $k = 2, 3, \ldots, P$, the Levinson recursion gives
$$a_k[i] = a_{k-1}[i] + k_k\, a_{k-1}[k-i], \qquad i = 1, 2, \ldots, k-1$$
and
$$a_k[k] = k_k \qquad \text{for } i = k$$

The forward and backward linear predictors of order k are
$$\hat{x}(n) = -\sum_{i=1}^{k} a_k[i]\, x(n-i), \qquad \hat{x}(n-k) = -\sum_{i=1}^{k} a_k[i]\, x(n-k+i)$$
with forward and backward prediction errors
$$e_k^f(n) = x(n) - \hat{x}(n) = x(n) + \sum_{i=1}^{k} a_k[i]\, x(n-i)$$
$$e_k^b(n) = x(n-k) - \hat{x}(n-k) = x(n-k) + \sum_{i=1}^{k} a_k[i]\, x(n-k+i)$$
The least-squares forward and backward errors are
$$\hat{e}_k^f = \frac{1}{N-k} \sum_{n=k}^{N-1} \left| e_k^f(n) \right|^2, \qquad \hat{e}_k^b = \frac{1}{N-k} \sum_{n=k}^{N-1} \left| e_k^b(n) \right|^2$$
The prediction errors satisfy the lattice recursions
$$e_k^f(n) = e_{k-1}^f(n) + k_k\, e_{k-1}^b(n-1), \qquad e_k^b(n) = e_{k-1}^b(n-1) + k_k\, e_{k-1}^f(n)$$
with $e_0^f(n) = e_0^b(n) = x(n)$. At stage k, the quantity minimized is
$$\varepsilon_k = \frac{1}{2(N-k)} \sum_{n=k}^{N-1} \left[ \left| e_{k-1}^f(n) + k_k\, e_{k-1}^b(n-1) \right|^2 + \left| e_{k-1}^b(n-1) + k_k\, e_{k-1}^f(n) \right|^2 \right]$$

Differentiating $\varepsilon_k$ with respect to $k_k$ and solving for $k_k$, we obtain
$$k_k = \frac{-2 \sum_{n=k}^{N-1} e_{k-1}^f(n)\, e_{k-1}^b(n-1)}{\sum_{n=k}^{N-1} \left[ \left| e_{k-1}^f(n) \right|^2 + \left| e_{k-1}^b(n-1) \right|^2 \right]}$$
The Burg power spectrum estimate is
$$P_{xx}^{BU}(f) = \frac{\hat{E}_P}{\left| 1 + \sum_{k=1}^{P} \hat{a}_P(k)\, e^{-j2\pi f k} \right|^2}$$
where $\hat{E}_P = \hat{e}_P^f + \hat{e}_P^b$ is the total least-squares error.
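A compact Burg sketch in NumPy; burg_ar is an assumed name, the reflection coefficients and Levinson update follow the recursions above, and the returned error power is updated as E_k = E_{k-1}(1 - k_k^2) starting from r_xx(0), standing in for the prediction-error power in the spectrum formula:

```python
import numpy as np

def burg_ar(x, P):
    """Estimate AR(P) coefficients with the Burg recursion on the
    forward/backward prediction errors."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    ef = x.copy()                        # forward errors  e_k^f(n)
    eb = x.copy()                        # backward errors e_k^b(n)
    a = np.zeros(0)                      # AR coefficients a_k[1..k]
    E = np.dot(x, x) / N                 # error power, starts at r_xx(0)
    for k in range(1, P + 1):
        efk = ef[k:]                     # e_{k-1}^f(n),   n = k .. N-1
        ebk = eb[k - 1:N - 1]            # e_{k-1}^b(n-1), n = k .. N-1
        kk = -2.0 * np.dot(efk, ebk) / (np.dot(efk, efk) + np.dot(ebk, ebk))
        # Levinson recursion: a_k[i] = a_{k-1}[i] + kk * a_{k-1}[k-i], a_k[k] = kk
        a = np.concatenate((a, [0.0])) + kk * np.concatenate((a[::-1], [1.0]))
        # Lattice update of the prediction errors
        ef_new = efk + kk * ebk
        eb_new = ebk + kk * efk
        ef[k:], eb[k:] = ef_new, eb_new
        E *= (1.0 - kk ** 2)
    return a, E
```

The PSD then follows from the same AR spectrum formula as in the Yule-Walker case, e.g. with the ar_psd helper sketched earlier.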

Comparison of methods


Burg AR Estimator
Characteristics: Does not apply a window to the data. Minimizes the forward and backward prediction errors in the least-squares sense, with the AR coefficients constrained to satisfy the Levinson-Durbin recursion.
Advantages: Always produces a stable model.

Covariance AR Estimator
Characteristics: Does not apply a window to the data. Minimizes the forward prediction error in the least-squares sense.
Disadvantages: May produce unstable models.
Conditions for nonsingularity: Order must be less than or equal to half the input frame size.

Modified Covariance AR Estimator
Characteristics: Does not apply a window to the data. Minimizes the forward and backward prediction errors in the least-squares sense.
Disadvantages: May produce unstable models.
Conditions for nonsingularity: Order must be less than or equal to 2/3 of the input frame size.

Yule-Walker AR Estimator
Characteristics: Applies a window to the data. Minimizes the forward prediction error in the least-squares sense (also called the "autocorrelation method").
Advantages: Always produces a stable model.
Disadvantages: Performs relatively poorly for short data records.
Conditions for nonsingularity: Because of the biased ACF estimate, the autocorrelation matrix is guaranteed to be positive definite, hence nonsingular.
