
Stochastic processes

Lecture 6
Power spectral density (PSD)

Random process

1st order distribution & density function
  First-order distribution
  First-order density function

2nd order distribution & density function
  2nd order distribution
  2nd order density function

Expectations
  Expected value
  The autocorrelation

Some random processes
  Single pulse
  Multiple pulses
  Periodic Random Processes
  The Gaussian Process
  The Poisson Process
  Bernoulli and Binomial Processes
  The Random Walk and Wiener Process
  The Markov Process

Single pulse
Single pulse with random amplitude and arrival time:

Deterministic pulse:
  S(t): deterministic function
Random variables:
  A: gain, a random variable
  τ: arrival time

X(t) = A S(t - τ)

[Figure: three example realizations of the nerve spike pulse, amplitude vs. t (ms)]

A and τ are statistically independent.
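A minimal Matlab sketch of one realization of X(t) = A S(t - τ); the pulse shape and the distributions of A and τ are assumptions for illustration, not given on the slides:

% One realization of a single pulse with random amplitude and arrival time
t   = 0:0.01:2.5;                          % time axis in ms (assumed)
S   = @(u) exp(-(u/0.1).^2) .* (u >= 0);   % assumed deterministic pulse shape S(t)
A   = randn();                             % random gain (assumed Gaussian)
tau = 2.5*rand();                          % random arrival time, uniform on [0, 2.5] ms (assumed)
X   = A * S(t - tau);                      % X(t) = A S(t - tau)
plot(t, X), xlabel('t (ms)'), ylabel('Amplitude')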

Multiple pulses
Multiple pulses with random amplitudes and arrival times:

X(t) = Σ Ak S(t - τk), summed over k = 1, ..., n

Deterministic pulse:
  S(t): deterministic function
Random variables:
  Ak: gain, a random variable
  τk: arrival time
  n: number of pulses

[Figure: three example realizations with multiple nerve spikes, amplitude vs. t (ms)]

Ak and τk are statistically independent.
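A sketch building on the single-pulse example above; the number of pulses and the distributions are again assumptions:

% One realization of X(t) = sum over k of Ak S(t - tau_k)
t   = 0:0.01:2.5;
S   = @(u) exp(-(u/0.1).^2) .* (u >= 0);   % same assumed pulse shape
n   = 5;                                   % assumed number of pulses
A   = randn(1, n);                         % random gains
tau = 2.5*rand(1, n);                      % random arrival times
X   = zeros(size(t));
for k = 1:n
    X = X + A(k) * S(t - tau(k));          % superpose the k-th pulse
end
plot(t, X), xlabel('t (ms)'), ylabel('Amplitude')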

Periodic Random Processes

A process which is periodic with period T:
X(t) = X(t + nT), where n is an integer.

[Figure: one realization of a periodic signal X(t) with T = 100, shown for t = 0 to 1000]
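A minimal sketch of a periodic random process: one random segment of length T is drawn and then repeated, so X(t) = X(t + nT); the segment distribution is an assumption:

% Periodic random process: a random segment repeated with period T
T   = 100;                      % period in samples
seg = randn(1, T);              % random waveform, drawn once per realization (assumed Gaussian)
X   = repmat(seg, 1, 10);       % X(t) = X(t + nT) for integer n
plot(X), xlabel('t'), ylabel('X(t)')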

The Gaussian Process


X(t1), X(t2), X(t3), ..., X(tn) are jointly Gaussian for all t and n values.
Example: randn() in Matlab

[Figure: a realization of a Gaussian process generated with randn(), and the histogram of its samples]
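A short Matlab sketch reproducing this kind of figure; the sample count and bin count are assumptions:

% Gaussian process realization and histogram
N = 10000;
X = randn(1, N);                          % jointly Gaussian samples
subplot(2,1,1), plot(X), xlabel('n'), ylabel('X[n]'), title('Gaussian process')
subplot(2,1,2), hist(X, 50), xlabel('X'), title('Histogram of Gaussian process')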

The Poisson Process

Typically used for modeling the cumulative number of events over time.
Example: counting the number of phone calls from a phone.

Alternative definition: Poisson points.
The number of events in an interval: N(t1, t2).
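A sketch of a Poisson counting process built from exponential inter-arrival times; the rate is an assumption:

% Poisson process: cumulative number of events over time
lambda   = 2;                                       % assumed event rate (events per second)
nEvents  = 50;
arrivals = cumsum(-log(rand(nEvents, 1)) / lambda); % exponential inter-arrival times
t = 0:0.01:arrivals(end);
N = arrayfun(@(tt) sum(arrivals <= tt), t);         % N(0, t): number of events up to time t
stairs(t, N), xlabel('t (s)'), ylabel('N(t)')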

Bernoulli Processes
A process of zeros and ones:
X = [0 0 1 1 0 1 0 0 1 1 1 0]
The samples must be independent and identically distributed Bernoulli variables.
The probability of 1 is p; the probability of 0 is q = 1 - p.

Binomial process
A summed Bernoulli process:
Y[n] = X[1] + X[2] + ... + X[n]
where X[n] is a Bernoulli process.
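A minimal sketch of both processes; p and the length are assumptions:

% Bernoulli process and its running sum (a binomial process)
p = 0.5;
N = 50;
X = rand(1, N) < p;       % Bernoulli: 1 with probability p, 0 with probability q = 1 - p
Y = cumsum(X);            % binomial process: Y[n] = X[1] + ... + X[n]
stairs(Y), xlabel('n'), ylabel('Y[n]')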

Random walk
Every T seconds take a step of fixed size to the left or right after tossing a fair coin.

[Figure: several random-walk realizations, x[n] vs. n]
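A sketch of one realization with unit step size; the step size and length are assumptions:

% Random walk: steps from fair coin tosses
N     = 50;
steps = 2*(rand(1, N) < 0.5) - 1;   % fair coin: +1 or -1 with equal probability
x     = cumsum(steps);              % position after each step
stairs(x), xlabel('n'), ylabel('x[n]')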

The Markov Process

1st order Markov process:
The current sample depends only on the previous sample.

Density function: f(xn | xn-1, ..., x1) = f(xn | xn-1)
Expected value: E[xn | xn-1, ..., x1] = E[xn | xn-1]
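As a sketch, a first-order autoregressive sequence is one common example of a 1st order Markov process; the coefficient and noise distribution are assumptions:

% AR(1) sequence: x[n] = a*x[n-1] + w[n], a 1st order Markov process
a = 0.9;
N = 500;
w = randn(1, N);
x = zeros(1, N);
for n = 2:N
    x(n) = a*x(n-1) + w(n);   % the current sample depends only on the previous one (plus new noise)
end
plot(x), xlabel('n'), ylabel('x[n]')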

The frequency of earthquakes

Statement: the number of large earthquakes has increased dramatically in the last 10 years!

The frequency of earthquakes

Is the frequency of large earthquakes unusually high?
Which random processes can we use for modeling the earthquake frequency?

The frequency of earthquakes

Data:
http://earthquake.usgs.gov/earthquakes/eqarchives/year/graphs.php

Agenda (Lec 16)


Power spectral density
Definition and background
Wiener-Khinchin
Cross spectral densities
Practical implementations
Examples


Fourier transform recap 1


Transform between time and frequency domain.

[Figure: a signal s(t) and its Fourier spectrum S(f), connected by the Fourier transform and the inverse Fourier transform]
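A small Matlab sketch of the transform pair using the DFT; the test signal is an assumption:

% Forward and inverse transform of a sampled signal
fs = 1000;
t  = 0:1/fs:1-1/fs;
s  = cos(2*pi*5*t);            % example signal
S  = fft(s);                   % transform to the frequency domain
s2 = ifft(S);                  % inverse transform back to the time domain
max(abs(s - s2))               % numerically zero: the signal is recovered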

Fourier transform recap 2


Assumption: the signal can be reconstructed from sine and cosine functions.

[Figure: the real and imaginary parts of e^(jωn) plotted against n]

Requirement: the signal x[n] must be absolutely integrable, i.e. Σn |x[n]| < ∞.

Fourier transform of a stochastic process

A stationary stochastic process is typically not absolutely integrable.
Therefore the signal is truncated:
XT(f) = ∫ x(t) e^(-j2πft) dt, integrated over -T ≤ t ≤ T

[Figure: a realization x(t) truncated to the interval from -T to T before the Fourier transform]

What is power?
In the power spectral density, "power" is related to electrical power.

Power of a signal

The power of a signal is calculated by squaring the signal.

The average power in a period T is:
P = (1/T) ∫ x(t)² dt, integrated over one period
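A quick numerical sketch of the average power over one period; the signal is an assumption:

% Average power of a periodic signal over one period
fs = 1000;  T = 1;                 % assumed sampling rate and period length (s)
t  = 0:1/fs:T-1/fs;
x  = 2*cos(2*pi*5*t);              % example periodic signal
P  = mean(x.^2)                    % average power = (1/T) * integral of x(t)^2, here A^2/2 = 2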

Parseval's theorem
The integral of the squared absolute Fourier transform is equal to the power of the signal:
∫ |x(t)|² dt = ∫ |X(f)|² df
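A numerical check of the theorem for a sampled signal, using the DFT form sum|x[n]|² = (1/N) sum|X[k]|²; the signal is an assumption:

% Parseval's theorem for a sampled signal
fs = 1000;
t  = 0:1/fs:1-1/fs;
x  = 2*cos(2*pi*5*t) + randn(size(t));
N  = length(x);
P_time = sum(x.^2);                 % power computed in the time domain
P_freq = sum(abs(fft(x)).^2) / N;   % the same value from the squared Fourier magnitude
[P_time, P_freq]                    % the two numbers agree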

Power of a stochastic process
Thereby the expected power can be calculated from the Fourier spectrum.

Power spectrum density

Since the integral of the squared absolute Fourier transform contains the full power of the signal, it is a density function.

So the power spectral density of a random process is:
Sxx(f) = lim(T→∞) E[|XT(f)|²] / (2T)

Due to the absolute value the PSD is always real.
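A minimal sketch estimating Sxx(f) from the squared magnitude of the FFT of one realization; the signal and scaling convention are assumptions:

% PSD estimate from |X(f)|^2 (a raw periodogram)
fs = 1000;
t  = 0:1/fs:1-1/fs;
x  = cos(2*pi*50*t) + 0.5*randn(size(t));
N  = length(x);
Sxx = abs(fft(x)).^2 / (N*fs);      % squared Fourier magnitude, scaled to a density
f   = (0:N-1)*fs/N;                 % frequency axis
plot(f(1:N/2), Sxx(1:N/2)), xlabel('f (Hz)'), ylabel('Sxx(f)')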

PSD Example

[Figure: a signal s(t), its Fourier spectrum |X(f)|, and the resulting PSD Sxx(f) = |X(f)|²]

Power spectrum density

The PSD is a density function.
In the case of the random process the PSD is the density function of the random process and not necessarily the frequency spectrum of a single realization.

Example
A random process is defined as a sinusoid with a random phase r,
where r is a uniformly distributed random variable with a range from 0 to 2π.
What is the PSD of the process, and
what is the power spectrum of a single realization?

PSD of random process versus spectrum of deterministic signals

In the case of the random process the PSD is usually the expected value E[Sxx(f)].
In the case of deterministic signals the PSD is exact (though there is still estimation error).
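A sketch of E[Sxx(f)] approximated by averaging periodograms over many realizations of a random-phase cosine; the process parameters are assumptions:

% Expected PSD approximated by averaging over realizations
fs = 100;  N = 1000;  K = 200;
t  = (0:N-1)/fs;
S  = zeros(1, N);
for k = 1:K
    r = 2*pi*rand();                      % random phase, uniform on [0, 2*pi]
    x = cos(2*pi*5*t + r);                % one realization of the process
    S = S + abs(fft(x)).^2 / (N*fs);      % accumulate the periodograms
end
S = S / K;                                % the average approximates E[Sxx(f)]
f = (0:N-1)*fs/N;
plot(f(1:N/2), S(1:N/2)), xlabel('f (Hz)'), ylabel('estimate of E[Sxx(f)]')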

Properties of the PSD

1. Sxx(f) is real and nonnegative.
2. The average power in X(t) is given by:
   E[X(t)²] = ∫ Sxx(f) df
3. If X(t) is real, Rxx(τ) and Sxx(f) are also even.
4. If X(t) has periodic components, Sxx(f) has impulses.
5. Independent of phase.

Wiener-Khinchin 1
If X(t) is stationary in the wide sense, the PSD is the Fourier transform of the autocorrelation:
Sxx(f) = ∫ Rxx(τ) e^(-j2πfτ) dτ

Proof: page 175

Wiener-Khinchin
Two methods for estimation of the PSD:
1. X(t) → Fourier transform → X(f) → |X(f)|² → Sxx(f)
2. X(t) → autocorrelation → Rxx(τ) → Fourier transform → Sxx(f)
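A sketch comparing the two routes numerically; the match relies on zero-padding the FFT to the full lag length, and the signal is an assumption:

% Two routes to the PSD: |X(f)|^2 versus the FFT of the biased autocorrelation
N  = 256;
x  = randn(1, N);
S1 = abs(fft(x, 2*N-1)).^2 / N;     % route 1: squared Fourier magnitude (zero-padded)
r  = xcorr(x, 'biased');            % route 2: biased autocorrelation, lags -(N-1)..(N-1)
S2 = real(fft(ifftshift(r)));       % FFT of Rxx[m] with lag 0 moved to the first position
max(abs(S1 - S2))                   % numerically zero: both routes give the same PSD estimate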

The inverse Fourier Transform of the PSD

Since the PSD is the Fourier transformed autocorrelation,
the inverse Fourier transform of the PSD is the autocorrelation:
Rxx(τ) = ∫ Sxx(f) e^(j2πfτ) df

Cross spectral densities

If X(t) and Y(t) are two jointly wide-sense stationary processes, the cross spectral density is:
Sxy(f) = ∫ Rxy(τ) e^(-j2πfτ) dτ
or
Syx(f) = ∫ Ryx(τ) e^(-j2πfτ) dτ

Properties of Cross spectral densities

1. Since Rxy(τ) = Ryx(-τ), Sxy(f) is the complex conjugate of Syx(f).
2. Syx(f) is not necessarily real.
3. If X(t) and Y(t) are orthogonal, Sxy(f) = 0.
4. If X(t) and Y(t) are independent, Sxy(f) = E[X(t)] E[Y(t)] δ(f).

Cross spectral densities example

1 Hz sine curves in white noise, where w(t) is Gaussian noise.

[Figure: the signals X(t) and Y(t) over t (s), and their Welch cross power spectral density estimate, power/frequency (dB/Hz) vs. frequency (Hz)]
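A Matlab sketch along these lines using cpsd; the signal details and Welch parameters are assumptions:

% Cross spectral density of two noisy 1 Hz sines (Welch estimate)
fs = 50;
t  = 0:1/fs:20-1/fs;
x  = sin(2*pi*1*t) + randn(size(t));   % X(t): 1 Hz sine in white Gaussian noise
y  = sin(2*pi*1*t) + randn(size(t));   % Y(t): the same sine with independent noise
[Pxy, f] = cpsd(x, y, hamming(256), 128, 512, fs);
plot(f, 10*log10(abs(Pxy)))
xlabel('Frequency (Hz)'), ylabel('Power/frequency (dB/Hz)')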

Implementation issues
The challenges include:
  Finite signals
  Discrete time

The periodogram
The periodogram is an estimate of the PSD.
The PSD can be estimated from the autocorrelation:
Sxx(f) = Σm Rxx[m] e^(-j2πfm)
or directly from the signal:
Sxx(f) = |X(f)|² / N
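The direct route in Matlab with the built-in periodogram function; the test signal is an assumption:

% Periodogram estimate of the PSD directly from the signal
fs = 1000;
t  = 0:1/fs:1-1/fs;
x  = cos(2*pi*50*t) + randn(size(t));
[Pxx, f] = periodogram(x, [], [], fs);   % default (rectangular) window
plot(f, 10*log10(Pxx)), xlabel('f (Hz)'), ylabel('PSD (dB/Hz)')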

The discrete version of the autocorrelation

Rxx(τ) = E[X(t) X(t+τ)]  →  Rxx[m]
τ = mT, where m is an integer and T is the sample interval.

N: number of samples
Normalized version:
Rxx[m] = (1/N) Σn x[n] x[n+m]
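A direct implementation of the normalized (1/N) estimate as a sketch; the signal and maximum lag are assumptions:

% Discrete autocorrelation estimate, normalized by N
N = 100;
x = randn(1, N);
maxlag = 20;
Rxx = zeros(1, maxlag + 1);
for m = 0:maxlag
    Rxx(m+1) = sum(x(1:N-m) .* x(1+m:N)) / N;   % (1/N) * sum of x[n]*x[n+m]
end
stem(0:maxlag, Rxx), xlabel('m'), ylabel('Rxx[m]')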

Bias in the estimates of the autocorrelation

[Figure: with N = 12 samples, the sequences used in the autocorrelation estimate plotted against n and n+m for the lags m = -10, m = 0 and m = 4]

Bias in the estimates of the autocorrelation

The edge effect corresponds to multiplying the true autocorrelation with a Bartlett window.

[Figure: the Bartlett window w[m] for m = -15 to 15]

Alternative estimation of autocorrelation

The unbiased estimate:
Rxx[m] = (1/(N-|m|)) Σn x[n] x[n+m]

[Figure: biased and unbiased autocorrelation estimates Rxx[m] for m = -15 to 15]

Disadvantage: high variance when |m| approaches N.
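The same comparison with Matlab's xcorr as a sketch; the signal is an assumption:

% Biased versus unbiased autocorrelation estimates
N = 100;
x = randn(1, N);
[rb, lags] = xcorr(x, 15, 'biased');     % divides by N
[ru, ~]    = xcorr(x, 15, 'unbiased');   % divides by N - |m|, higher variance at large |m|
plot(lags, rb, lags, ru)
xlabel('m'), ylabel('Rxx[m]'), legend('Biased', 'Unbiased')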

Influence on the power spectrum

Biased version: a Bartlett window is applied.
Unbiased version: a rectangular window is applied.

Example

[Figure: biased and unbiased autocorrelation estimates Rxx[m], and the PSDs Sxx(f) estimated from each]

Variance in the PSD

The variance of the periodogram is approximately the PSD raised to the power of two:
var[Sxx(f)] ≈ Sxx(f)²

[Figure: the true PSD, three realizations x(t), and the periodogram of each realization, f (Hz)]

Averaging
Divide the signal into K segments of length M.
Calculate the periodogram of each segment.
Calculate the average periodogram.
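A sketch of this segment-averaging idea (Bartlett-style, without overlap); all parameters are assumptions:

% Averaged periodogram from K non-overlapping segments of length M
fs = 200;
t  = 0:1/fs:10-1/fs;
x  = cos(2*pi*10*t) + randn(size(t));
M  = 200;  K = floor(length(x)/M);
S  = zeros(1, M);
for k = 1:K
    seg = x((k-1)*M+1 : k*M);
    S   = S + abs(fft(seg)).^2 / (M*fs);   % periodogram of segment k
end
S = S / K;                                 % average of the K periodograms
f = (0:M-1)*fs/M;
plot(f(1:M/2), S(1:M/2)), xlabel('f (Hz)'), ylabel('Sxx(f) estimate')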

Illustrations of Averaging

[Figure: a realization X(t) divided into segments, the periodogram of each segment, and the averaged periodogram vs. f (Hz)]

Effect of Averaging
The variance is decreased,
but the spectral resolution is also decreased.

Additional options
The Welch method:
  Introduce overlap between segments, where Q is the distance between the segments.
  Multiply the segments with windows.
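A Matlab sketch with pwelch, which combines overlap and windowing; the window, overlap and signal are assumptions:

% Welch PSD estimate with overlapping, windowed segments
fs = 200;
t  = 0:1/fs:10-1/fs;
x  = cos(2*pi*10*t) + randn(size(t));
[Pxx, f] = pwelch(x, hamming(256), 128, 512, fs);   % Hamming window, 50% overlap
plot(f, 10*log10(Pxx)), xlabel('f (Hz)'), ylabel('PSD (dB/Hz)')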

Example
Heart rate variability
http://circ.ahajournals.org/cgi/content/full/93/5/1043#F3
The high frequency component is related to the parasympathetic nervous system ("rest and digest").
The low frequency component is related to the sympathetic nervous system (fight or flight).

Agenda (Lec 16)


Power spectral density
Definition and background
Wiener-Khinchin
Cross spectral densities
Practical implementations
Examples

