
Stochastic Processes & Stochastic Signal Processing

(Yates’ Chapters 6 & 10)

• Definition of a Stochastic Process

• Basic Properties of Stochastic Processes

• Filtering of Stochastic Processes

July 21, 2005    Profs. W. Zhuang & K. T. Wong


ktwong@ieee.org
Definition of a Stochastic (Random) Process

Random Process: a mapping from an outcome e_i of an experiment to a real-valued time function (i.e., a waveform):

e_i  ⎯⎯ X(t, ·) ⎯⎯→  X(t, e_i) = x_i(t), a real-valued waveform

e.g., X(t, e_1) = x_1(t),  X(t, e_2) = x_2(t),  X(t, e_3) = x_3(t),  X(t, e_4) = x_4(t)

[Figure: each outcome e_1, …, e_4 in the sample space S is mapped, through X(t, ·), to its own waveform x_i(t) plotted versus t.]

S  ⎯⎯ X(t, ·) ⎯⎯→  R², a two-dimensional signal space (x, t)

Interpretations:
A random process X(t, e) is
− a family of deterministic functions, where both t and e are variables: X(t, e) = {x(t, e_i) : e_i ∈ S}
− a random variable at t = t_0: X = X(t_0, e), where e varies depending on the outcome of a particular trial.
− a single time function (a sample of the given process) if e is fixed: X(t, e) = x(t), where t is variable.
− a real number if both t and e are fixed.

[Figure: the ensemble x(t, e) plotted over the (t, e) plane, with one sample waveform per outcome e_1, …, e_n.]

X(t) is commonly used as shorthand for X(t, e).
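The four interpretations above can be seen concretely in a small numerical sketch (not from the slides; the random-phase sinusoid and all constants below are illustrative assumptions):

```python
# A minimal sketch of the four views of X(t, e), using the assumed example
# process X(t) = A*cos(w_c*t + Theta) with Theta uniform on [0, 2*pi).
import numpy as np

rng = np.random.default_rng(0)
A, w_c = 1.0, 2 * np.pi          # assumed illustrative constants
t = np.linspace(0.0, 2.0, 201)   # time axis

# 1) Family of deterministic functions: one waveform per outcome e_i
thetas = rng.uniform(0.0, 2 * np.pi, size=5)          # five outcomes e_1..e_5
ensemble = np.array([A * np.cos(w_c * t + th) for th in thetas])

# 2) Random variable at fixed t = t_0: one column of the ensemble
x_at_t0 = ensemble[:, 50]

# 3) Single time function for fixed e: one row of the ensemble
sample_path = ensemble[0]

# 4) A real number when both t and e are fixed
one_value = ensemble[0, 50]

print(ensemble.shape, x_at_t0.shape, sample_path.shape, float(one_value))
```

Rows of `ensemble` index the outcome e, columns index time t, which is exactly the (x, t) signal-space picture above.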

Example: X(t) = A cos(ω_c t + Θ), where A and ω_c are deterministic scalar constants, and Θ is a random variable, uniformly distributed on [0, 2π).

Θ = θ, a deterministic scalar constant ⇒ X(t) = A cos(ω_c t + θ) = x(t) is a deterministic waveform.

t = t_0 ⇒ X(t_0) = A cos(ω_c t_0 + Θ) = X is a random scalar variable.

F_X(x) = P(X ≤ x) = P(a ≤ Θ ≤ b). What are a and b?

If X = x, then A cos(ω_c t_0 + θ) = x at θ = a or θ = b:
a = arccos(x/A) − ω_c t_0 ∈ [0, 2π),  b = 2π − arccos(x/A) − ω_c t_0 ∈ [0, 2π)

F_X(x) = ∫_a^b f_Θ(θ) dθ = (1/2π)(b − a) = 1 − (1/π) arccos(x/A), ∀x ∈ (−A, A)

f_X(x) = d F_X(x)/dx = 1/(π √(A² − x²)), ∀x ∈ (−A, +A)

[Figure: one period of A cos(ω_c t_0 + θ) versus θ + ω_c t_0, showing that the level x is crossed at cos⁻¹(x/A) and at 2π − cos⁻¹(x/A).]
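A quick Monte-Carlo sketch (not part of the slides; the constants A, ω_c, t_0 below are arbitrary assumptions) can check the derived CDF F_X(x) = 1 − (1/π) arccos(x/A):

```python
# Sample X = A*cos(w_c*t0 + Theta) with Theta ~ Uniform[0, 2*pi) and compare
# the empirical CDF against the closed form derived above.
import numpy as np

rng = np.random.default_rng(1)
A, w_c, t0 = 2.0, 3.0, 0.7      # arbitrary assumed constants
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)
X = A * np.cos(w_c * t0 + theta)

xs = np.linspace(-0.9 * A, 0.9 * A, 7)
F_emp = np.array([(X <= x).mean() for x in xs])
F_theory = 1.0 - np.arccos(xs / A) / np.pi
max_err = np.abs(F_emp - F_theory).max()
print("max |F_emp - F_theory| =", max_err)   # small, O(1/sqrt(N))
```

The agreement is insensitive to t_0, a first hint of the stationarity discussed later.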
Description of Random Processes
− analytical description:
  X(t): {x(t, e) : e ∈ S} + probability information about S
− statistical description:
  Definition: A "complete" statistical description of a random process X(t) is: for any integer n and any choice of sampling instants t_1, t_2, …, t_n, the joint PDF f_{X(t_1), X(t_2), …, X(t_n)}(x_1, x_2, …, x_n) of (X(t_1), X(t_2), …, X(t_n)) is given.

Statistical averages
− mean or expectation: m_X(t)

[Figure: three sample waveforms x(t, e_1), x(t, e_2), x(t, e_3), with the deterministic mean curve m_X(t) passing through m(t_1) = E(X(t_1)) and m(t_2) = E(X(t_2)).]

Figure. The mean of a random process.

Definition: The mean (a.k.a. statistical expectation) of a random process X(t) is a deterministic function of time m_X(t) that, at each time instant t_0, equals the mean of the random variable X(t_0). That is, m_X(t) = E(X(t)) for all t.

At t = t_0, X(t_0) is defined by a PDF f_{X(t_0)}(x):

m_X(t_0) = E(X(t_0)) = ∫_{−∞}^{∞} x f_{X(t_0)}(x) dx

Example: Find the mean of the random process X(t) = A cos(ω_c t + Θ), where A is a non-random scalar and Θ is a random variable uniformly distributed on [0, 2π), i.e.,

f_Θ(θ) = 1/(2π) for 0 ≤ θ < 2π, and 0 otherwise

⇒ m_X(t) = E(X(t)) = E[A cos(ω_c t + Θ)] = ∫_0^{2π} A cos(ω_c t + θ) f_Θ(θ) dθ = 0.
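The ensemble average in this example can be checked numerically; the sketch below (not from the slides; constants are assumed for illustration) averages many realizations at each t:

```python
# Empirical mean of X(t) = A*cos(w_c*t + Theta) across 100,000 realizations:
# it should be near 0 at every time instant, matching m_X(t) = 0.
import numpy as np

rng = np.random.default_rng(2)
A, w_c = 1.5, 2 * np.pi                      # assumed constants
t = np.linspace(0.0, 1.0, 50)
theta = rng.uniform(0.0, 2 * np.pi, size=(100_000, 1))
ensemble = A * np.cos(w_c * t + theta)       # rows: realizations, cols: time
m_hat = ensemble.mean(axis=0)                # ensemble average at each t
print("largest |m_hat(t)| =", np.abs(m_hat).max())
```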

Correlation Function:
Definition: The autocorrelation function R_X(t_1, t_2) of a random process X(t):
R_X(t_1, t_2) = E[X(t_1) X(t_2)].
R_X(t_1, t_2) describes the inter-dependence of two random variables obtained by observing a random process X(t) at t_1 and t_2. The more rapidly each realization X(t, e) changes with time, the more rapidly R_X(t_1, t_2) decreases from its maximum value R_X(t_1, t_1) as |t_2 − t_1| increases.

Example
For a non-random A and a random Θ uniformly distributed over [0, 2π), the autocorrelation function of the random process X(t) = A cos(ω_c t + Θ) is:
R_X(t_1, t_2) = E[A cos(ω_c t_1 + Θ) · A cos(ω_c t_2 + Θ)]
             = A² E{ (1/2) cos[ω_c(t_1 − t_2)] + (1/2) cos[ω_c(t_1 + t_2) + 2Θ] }
             = (A²/2) cos[ω_c(t_1 − t_2)] + (A²/2) ∫_0^{2π} cos[ω_c(t_1 + t_2) + 2θ] (1/2π) dθ
             = (A²/2) cos[ω_c(t_1 − t_2)]     (the remaining integral equals 0)
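The closed form R_X(t_1, t_2) = (A²/2) cos[ω_c(t_1 − t_2)] can be verified by Monte-Carlo (a sketch with assumed constants, not part of the slides):

```python
# Estimate E[X(t1) X(t2)] for the random-phase sinusoid and compare with
# (A^2/2) * cos(w_c * (t1 - t2)).
import numpy as np

rng = np.random.default_rng(3)
A, w_c = 1.0, 2 * np.pi                      # assumed constants
t1, t2 = 0.3, 0.8                            # w_c*(t1 - t2) = -pi
theta = rng.uniform(0.0, 2 * np.pi, size=500_000)
X1 = A * np.cos(w_c * t1 + theta)
X2 = A * np.cos(w_c * t2 + theta)
R_hat = (X1 * X2).mean()
R_theory = (A**2 / 2) * np.cos(w_c * (t1 - t2))
print(R_hat, R_theory)
```

Note the estimate depends on t_1 and t_2 only through t_1 − t_2, consistent with wide-sense stationarity.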

Stationary Stochastic Processes
Definition: A random process X(t) is wide-sense stationary (WSS) if m_X(t) = constant and R_X(t, t − τ) = R_X(τ).
For example, the random process in the preceding examples is WSS. ECE 316 will be concerned almost exclusively with WSS processes.

Properties of R_X(τ):
(1) R_X(τ) = R_X(−τ) if X(t) is real-valued.
Proof: R_X(τ) = R_X(t, t − τ) = E[X(t) X(t − τ)] = E[X(t − τ) X(t)] = R_X(−τ).

(2) |R_X(τ)| ≤ R_X(0).
Proof: E[(X(t) ± X(t − τ))²] ≥ 0
⇒ E[X²(t) + X²(t − τ) ± 2 X(t) X(t − τ)] ≥ 0
⇒ E[X²(t)] + E[X²(t − τ)] ± 2 E[X(t) X(t − τ)] ≥ 0
⇒ R_X(0) + R_X(0) ± 2 R_X(τ) ≥ 0 ⇒ ∓R_X(τ) ≤ R_X(0) ⇒ |R_X(τ)| ≤ R_X(0)

(3) R_X(0) is the average power of X(t).
Proof: The average power of X(t) is
P = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt ] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X²(t)] dt
  = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(0) dt = R_X(0)
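Properties (1) and (2) can be observed directly on the autocorrelation R_X(τ) = (A²/2) cos(ω_c τ) from the running example (a small sketch with assumed constants, not part of the slides):

```python
# Check that R_X(tau) = (A^2/2)*cos(w_c*tau) is even and bounded by R_X(0).
import numpy as np

A, w_c = 2.0, 5.0                             # assumed constants
tau = np.linspace(-3.0, 3.0, 601)             # symmetric grid including tau = 0
R = (A**2 / 2) * np.cos(w_c * tau)

even_ok = np.allclose(R, R[::-1])             # property (1): R_X(tau) = R_X(-tau)
bound_ok = bool(np.all(np.abs(R) <= (A**2 / 2) + 1e-12))  # property (2)
print(even_ok, bound_ok)
```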

Power & Power Spectral Density (PSD) of a Deterministic Signal x(t)

The time-average power of x(t) is

P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} [x(t)]² dt = lim_{T→∞} (1/T) ∫_{−∞}^{∞} [x(t) rect(t/T)]² dt
  = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |F[x(t) rect(t/T)]|² df
  = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |X_T(f)|² df = ∫_{−∞}^{∞} lim_{T→∞} (1/T) |X_T(f)|² df = ∫_{−∞}^{∞} S_x(f) df

where rect(t/T) = 1 for −T/2 ≤ t ≤ T/2, and 0 otherwise.

The identity lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} [x(t)]² dt = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |X_T(f)|² df is called Parseval's theorem.

S_x(f) = lim_{T→∞} (1/T) |X_T(f)|² is called the PSD of x(t).

F⁻¹{S_x(f)} = ∫_{−∞}^{∞} S_x(f) e^{j2πfτ} df = ∫_{−∞}^{∞} lim_{T→∞} (1/T) |X_T(f)|² e^{j2πfτ} df
 = lim_{T→∞} (1/T) ∫_{−∞}^{∞} [∫_{−T/2}^{T/2} x(s) e^{−j2πfs} ds] [∫_{−T/2}^{T/2} x(t) e^{−j2πft} dt]* e^{j2πfτ} df
 = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) ∫_{−T/2}^{T/2} x(s) [∫_{−∞}^{∞} e^{j2πf(τ − s + t)} df] ds dt
 = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) ∫_{−T/2}^{T/2} x(s) δ(τ − s + t) ds dt
 = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(τ + t) x(t) dt, the time-average autocorrelation function of x(t)
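Parseval's theorem has an exact discrete counterpart that can be checked with the FFT (a sketch, not from the slides; with NumPy's FFT convention the identity is Σ|x[n]|² = (1/N) Σ|X[k]|²):

```python
# Discrete Parseval check: time-domain energy equals (1/N)-scaled frequency-
# domain energy under numpy.fft's unnormalized-forward-transform convention.
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(1024)
Xf = np.fft.fft(x)

time_energy = np.sum(x**2)
freq_energy = np.sum(np.abs(Xf)**2) / len(x)
print(time_energy, freq_energy)
```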

Power and PSD of a Random Signal X(t)

For a random signal X(t), its time-averaged power lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt is a random variable.

The statistically-averaged and time-averaged power of X(t) is defined by

P = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt ] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X²(t)] dt

If X(t) is WSS, E[X²(t)] = R_X(0) = constant, so P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(0) dt = R_X(0).

The PSD of X(t) is defined as S_X(f) = E[ lim_{T→∞} (1/T) |F[X(t) rect(t/T)]|² ].

Hence, P = ∫_{−∞}^{∞} S_X(f) df = R_X(0).

F⁻¹{S_X(f)} = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(τ + t) X(t) dt ] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X(τ + t) X(t)] dt
            = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(τ) dt = R_X(τ)

Wiener-Khintchine relations for WSS stochastic processes:
S_X(f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2πfτ} dτ
R_X(τ) = ∫_{−∞}^{∞} S_X(f) e^{j2πfτ} df
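The identity P = ∫ S_X(f) df = R_X(0) can be illustrated with an averaged periodogram, the discrete analogue of E[(1/T)|X_T(f)|²] (a hedged sketch, not from the slides; sampling rate, tone frequency, and record length are assumptions chosen so the tone sits on an FFT bin):

```python
# Average periodograms of X(t) = A*cos(2*pi*f0*t + Theta) over many random
# phases, then integrate the PSD estimate over frequency; the result should
# equal R_X(0) = A^2/2.
import numpy as np

rng = np.random.default_rng(5)
A, f0, fs, N, trials = 1.0, 50.0, 512.0, 512, 200
t = np.arange(N) / fs

psd = np.zeros(N)
for _ in range(trials):
    x = A * np.cos(2 * np.pi * f0 * t + rng.uniform(0.0, 2 * np.pi))
    Xf = np.fft.fft(x)
    psd += np.abs(Xf)**2 / (N * fs)        # two-sided periodogram
psd /= trials

total_power = psd.sum() * (fs / N)         # integrate over the frequency grid
print(total_power)                         # ~ A^2/2 = 0.5
```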

Filtering of a Stochastic Signal through a Linear Time-Invariant (LTI) System

In the case of a non-stochastic input signal:   x(t) → h(t) → y(t)

Input-output relationship:
y(t) = x(t) ⊗ h(t) = ∫_{−∞}^{∞} x(t − τ) h(τ) dτ
Y(f) = X(f) H(f)

where H(f) = F[h(t)] = ∫_{−∞}^{∞} h(t) e^{−j2πft} dt

In the case of a stochastic input signal:   X(t) → h(t) → Y(t)

Mean: m_Y(t) = E[Y(t)] = E[ ∫_{−∞}^{∞} X(t − τ) h(τ) dτ ] = ∫_{−∞}^{∞} E[X(t − τ)] h(τ) dτ
            = E[X(t)] ⊗ h(t) = m_X(t) ⊗ h(t)

If X(t) is WSS, then m_Y(t) = m_X ∫_{−∞}^{∞} h(τ) dτ = m_X H(0) = constant
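The discrete analogue of m_Y = m_X H(0) is easy to check: for an FIR filter, H(0) is just the sum of the taps (a sketch with an assumed impulse response, not from the slides):

```python
# Pass a WSS input with nonzero mean through a short FIR filter and compare
# the output sample mean with m_X * H(0), where H(0) = sum(h).
import numpy as np

rng = np.random.default_rng(6)
m_X = 2.0
x = m_X + rng.standard_normal(200_000)     # WSS input: mean 2, unit variance
h = np.array([0.5, 0.3, 0.2, 0.1])         # assumed impulse response
y = np.convolve(x, h, mode="valid")        # 'valid' avoids edge transients
H0 = h.sum()                               # DC gain H(0) = 1.1
print(y.mean(), m_X * H0)
```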

Time-Domain Correlation Function:

R_Y(t, t + τ) = E[Y(t) Y(t + τ)] = E[ (∫_{−∞}^{∞} X(t − α) h(α) dα)(∫_{−∞}^{∞} X(t + τ − β) h(β) dβ) ]
 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} E[X(t − α) X(t + τ − β)] h(α) h(β) dα dβ
 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} R_X(t − α, t + τ − β) h(α) h(β) dα dβ

If X(t) is WSS, R_X(t − α, t + τ − β) = R_X(τ − β + α), then

R_Y(t, t + τ) = ∫_{−∞}^{∞} [∫_{−∞}^{∞} R_X(τ − β + α) h(β) dβ] h(α) dα = ∫_{−∞}^{∞} [R_X(τ + α) ⊗ h(τ + α)] h(α) dα
 = ∫_{−∞}^{∞} [R_X(τ − α) ⊗ h(τ − α)] h(−α) dα = R_X(τ) ⊗ h(τ) ⊗ h(−τ) = R_Y(τ)

Hence, if X(t) is WSS, then Y(t) is also WSS.

The PSD of Y(t) is
S_Y(f) = F[R_Y(τ)] = F[R_X(τ) ⊗ h(τ) ⊗ h(−τ)] = F[R_X(τ)] · F[h(τ)] · F[h(−τ)]
       = S_X(f) H(f) H*(f) = |H(f)|² S_X(f)
(the output-signal's PSD equals the input-signal's PSD times the squared magnitude of the system's frequency response)

The statistically-averaged and time-averaged power of Y(t) is:

P_Y = R_Y(0) = ∫_{−∞}^{∞} |H(f)|² S_X(f) df
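The relation S_Y(f) = |H(f)|² S_X(f) can be demonstrated with discrete white noise, whose PSD is flat (a hedged sketch, not from the slides; the FIR taps are assumptions, and circular filtering is used so the discrete relation holds exactly on the FFT grid):

```python
# Drive an FIR filter with unit-variance white noise (S_X(f) = 1) and check
# that the trial-averaged output periodogram tracks |H(f)|^2.
import numpy as np

rng = np.random.default_rng(7)
N, trials = 1024, 500
h = np.array([1.0, -0.5, 0.25])            # assumed impulse response
Hf = np.fft.fft(h, N)                      # H(f) on the FFT grid

psd_y = np.zeros(N)
for _ in range(trials):
    x = rng.standard_normal(N)
    y = np.fft.ifft(np.fft.fft(x) * Hf).real   # circular filtering
    psd_y += np.abs(np.fft.fft(y))**2 / N      # periodogram of the output
psd_y /= trials

rel_err = np.abs(psd_y - np.abs(Hf)**2).max() / (np.abs(Hf)**2).max()
print("worst-case relative deviation from |H(f)|^2:", rel_err)
```

The bin-averaged output PSD also matches Σ h²[n], the discrete counterpart of ∫|H(f)|² df.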

Properties of the PSD

(1) P_X = R_X(0) = ∫_{−∞}^{∞} S_X(f) df

(2) S_X(0) = ∫_{−∞}^{∞} R_X(τ) dτ

(3) S_X(f) ≥ 0, ∀f

(4) S_X(f) = S_X(−f)
Proof: S_X(−f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2π(−f)τ} dτ
              = ∫_{−∞}^{∞} R_X(−τ) e^{−j2πf(−τ)} dτ     (using R_X(τ) = R_X(−τ))
              = ∫_{∞}^{−∞} R_X(u) e^{−j2πfu} (−du)      (substituting u = −τ)
              = ∫_{−∞}^{∞} R_X(u) e^{−j2πfu} du = S_X(f)

(5) S_Y(f) = |H(f)|² S_X(f)

Example: Derive the PSD of the random process Y(t) = X(t) cos(2πf_c t + Θ), where X(t) is a WSS stochastic process with PSD S_X(f), f_c is a deterministic scalar constant, and Θ is a random variable uniformly distributed on [0, 2π) and statistically independent of X(t).

Solution:
Step 1: Show that Y(t) is WSS:
m_Y(t) = E[X(t) cos(2πf_c t + Θ)] = E[X(t)] E[cos(2πf_c t + Θ)] = m_X ∫_0^{2π} cos(2πf_c t + θ) (1/2π) dθ = 0

R_Y(t, t + τ) = E[Y(t) Y(t + τ)] = E[X(t) cos(2πf_c t + Θ) X(t + τ) cos(2πf_c t + 2πf_c τ + Θ)]
 = E[X(t) X(t + τ)] E[cos(2πf_c t + Θ) cos(2πf_c t + 2πf_c τ + Θ)] = (1/2) R_X(τ) cos(2πf_c τ)
 = R_Y(τ) ⇒ Y(t) is WSS.

Step 2: Use the Wiener-Khintchine relation to derive the PSD of Y(t):
S_Y(f) = F[R_Y(τ)] = F[(1/2) R_X(τ) cos(2πf_c τ)] = F[(1/4) R_X(τ) exp(j2πf_c τ) + (1/4) R_X(τ) exp(−j2πf_c τ)]
       = (1/4)[S_X(f − f_c) + S_X(f + f_c)]
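Integrating the result over f gives P_Y = (1/4)(P_X + P_X) = P_X/2, which is cheap to verify by simulation (a hedged sketch, not from the slides; the carrier parameters and white baseband process are assumptions):

```python
# Modulate white noise by a random-phase carrier and check the power
# bookkeeping P_Y = P_X / 2 implied by S_Y(f) = (1/4)[S_X(f-f_c)+S_X(f+f_c)].
import numpy as np

rng = np.random.default_rng(8)
N, trials, fs, f_c = 2048, 300, 1000.0, 200.0
t = np.arange(N) / fs

p_x = p_y = 0.0
for _ in range(trials):
    x = rng.standard_normal(N)                               # P_X = R_X(0) = 1
    y = x * np.cos(2 * np.pi * f_c * t + rng.uniform(0.0, 2 * np.pi))
    p_x += np.mean(x**2)
    p_y += np.mean(y**2)
p_x /= trials
p_y /= trials
print(p_x, p_y)      # p_y is close to p_x / 2
```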

Example: A random process N(t) is "zero-mean white Gaussian noise" if
(1) N(t) is a Gaussian random variable at all t,
(2) E[N(t)] = 0 at all t, and
(3) R_N(τ) = (N_0/2) δ(τ)

[Block diagram: N(t) → h(t), H(f) → Y(t)]

These imply:
i) E[N(t_1) N(t_2)] = 0 if t_1 ≠ t_2;
ii) E[N²(t)] = R_N(0) = ∞, i.e., N(t) is physically unrealizable!
iii) S_N(f) = F[R_N(τ)] = N_0/2, −∞ < f < ∞ (two-sided PSD)
iv) S_Y(f) = |H(f)|² S_N(f) = (N_0/2) |H(f)|², where h(t) is usually realizable.

At the output, the mean is m_Y(t) = m_N(t) ⊗ h(t) = 0,
and the statistically-averaged and time-averaged power is:

E[Y²(t)] = ∫_{−∞}^{∞} S_Y(f) df = (N_0/2) ∫_{−∞}^{∞} |H(f)|² df = σ_Y²

If the input to a linear time-invariant filter is Gaussian, then the output is also Gaussian; hence, at any time instant t_0, Y(t_0) ~ N(0, σ_Y²).
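In discrete time the white-noise picture is realizable: i.i.d. Gaussian samples of variance σ² play the role of N_0/2, and the output variance becomes σ² Σ h²[n], the discrete counterpart of (N_0/2) ∫|H(f)|² df. A hedged sketch (not from the slides; σ² and the FIR taps are assumptions):

```python
# Filter zero-mean discrete white Gaussian noise with an FIR h[n] and check
# that the output is zero-mean with variance sigma2 * sum(h^2).
import numpy as np

rng = np.random.default_rng(9)
sigma2 = 0.5                                   # plays the role of N0/2
h = np.array([0.4, 0.3, 0.2, 0.1])             # assumed impulse response
n = rng.standard_normal(1_000_000) * np.sqrt(sigma2)
y = np.convolve(n, h, mode="valid")

var_theory = sigma2 * np.sum(h**2)             # = 0.5 * 0.30 = 0.15
print(y.mean(), y.var(), var_theory)
```

Since the input is Gaussian and the filter is linear, each output sample is (approximately, up to estimation error) N(0, σ_Y²) with σ_Y² = var_theory, matching the slide's conclusion.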
Summary
(1) Definition and description of a random process:
• mean: m_X(t) = E[X(t)]
• correlation: R_X(t_1, t_2) = E[X(t_1) X(t_2)]

(2) WSS random processes:
• m_X(t) = constant
• R_X(t, t + τ) = R_X(τ) → R_X(τ) = R_X(−τ), R_X(0) ≥ |R_X(τ)|
• Wiener-Khintchine relation: R_X(τ) ↔ S_X(f) (forward transform F, inverse F⁻¹)
  where S_X(f) ≜ E[ lim_{T→∞} (1/T) |F[X(t) · rect(t/T)]|² ] ≥ 0

(3) P_X = ∫_{−∞}^{∞} S_X(f) df = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt ] = R_X(0)

(4) For a WSS input X(t) to an LTI system h(t), the output Y(t) is WSS with:
m_Y(t) = m_X(t) ⊗ h(t) = m_X H(0) = constant
R_Y(τ) = R_X(τ) ⊗ h(τ) ⊗ h(−τ)
S_Y(f) = S_X(f) |H(f)|²
P_Y = ∫_{−∞}^{∞} S_X(f) |H(f)|² df