Random Process: a mapping from an outcome e_i of an experiment to a real-valued time function (i.e., a waveform):
e_i → X(t, e_i) = x_i(t), a real-valued waveform.
e.g.,
[Figure: sample space S = {e_1, e_2, e_3, e_4}; each outcome e_i is mapped to a sample waveform X(t, e_i) = x_i(t), plotted versus t.]
[Figure: the ensemble x(t, e) across outcomes e_1, …, e_n at a fixed time; the markings θ + ω_C t0, cos^{-1}(x/A) and 2π − cos^{-1}(x/A) relate the value x to the phase.]
July 21, 2005    Profs. W. Zhuang & K. T. Wong    ktwong@ieee.org
Description of Random Processes
− analytical description:
  X(t): {x(t, e), e ∈ S} + probability information on S
− statistical description:
  Definition: A "complete" statistical description of a random process X(t) is: for any integer n and any choice of sampling instants t1, t2, …, tn, the joint PDF f_{X(t1), X(t2), …, X(tn)}(x1, x2, …, xn) of (X(t1), X(t2), …, X(tn)) is given.
Statistical averages
− mean or expectation: m_X(t)
[Figure: sample waveforms x(t, e1), x(t, e2), x(t, e3), with the mean values m(t1) and m(t2) = E(X(t2)) marked at the instants t1 and t2.]
At t = t0, X(t0) is defined by a PDF f_{X(t0)}(x):
m_X(t0) = E(X(t0)) = ∫_{−∞}^{∞} x f_{X(t0)}(x) dx
Example: Find the mean of the random process X(t) = A cos(ω_C t + Θ), where A is a non-random scalar and Θ is a random variable uniformly distributed on [0, 2π], i.e.,
f_Θ(θ) = 1/(2π) for 0 ≤ θ ≤ 2π, and 0 otherwise.
⇒ m_X(t) = E(X(t)) = E[A cos(ω_C t + Θ)] = ∫_0^{2π} A cos(ω_C t + θ) f_Θ(θ) dθ = 0.
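The zero-mean result above is easy to check by Monte Carlo: average many realizations of the random-phase sinusoid at each sampling instant. The values of A, w_c and the time grid below are illustrative assumptions, not taken from the notes.

```python
import numpy as np

# Monte Carlo check: for X(t) = A cos(w_c t + Theta) with
# Theta ~ Uniform[0, 2*pi], the ensemble mean m_X(t) is 0 at every t.
# A, w_c and the time grid are illustrative choices.
rng = np.random.default_rng(0)

A, w_c = 2.0, 2 * np.pi * 5.0
t = np.linspace(0.0, 1.0, 50)                       # sampling instants
theta = rng.uniform(0.0, 2 * np.pi, size=100_000)   # one phase per outcome e_i

# rows = realizations, columns = time instants
X = A * np.cos(w_c * t[None, :] + theta[:, None])
m_hat = X.mean(axis=0)                              # ensemble average at each t
print(np.abs(m_hat).max())                          # close to 0
```

With 100,000 realizations the estimated mean at every instant is within a few hundredths of zero.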
Example: For a non-random A and a random Θ uniformly distributed over [0, 2π], the autocorrelation function of the random process X(t) = A cos(ω_C t + Θ) is:
R_X(t1, t2) = E[A cos(ω_C t1 + Θ) · A cos(ω_C t2 + Θ)]
= A^2 E{(1/2) cos[ω_C(t1 − t2)] + (1/2) cos[ω_C(t1 + t2) + 2Θ]}
= (A^2/2) cos[ω_C(t1 − t2)] + (A^2/2) ∫_0^{2π} cos[ω_C(t1 + t2) + 2θ] (1/(2π)) dθ
= (A^2/2) cos[ω_C(t1 − t2)],
since the second integral, taken over two full periods of the cosine, is zero.
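The closed-form autocorrelation can likewise be checked by averaging over realizations of Θ. The numeric values of A, w_c, t1 and t2 below are illustrative assumptions.

```python
import numpy as np

# Monte Carlo check of R_X(t1, t2) = (A^2/2) cos(w_c (t1 - t2)) for
# X(t) = A cos(w_c t + Theta), Theta ~ Uniform[0, 2*pi].
# The values of A, w_c, t1, t2 are illustrative choices.
rng = np.random.default_rng(1)

A, w_c = 2.0, 3.0
t1, t2 = 0.4, 1.1
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)

R_hat = np.mean(A * np.cos(w_c * t1 + theta) * A * np.cos(w_c * t2 + theta))
R_theory = (A**2 / 2) * np.cos(w_c * (t1 - t2))
print(R_hat, R_theory)
```

The empirical and theoretical values agree to about two decimal places, and the estimate depends on t1 and t2 only through their difference.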
Properties of R_X(τ):
(1) R_X(τ) = R_X(−τ) if X(t) is real-valued.
Proof: R_X(τ) = R_X(t, t − τ) = E[X(t) X(t − τ)] = E[X(t − τ) X(t)] = R_X(−τ).
(2) |R_X(τ)| ≤ R_X(0).
Proof: E[(X(t) ± X(t − τ))^2] ≥ 0
⇒ E[X^2(t) + X^2(t − τ) ± 2 X(t) X(t − τ)] ≥ 0
⇒ E[X^2(t)] + E[X^2(t − τ)] ± 2 E[X(t) X(t − τ)] ≥ 0
⇒ R_X(0) + R_X(0) ± 2 R_X(τ) ≥ 0 ⇒ ∓R_X(τ) ≤ R_X(0) ⇒ |R_X(τ)| ≤ R_X(0).
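Property (2) can be illustrated numerically. The process below, a 5-tap moving average of white noise, is an illustrative choice (not from the notes); its autocorrelation is estimated as a time average over one long sample path, which implicitly assumes ergodicity.

```python
import numpy as np

# Numerical illustration of property (2): |R_X(tau)| <= R_X(0).
# The process (MA(5)-smoothed white noise) is an illustrative choice.
rng = np.random.default_rng(2)

n = 400_000
w = rng.standard_normal(n)
x = np.convolve(w, np.ones(5) / 5, mode="same")  # MA(5)-smoothed noise

def R(lag):
    """Time-average estimate of R_X(lag) for lag >= 0."""
    return np.mean(x[: n - lag] * x[lag:])

vals = [R(k) for k in range(11)]
print(vals[0], max(abs(v) for v in vals[1:]))  # R(0) dominates every other lag
```

For this process R_X(0) ≈ 0.2, and every other lag is strictly smaller in magnitude, as property (2) requires.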
For a deterministic power signal x(t), the time-averaged power is
P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} [x(t)]^2 dt = lim_{T→∞} (1/T) ∫_{−∞}^{∞} [x(t) rect(t/T)]^2 dt
= lim_{T→∞} (1/T) ∫_{−∞}^{∞} |F[x(t) rect(t/T)]|^2 df = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |X_T(f)|^2 df
= ∫_{−∞}^{∞} lim_{T→∞} (1/T) |X_T(f)|^2 df = ∫_{−∞}^{∞} S_x(f) df,
where rect(t/T) = 1 for −T/2 ≤ t ≤ T/2, and 0 otherwise.
The identity lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} [x(t)]^2 dt = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |X_T(f)|^2 df is Parseval's theorem.
F^{−1}{S_x(f)} = ∫_{−∞}^{∞} S_x(f) e^{j2πfτ} df = ∫_{−∞}^{∞} lim_{T→∞} (1/T) |X_T(f)|^2 e^{j2πfτ} df
= lim_{T→∞} (1/T) ∫_{−∞}^{∞} [∫_{−T/2}^{T/2} x(s) e^{−j2πfs} ds] [∫_{−T/2}^{T/2} x(t) e^{−j2πft} dt]^* e^{j2πfτ} df
= lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) ∫_{−T/2}^{T/2} x(s) [∫_{−∞}^{∞} e^{j2πf(τ − s + t)} df] ds dt
= lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) ∫_{−T/2}^{T/2} x(s) δ(τ − s + t) ds dt
= lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(τ + t) x(t) dt, the (time-averaged) autocorrelation function of x(t).
If X(t) is WSS, E[X^2(t)] = R_X(0) = constant, so P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(0) dt = R_X(0).
The PSD of X(t) is defined as S_X(f) = E[lim_{T→∞} (1/T) |F[X(t) rect(t/T)]|^2].
Hence, P = ∫_{−∞}^{∞} S_X(f) df = R_X(0).
F^{−1}{S_X(f)} = E[lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(τ + t) X(t) dt] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X(τ + t) X(t)] dt = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(τ) dt = R_X(τ).
Wiener–Khintchine relations for WSS stochastic processes:
S_X(f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2πfτ} dτ
R_X(τ) = ∫_{−∞}^{∞} S_X(f) e^{j2πfτ} df
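The forward Wiener–Khintchine integral can be sanity-checked on a standard textbook transform pair (an assumption, not taken from these notes): R_X(τ) = e^{−a|τ|} has PSD S_X(f) = 2a / (a^2 + (2πf)^2).

```python
import numpy as np

# Numerical check of S_X(f) = int R_X(tau) exp(-j 2 pi f tau) dtau for
# the example pair R_X(tau) = exp(-a|tau|), S_X(f) = 2a/(a^2 + (2 pi f)^2).
# The value of a and the grid are illustrative choices.
a = 1.5
tau = np.linspace(-40.0, 40.0, 200_001)
dtau = tau[1] - tau[0]
R = np.exp(-a * np.abs(tau))

def S_numeric(f):
    """Riemann-sum approximation of the forward transform at frequency f."""
    return ((R * np.exp(-2j * np.pi * f * tau)).sum() * dtau).real

for f in (0.0, 0.3, 1.0):
    S_exact = 2 * a / (a**2 + (2 * np.pi * f) ** 2)
    print(f, S_numeric(f), S_exact)
```

The numerical integral matches the closed form to about four decimal places at each test frequency; the grid is wide enough that the truncated exponential tails are negligible.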
Input-output relationship:
y(t) = x(t) ⊗ h(t) = ∫_{−∞}^{∞} x(t − τ) h(τ) dτ
Y(f) = X(f) H(f)
where H(f) = F[h(t)] = ∫_{−∞}^{∞} h(t) e^{−j2πft} dt
X(t) → [ h(t) ] → Y(t)
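The frequency-domain relation Y(f) = X(f) H(f) has an exact discrete counterpart: circular convolution in time equals pointwise multiplication of DFTs. The signals below are arbitrary test vectors, not anything specific from the notes.

```python
import numpy as np

# Discrete sanity check of Y(f) = X(f) H(f): circular convolution in the
# time domain equals multiplication of the DFTs in the frequency domain.
rng = np.random.default_rng(3)
N = 64
x = rng.standard_normal(N)
h = rng.standard_normal(N)

# frequency-domain route: multiply the DFTs, transform back
y_freq = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

# time-domain route: direct circular convolution
y_time = np.array([sum(x[m] * h[(n - m) % N] for m in range(N))
                   for n in range(N)])

print(np.max(np.abs(y_freq - y_time)))  # ~ machine precision
```

The two routes agree to machine precision, which is why FFT-based filtering is the standard implementation of the convolution above.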
In the case of a stochastic input signal:
Mean: m_Y(t) = E[Y(t)] = E[∫_{−∞}^{∞} X(t − τ) h(τ) dτ] = ∫_{−∞}^{∞} E[X(t − τ)] h(τ) dτ = E[X(t)] ⊗ h(t) = m_X(t) ⊗ h(t)
If X(t) is WSS, then m_Y(t) = m_X ∫_{−∞}^{∞} h(τ) dτ = m_X H(0) = constant.
Autocorrelation:
R_Y(t, t + τ) = E[Y(t) Y(t + τ)] = E[(∫_{−∞}^{∞} X(t − α) h(α) dα)(∫_{−∞}^{∞} X(t + τ − β) h(β) dβ)]
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} E[X(t − α) X(t + τ − β)] h(α) h(β) dα dβ = ∫_{−∞}^{∞} ∫_{−∞}^{∞} R_X(t − α, t + τ − β) h(α) h(β) dα dβ
If X(t) is WSS, R_X(t − α, t + τ − β) = R_X(τ − β + α), so
R_Y(t, t + τ) = ∫_{−∞}^{∞} [∫_{−∞}^{∞} R_X(τ − β + α) h(β) dβ] h(α) dα = ∫_{−∞}^{∞} [R_X(τ + α) ⊗ h(τ + α)] h(α) dα
= ∫_{−∞}^{∞} [R_X(τ − α) ⊗ h(τ − α)] h(−α) dα = R_X(τ) ⊗ h(τ) ⊗ h(−τ) = R_Y(τ)
The PSD of Y(t) is
S_Y(f) = F[R_Y(τ)] = F[R_X(τ) ⊗ h(τ) ⊗ h(−τ)] = F[R_X(τ)] · F[h(τ)] · F[h(−τ)] = S_X(f) H(f) H^*(f) = |H(f)|^2 S_X(f),
i.e., the output signal's PSD equals the input signal's PSD multiplied by the squared magnitude of the system's frequency response.
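The relation S_Y(f) = |H(f)|^2 S_X(f) can be observed empirically by passing white noise (flat PSD, equal to σ^2 per bin in the DFT periodogram convention used below) through an FIR filter and averaging the output periodogram over many realizations. The filter taps, σ, N and the trial count are illustrative choices.

```python
import numpy as np

# Empirical check of S_Y(f) = |H(f)|^2 S_X(f) with a white-noise input.
# The filter taps, sigma, N and trial count are illustrative choices.
rng = np.random.default_rng(4)

N, trials, sigma = 256, 2000, 1.0
h = np.array([1.0, 0.5, 0.25, 0.125])
H = np.fft.fft(h, N)                     # frequency response on N bins

psd = np.zeros(N)
for _ in range(trials):
    x = sigma * rng.standard_normal(N)
    y = np.real(np.fft.ifft(np.fft.fft(x) * H))   # circular filtering
    psd += np.abs(np.fft.fft(y)) ** 2 / N         # periodogram of y
psd /= trials

target = np.abs(H) ** 2 * sigma**2
print(np.max(np.abs(psd - target)))      # small relative to max(target)
```

Single periodograms fluctuate with a relative standard deviation near 1, so the averaging over realizations is what makes the |H(f)|^2 shape visible.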
(3) S_X(f) ≥ 0, ∀f
(4) S_X(f) = S_X(−f)
Proof: S_X(−f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2π(−f)τ} dτ = ∫_{−∞}^{∞} R_X(−τ) e^{−j2πf(−τ)} dτ (using R_X(τ) = R_X(−τ))
Substituting u = −τ: = ∫_{∞}^{−∞} R_X(u) e^{−j2πfu} (−du) = ∫_{−∞}^{∞} R_X(u) e^{−j2πfu} du = S_X(f)
(5) S_Y(f) = |H(f)|^2 S_X(f)
Example: Let Y(t) = X(t) cos(2πf_C t + Θ), where X(t) is WSS and independent of Θ, and Θ is uniformly distributed on [0, 2π].
Solution:
Step 1: To show that Y(t) is WSS:
m_Y(t) = E[X(t) cos(2πf_C t + Θ)] = E[X(t)] E[cos(2πf_C t + Θ)] = m_X ∫_0^{2π} cos(2πf_C t + θ) (1/(2π)) dθ = 0
R_Y(t, t + τ) = E[Y(t) Y(t + τ)] = E[X(t) cos(2πf_C t + Θ) X(t + τ) cos(2πf_C t + 2πf_C τ + Θ)]
= R_X(τ) E{(1/2) cos(2πf_C τ) + (1/2) cos(4πf_C t + 2πf_C τ + 2Θ)} = (1/2) R_X(τ) cos(2πf_C τ),
which depends only on τ; hence Y(t) is WSS.
At the output, the mean is m_Y(t) = m_N(t) ⊗ h(t) = 0, and the statistically averaged and time-averaged power is:
E[Y^2(t)] = ∫_{−∞}^{∞} S_Y(f) df = (N_0/2) ∫_{−∞}^{∞} |H(f)|^2 df = σ_Y^2
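The output-power formula has a simple discrete-time analogue: white noise of variance σ^2 through an FIR filter h has output variance σ^2 Σ h^2, which is the Parseval form of (N_0/2) ∫ |H(f)|^2 df. The taps and σ below are illustrative choices.

```python
import numpy as np

# Discrete-time analogue of sigma_Y^2 = (N0/2) * int |H(f)|^2 df:
# output variance of filtered white noise equals sigma^2 * sum(h^2).
# The filter taps and sigma are illustrative choices.
rng = np.random.default_rng(5)

sigma = 0.7
h = np.array([0.4, 0.3, 0.2, 0.1])
w = sigma * rng.standard_normal(1_000_000)
y = np.convolve(w, h, mode="valid")      # filtered noise samples

var_hat = y.var()
var_theory = sigma**2 * np.sum(h**2)
print(var_hat, var_theory)
```

With a million samples the empirical variance matches σ^2 Σ h^2 to three decimal places.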
• Wiener–Khintchine relation: R_X(τ) and S_X(f) form a Fourier-transform pair (F and F^{−1}),
where S_X(f) ≜ E[lim_{T→∞} (1/T) |F[X(t) · rect(t/T)]|^2] ≥ 0
(3) P_X = ∫_{−∞}^{∞} S_X(f) df = E[lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X^2(t) dt] = R_X(0)