G. D. Clifford 2005-2009
What is BSS?
Assume each observation (signal) is a linear mixture of more than one unknown, independent source signal
The mixing (not the sources) is stationary
We have as many observations as unknown sources
To find the sources in the observations
- we need to define a suitable measure of independence
For example - the cocktail party problem (sources are speakers and background noise):
[Figure: block diagram of the cocktail party problem. N sources ZT (one signal source z1, plus noise sources z2 ... zN) are mixed by a stationary N x N matrix A to give the observed mixtures XT = AZT (x1 ... xN). A demixing matrix W (N x N), an estimate of A-1, yields the source estimates YT = WXT, which approximate ZT.]
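The mixing model XT = AZT and its inversion YT = WXT can be sketched in NumPy. This is a toy illustration with a hypothetical 2 x 2 mixing matrix; in practice A is unknown and W must be estimated blindly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: N = 2 independent sources, T = 1000 samples.
T = 1000
t = np.arange(T)
z1 = np.sin(2 * np.pi * t / 50)   # "speaker": a periodic source
z2 = rng.uniform(-1, 1, T)        # "background noise": a uniform source
Z = np.column_stack([z1, z2])     # T x N source matrix (rows = samples)

# Stationary mixing matrix A (unknown in practice);
# as many observations as sources.
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])

X = Z @ A.T                       # observed mixtures: X^T = A Z^T

# If A were known, the sources would be recovered exactly by W = A^-1.
W = np.linalg.inv(A)
Y = X @ W.T                       # source estimates: Y^T = W X^T
print(np.allclose(Y, Z))          # True: exact recovery with the true inverse
```

BSS replaces the last two lines: W is estimated from X alone, using a measure of independence rather than knowledge of A.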
[Figure: power spectra of the observation x[n]: observation power Sx(f) and noise power Sd(f).]
BSS is a transform?
Eigenspectrum of the decomposition
S is the diagonal matrix of singular values: zero everywhere except on the leading diagonal
The diagonal entries Sij (i = j) are the singular values; squared, they are the eigenvalues of the data covariance
They are placed in order of descending magnitude
Each corresponds to the magnitude of the projected data along its eigenvector
The eigenvectors are the axes of maximal variation in the data
In MATLAB: stem(diag(S).^2)
Eigenspectrum = plot of the eigenvalues
Variance = power (analogous to Fourier components in power spectra)
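The stem(diag(S).^2) call above plots the squared singular values. A NumPy sketch of the same eigenspectrum, on hypothetical three-channel data driven by two underlying components:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 3 correlated channels driven by 2 underlying components.
T = 500
comps = rng.standard_normal((T, 2))
X = comps @ rng.standard_normal((2, 3)) + 0.01 * rng.standard_normal((T, 3))

# SVD: X = U S V^T. np.linalg.svd returns the singular values s already
# sorted in descending magnitude (the leading diagonal of S).
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Eigenspectrum: squared singular values = variance (power) along each
# eigenvector, analogous to Fourier components in a power spectrum.
eigenspectrum = s ** 2
print(eigenspectrum / eigenspectrum.sum())  # fraction of variance per component
```

Because only two components generate the data, almost all of the variance falls on the first two eigenvalues; the third is essentially noise.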
Xp = USpVT
[Reduce the dimensionality of the data by discarding the noise projections (Snoise = 0),
then reconstruct the data with just the signal subspace]
[Figure: real data X, its eigenspectrum Sn2 against n, and truncated reconstructions Xp = USpVT for p = 2 and p = 4.]
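The truncated reconstruction Xp = USpVT can be sketched as follows. The data here are hypothetical (a rank-2 signal in Gaussian noise), and p is chosen by hand; in practice it is read off the eigenspectrum.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical example: a rank-2 signal buried in Gaussian noise on 6 channels.
T = 400
signal = rng.standard_normal((T, 2)) @ rng.standard_normal((2, 6))
X = signal + 0.1 * rng.standard_normal((T, 6))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep only the first p singular values (S_noise = 0), then reconstruct:
# X_p = U S_p V^T -- the data rebuilt from the signal subspace alone.
p = 2
s_p = np.zeros_like(s)
s_p[:p] = s[:p]
X_p = U @ np.diag(s_p) @ Vt

err_raw = np.linalg.norm(X - signal)
err_denoised = np.linalg.norm(X_p - signal)
print(err_denoised < err_raw)  # rank-p reconstruction is closer to the clean signal
```

Zeroing the trailing singular values discards the noise projections while keeping the signal subspace, so the reconstruction error against the clean signal drops.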
[Figure: probability density functions: Gaussian, sub-Gaussian, and super-Gaussian.]
[Figure: convergence of the demixing weights wij over successive iterations of gradient descent.]
Gradient Descent
The cost function can be a maximum or a minimum; the weights are updated by stepping along (or against) its gradient until they converge.
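A minimal sketch of the update rule, on a hypothetical scalar cost eps(w) = (w - 3)^2 rather than an ICA contrast:

```python
import numpy as np

# Gradient of the hypothetical cost eps(w) = (w - 3)^2.
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0    # initial weight
lr = 0.1   # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)   # w <- w - lr * d(eps)/dw: step against the gradient

print(round(w, 4))  # converges towards the minimum at w = 3
```

To maximise a cost instead, the sign of the step is flipped (gradient ascent); ICA implementations do exactly this with a matrix-valued w and an independence-based cost.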
[Figure: two observed mixtures X1 and X2 (with SNR = 1), and the corresponding source estimates Y1 and Y2 across a series of separation trials.]
Outlier-insensitive ICA cost functions:
Mutual Information, I
Entropy (Negentropy, J)
Maximum (Log) Likelihood
(Note: all three are related; mutual information differs from the sum of negentropies only by a constant, c, and a sign change.)
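The sub-/super-Gaussian distinction underlying these contrasts is most simply quantified by kurtosis, the classic but outlier-sensitive ICA contrast that the robust cost functions above are designed to replace. A sketch, with hypothetical sample distributions:

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardised moment minus 3 (zero for a Gaussian)."""
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3.0

rng = np.random.default_rng(3)
n = 200_000
gauss = rng.standard_normal(n)   # Gaussian: excess kurtosis ~ 0
sub = rng.uniform(-1, 1, n)      # sub-Gaussian (uniform): ~ -1.2
sup = rng.laplace(size=n)        # super-Gaussian (Laplacian): ~ +3

for name, x in [("gaussian", gauss), ("sub", sub), ("super", sup)]:
    print(name, round(excess_kurtosis(x), 2))
```

A single outlier can dominate the fourth-moment estimate, which is why negentropy approximations and log-likelihood contrasts are preferred in practice.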
Observations
Separation of the mixed observations into source estimates is excellent, apart from:
[Table: per-source statistics for estimates A and B.]
[Figure: ICA filtering of real data: observations X, source estimates Y = WX (approximating the sources Z), and the filtered reconstruction Xfilt = Wp-1Y after discarding the noise sources.]
ICA results 4
[Figure: observations X, source estimates Y = WX (approximating Z), and the filtered reconstruction Xfilt = Wp-1Y.]
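The filtering step Xfilt = Wp-1Y can be sketched directly. For brevity this example cheats: W is taken as the inverse of a known mixing matrix rather than estimated by ICA, so the demix-zero-remix mechanics are shown in isolation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical setup: one wanted source plus one noise source, mixed by A.
T = 1000
signal = np.sin(2 * np.pi * np.arange(T) / 40)   # source to keep
noise = rng.laplace(size=T)                      # source to remove
Z = np.column_stack([signal, noise])
A = np.array([[1.0, 0.4],
              [0.6, 1.0]])
X = Z @ A.T                  # observed mixtures: X^T = A Z^T

# Pretend ICA has found the demixing matrix W (here the true inverse).
W = np.linalg.inv(A)
Y = X @ W.T                  # source estimates: Y^T = W X^T

# ICA filtering: zero the noise column of W^-1, then remix.
W_inv_p = np.linalg.inv(W).copy()
W_inv_p[:, 1] = 0.0          # discard the noise source (column 2)
X_filt = Y @ W_inv_p.T       # X_filt^T = W_p^-1 Y^T

# Each filtered channel now holds only its scaled copy of the clean signal.
print(np.allclose(X_filt[:, 0], signal))
```

Zeroing a column of the remixing matrix removes that source's contribution from every observation channel while leaving the others untouched.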
Summary
PCA is good for Gaussian noise separation
[Figure: components of the ECG: fetal QRS, ECG envelope, movement artifact, QRS complex, P & T waves, muscle noise, baseline wander.]
[Figure: a fetal/maternal ECG mixture separated into maternal, noise, and fetal sources.]
Appendix:
Worked example (see lecture notes)