
Image Dimensionality Reduction with
Principal Components Analysis (PCA)

Sources:
- Trucco & Verri, chap. 10
- Stanford Vision & Modeling

Example: a Pattern Recognition problem

Rotate the coordinate system:

The problem of high dimensionality?

PCA (Principal Component Analysis)

• For dimensionality reduction of data.
• Extracts the structure of the data from a high-dimensional dataset.
• Finds a signal basis from the statistics of the object data (a minimal sketch follows below).
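A minimal NumPy sketch of this idea, not from the slides; the array names and the choice of k components are illustrative assumptions.

import numpy as np

def pca_reduce(X, k):
    """Project the rows of X (n_samples x n_features) onto the top-k principal components."""
    mu = X.mean(axis=0)                    # mean of the data
    Xc = X - mu                            # centre the data
    S = Xc.T @ Xc / (len(X) - 1)           # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)   # eigendecomposition of the symmetric matrix S
    order = np.argsort(eigvals)[::-1]      # sort components by decreasing variance
    W = eigvecs[:, order[:k]]              # basis of the k-dimensional subspace
    return Xc @ W, W, mu                   # low-dimensional coordinates, basis, mean

# Toy usage: reduce 100 random 50-dimensional points to 3 dimensions
X = np.random.randn(100, 50)
Y, W, mu = pca_reduce(X, k=3)
print(Y.shape)   # (100, 3)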

PCA

Demo with Matlab:

• Finding a signal basis for face images.
• Image recognition, face recognition (a rough Python equivalent is sketched below).
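The slides demo this in Matlab; below is a rough Python sketch of the eigenfaces idea. The data layout (one flattened face image per row) and the nearest-neighbour matching rule are my assumptions, not part of the original demo.

import numpy as np

def eigenfaces(faces, k):
    """faces: (n_images, n_pixels) matrix of flattened training face images."""
    mean_face = faces.mean(axis=0)
    A = faces - mean_face
    # SVD of the centred data avoids forming the huge pixel-by-pixel covariance matrix
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    basis = Vt[:k]                 # top-k eigenfaces, one per row
    coords = A @ basis.T           # training faces expressed in eigenface coordinates
    return mean_face, basis, coords

def recognize(img, mean_face, basis, coords):
    """Return the index of the training face closest to img in eigenface space."""
    w = (img - mean_face) @ basis.T
    return int(np.argmin(np.linalg.norm(coords - w, axis=1)))

# Toy usage with random data standing in for real face images
train = np.random.rand(20, 64 * 64)
mean_face, basis, coords = eigenfaces(train, k=10)
print(recognize(train[5], mean_face, basis, coords))   # -> 5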

PCA

Linear dimensionality reduction: a mapping from the high-dimensional
input space to a low-dimensional subspace.

Linear Subspace:

(figure: a data point expressed as the mean plus a weighted sum of basis vectors)
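In formulas (my notation, not the slide's: \bar{x} for the mean, e_i for the basis vectors, a_i for the coefficients), the linear-subspace representation reads

x \approx \bar{x} + \sum_{i=1}^{k} a_i e_i , \qquad a_i = e_i^T ( x - \bar{x} ) ,

so a high-dimensional point is summarised by the k coefficients a_1, ..., a_k.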

Principal Components Analysis:

y = W \tilde{x}

s_T^2 = \sum_{n=1}^{N} ( y[n] - m )^2

S_T = \sum_{n=1}^{N} ( \tilde{x}_n - \mu )( \tilde{x}_n - \mu )^T

s_T^2 = W S_T W^T
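The step the slide leaves implicit: maximizing the projected variance w^T S_T w over unit-norm directions w is an eigenvalue problem, so the principal components are the leading eigenvectors of the scatter matrix S_T,

\max_{\|w\|=1} \; w^T S_T w \;\;\Rightarrow\;\; S_T w = \lambda w ,

with the maximum attained at the eigenvector of the largest eigenvalue \lambda.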

Example:

Data: Kirby, Weisser, Dangelmayer 1993

Example:

Data: new basis vectors (PCA)

Example:

Data: EigenLips (PCA)

Example:

Face Recognition with Eigenfaces (Turk+Pentland, 1991):

Example:

Face Recognition System (Moghaddam+Pentland):

Example: Visual Cortex (Hubel)

Example: Receptive Fields (Hubel)

Example: Receptive Fields

Hancock et al: The principal components of natural images

Example:

Active Appearance Models (AAM) (Cootes et al):


Example:

3D Morphable Models (Blanz+Vetter)

Review

Constrained spaces: an energy/constraint function E(V) restricts the allowed configurations V.

Constraint models can be:
• Analytically derived: Affine, Twist/Exponential Map
• Learned: Linear/non-linear Sub-Spaces

Non-Rigid Constrained Spaces

A constraint/energy function E(S) on the shape S = (p_1, ..., p_n)

Non-Rigid Constrained Spaces

Linear Subspaces:
• Small basis set
• Principal Components Analysis

Nonlinear Manifolds:
• Mixture Models

Manifold Learning

The training data is clustered (e.g. with EM) into a mixture of local linear patches.

Mixture of Projections: each linear patch P_i has an influence function G_i, and the patch projections are blended as

P(x) = \frac{\sum_i G_i(x) \cdot P_i(x)}{\sum_i G_i(x)}
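A minimal NumPy sketch of this blended projection; the Gaussian form of the influence functions G_i and the affine form of the local projections P_i are illustrative assumptions, since the slide does not fix them.

import numpy as np

def mixture_projection(x, centers, covs, patches):
    """Blend local linear projections P_i(x) with influence weights G_i(x).

    centers[i], covs[i] : mean and covariance of the i-th Gaussian influence function G_i
    patches[i] = (W, b) : local linear map, P_i(x) = W @ x + b
    """
    weights, projections = [], []
    for mu, cov, (W, b) in zip(centers, covs, patches):
        d = x - mu
        weights.append(np.exp(-0.5 * d @ np.linalg.solve(cov, d)))  # unnormalised G_i(x)
        projections.append(W @ x + b)                               # local projection P_i(x)
    weights = np.array(weights)
    projections = np.array(projections)
    # P(x) = sum_i G_i(x) P_i(x) / sum_i G_i(x)
    return (weights[:, None] * projections).sum(axis=0) / weights.sum()

# Toy usage: two local 3D -> 2D patches
x = np.array([1.0, 0.5, -0.2])
centers = [np.zeros(3), np.ones(3)]
covs = [np.eye(3), np.eye(3)]
patches = [(np.random.randn(2, 3), np.zeros(2)) for _ in range(2)]
print(mixture_projection(x, centers, covs, patches))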

Example:

EigenTracking (Black and Jepson)

Example:

Shape Models for tracking:

Feature/Shape Models in general:

Visual Motion Contours (Blake, Isard, Reynard)

Linear Discriminant Analysis:

Fisher's linear discriminant:

\mu_k = \frac{1}{N_k} \sum_{n \in C_k} x_n

S_B = ( \mu_2 - \mu_1 )( \mu_2 - \mu_1 )^T

S_W = \sum_{n \in C_1} ( x_n - \mu_1 )( x_n - \mu_1 )^T + \sum_{n \in C_2} ( x_n - \mu_2 )( x_n - \mu_2 )^T

J(w) = \frac{w^T S_B w}{w^T S_W w} , \qquad w \propto S_W^{-1} ( \mu_2 - \mu_1 )
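A minimal two-class NumPy sketch of these formulas; the class matrices X1 and X2 (one sample per row) are assumed inputs.

import numpy as np

def fisher_direction(X1, X2):
    """Fisher's linear discriminant direction w ∝ S_W^{-1} (mu2 - mu1) for two classes."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    S_W = (X1 - mu1).T @ (X1 - mu1) + (X2 - mu2).T @ (X2 - mu2)   # within-class scatter
    w = np.linalg.solve(S_W, mu2 - mu1)                           # w ∝ S_W^{-1} (mu2 - mu1)
    return w / np.linalg.norm(w)

# Toy usage: two Gaussian clusters in 2D; their 1-D projections should separate well
X1 = np.random.randn(50, 2)
X2 = np.random.randn(50, 2) + np.array([3.0, 1.0])
w = fisher_direction(X1, X2)
print(w, (X1 @ w).mean(), (X2 @ w).mean())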

Example: Eigenfaces vs Fisherfaces

Glasses or no glasses?

Example: Eigenfaces vs Fisherfaces

(figure: input data and the new discriminant axis) Belhumeur, Hespanha, Kriegman 1997

Other basis/shape algorithms:

• ICA (Independent Components Analysis, Bell+Sejnowski)
• Maximizes entropy (i.e. the spread of the output distribution); a small sketch follows below.
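A small illustrative sketch using scikit-learn's FastICA (a different estimation principle from Bell & Sejnowski's infomax, but the same goal of recovering statistically independent components); the synthetic mixing setup is my own.

import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic independent sources, mixed linearly into the observed signals
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t), np.sign(np.cos(3 * t))]   # independent sources
A = np.array([[1.0, 0.5], [0.5, 2.0]])             # mixing matrix
X = S @ A.T                                        # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)    # recovered sources (up to permutation and scale)
print(S_est.shape)              # (2000, 2)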

Other basis/shape algorithms:

• NMF (non-negative matrix factorization, Lee+Seung); a small sketch follows below.
• LNMF (local NMF, Li et al)
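A small illustrative sketch of NMF with scikit-learn; the random non-negative matrix stands in for real image data and the parameter choices are arbitrary.

import numpy as np
from sklearn.decomposition import NMF

V = np.random.rand(100, 256)    # non-negative data, e.g. flattened images in [0, 1]

model = NMF(n_components=10, init='nndsvd', max_iter=500, random_state=0)
W = model.fit_transform(V)      # non-negative per-sample coefficients
H = model.components_           # non-negative, parts-based basis vectors
print(W.shape, H.shape)         # (100, 10) (10, 256)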

