
- Unitary transforms
- Karhunen-Loève transform and eigenimages
- Sirovich and Kirby method
- Eigenfaces for gender recognition
- Fisher linear discriminant analysis
- Fisherimages and varying illumination
- Fisherfaces vs. eigenfaces

Unitary transforms

- Sort the pixels f[x,y] of an image into a column vector f of length N
- Calculate N transform coefficients

      c = A f

  where A is a matrix of size N×N
- The transform A is unitary iff

      A^{-1} = A^{*T} = A^H   (Hermitian conjugate)

Energy conservation with unitary transforms

- For any unitary transform c = A f we obtain

      ||c||^2 = c^H c = f^H A^H A f = f^H f = ||f||^2

- A unitary transform is a rotation of the coordinate system (and, possibly, sign flips)
- Vector length is conserved.
- Energy (mean squared vector length) is conserved.
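As a quick numerical check of the identity above, the following NumPy sketch uses the unitary DFT matrix as A (the matrix construction and the random vector f are illustrative, not part of the lecture):

```python
# Sketch: energy conservation under a unitary transform.
# The unitary (orthonormal) DFT matrix serves as an example of A.
import numpy as np

N = 8
# Unitary DFT matrix: A[k, n] = exp(-2j*pi*k*n/N) / sqrt(N)
k = np.arange(N).reshape(-1, 1)
n = np.arange(N).reshape(1, -1)
A = np.exp(-2j * np.pi * k * n / N) / np.sqrt(N)

# Unitarity check: A^H A = I
assert np.allclose(A.conj().T @ A, np.eye(N))

rng = np.random.default_rng(0)
f = rng.standard_normal(N)   # image pixels stacked into a vector
c = A @ f                    # transform coefficients

# ||c||^2 = c^H c = f^H A^H A f = ||f||^2
print(np.linalg.norm(c), np.linalg.norm(f))
```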

Energy distribution for unitary transforms

- Energy is conserved, but, in general, unevenly distributed among coefficients.
- Autocorrelation matrix

      R_cc = E[c c^H] = E[A f f^H A^H] = A R_ff A^H

- The mean squared value (energy) of the i-th coefficient is the diagonal element [R_cc]_{i,i}

Eigenmatrix of the autocorrelation matrix

Definition: the eigenmatrix Φ of the autocorrelation matrix R_ff
- Φ is unitary
- The columns of Φ form a set of eigenvectors of R_ff, i.e.,

      R_ff Φ = Φ Λ

  where Λ is the diagonal matrix of eigenvalues:

      Λ = diag(λ_0, λ_1, ..., λ_{N-1})

Karhunen-Loève transform

- Unitary transform with matrix

      A = Φ^H

- The transform coefficients are pairwise uncorrelated:

      R_cc = A R_ff A^H = Φ^H R_ff Φ = Λ

- Energy concentration property:
  - No other unitary transform packs as much energy into the first J coefficients.
  - The mean squared approximation error from keeping only the first J coefficients is minimized.
  - Holds for any J.
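A minimal NumPy sketch of the energy-concentration property (the correlated 2-d samples are made up for illustration): estimate R_ff, take its eigenmatrix as the KLT, and check that almost all energy lands in the first coefficient.

```python
# Sketch: KLT of strongly correlated 2-d samples (illustrative data).
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(10000)
# Two strongly correlated components with roughly equal energies
F = np.stack([x + 0.1 * rng.standard_normal(10000),
              x + 0.1 * rng.standard_normal(10000)])   # shape (2, 10000)

Rff = F @ F.T / F.shape[1]           # autocorrelation matrix estimate
lam, Phi = np.linalg.eigh(Rff)       # eigh returns ascending eigenvalues
Phi = Phi[:, ::-1]                   # reorder: descending eigenvalues
A = Phi.T                            # KLT matrix A = Phi^H (real case)

C = A @ F                            # transform coefficients
energy = (C ** 2).mean(axis=1)
print(energy)                        # nearly all energy in the first coefficient
```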

Illustration of energy concentration

Rotation of the coordinate system (f1, f2) into (c1, c2):

      A = [  cos θ   sin θ ]
          [ -sin θ   cos θ ]

Before the KLT: strongly correlated samples, equal energies in f1 and f2.
After the KLT: uncorrelated samples, most of the energy in the first coefficient c1.

Basis images and eigenimages

- For any transform, the inverse transform

      f = A^{-1} c

  can be interpreted as a superposition of the columns of A^{-1} (basis images).
- For the KL transform, the basis images are the eigenvectors of the autocorrelation matrix R_ff and are called eigenimages.
- If energy concentration works well, only a limited number of eigenimages is needed to approximate a set of images with small error. These eigenimages span an optimal linear subspace of dimensionality J.

Eigenimages for recognition

- To recognize complex patterns (e.g., faces), large portions of an image have to be considered.
- The high dimensionality of image space means a high computational burden for many recognition techniques.
  Example: a nearest-neighbor search requires a pairwise comparison with every image in a database.
- The transform

      c = W f

  can reduce the dimensionality from N to J by representing the image by J coefficients.
- Idea: tailor a KLT to a specific set of training images representative of the recognition task, to preserve the salient features.

Eigenimages for recognition

[Block diagram: a new face image f is normalized by subtracting the mean face, projected onto the eigenface subspace (c = W f), and compared against the database of eigenface coefficients p_1, ..., p_K with a similarity measure (e.g., c^T p_k). The class of the most similar vector p_k* is the recognition result; a face that is not similar enough to any p_k is rejected.]
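The pipeline in the diagram above can be sketched in NumPy (all data here is made up: a random orthonormal W stands in for the eigenfaces, and the database images are random vectors):

```python
# Sketch of the recognition pipeline: normalize (subtract mean face),
# project onto J coefficients, then nearest-neighbor search in the database.
import numpy as np

rng = np.random.default_rng(4)
N, K, J = 256, 5, 8                    # pixels, database faces, coefficients

# Stand-in for J eigenfaces: rows of a random orthonormal matrix
W = np.linalg.qr(rng.standard_normal((N, J)))[0].T
mean_face = rng.standard_normal(N)
database = rng.standard_normal((N, K))               # one face per class

p = W @ (database - mean_face[:, None])              # coefficients p_1 ... p_K

# New face image: a noisy copy of database face 3
f = database[:, 3] + 0.05 * rng.standard_normal(N)
c = W @ (f - mean_face)                              # normalize + project

dists = np.linalg.norm(p - c[:, None], axis=0)       # Euclidean similarity
k_star = int(np.argmin(dists))
print(k_star)                                        # → 3
```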

Computing eigenimages from a training set

- How to obtain the N×N covariance matrix?
  - Use a training set Γ_1, Γ_2, ..., Γ_{L+1} (each column vector represents one image)
  - Let Ψ be the mean image of all L+1 training images
  - Define the training set matrix

        S = (Γ_1 − Ψ, Γ_2 − Ψ, ..., Γ_L − Ψ)

    and calculate the scatter matrix

        R = Σ_{l=1}^{L} (Γ_l − Ψ)(Γ_l − Ψ)^H = S S^H

- Problem 1: if L < N, the scatter matrix R is rank-deficient.
- Problem 2: finding the eigenvectors of an N×N matrix is expensive. How to compute eigenimages from a small training set with L << N?

Sirovich and Kirby algorithm

- Instead of the eigenvectors of S S^H, consider the eigenvectors of S^H S, i.e.,

      S^H S v_i = λ_i v_i

- Premultiply both sides by S:

      S S^H (S v_i) = λ_i (S v_i)

- By inspection, we find that the S v_i are eigenvectors of S S^H.
- Algorithm:
  - Compute the L×L matrix S^H S
  - Compute the L eigenvectors v_i of S^H S
  - Compute the eigenimages corresponding to the L0 ≤ L largest eigenvalues as linear combinations S v_i of the training images

L. Sirovich and M. Kirby, "Low-dimensional procedure for the characterization of human faces," Journal of the Optical Society of America A, 4(3), pp. 519-524, 1987.
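The trick can be sketched in a few lines of NumPy (the sizes and the random training images are illustrative):

```python
# Sketch of the Sirovich–Kirby trick: eigenvectors of the small L×L matrix
# S^H S yield the eigenimages S v_i of the huge N×N matrix S S^H.
import numpy as np

N, L = 10000, 20                  # N pixels per image, L training images, L << N
rng = np.random.default_rng(2)
images = rng.standard_normal((N, L))
mean_image = images.mean(axis=1, keepdims=True)
S = images - mean_image           # training set matrix (mean-removed columns)

small = S.conj().T @ S            # L×L matrix S^H S
lam, v = np.linalg.eigh(small)    # ascending eigenvalues
lam, v = lam[::-1], v[:, ::-1]    # reorder: descending

eigenimages = S @ v               # columns S v_i are eigenvectors of S S^H
eigenimages /= np.linalg.norm(eigenimages, axis=0)   # normalize columns

# Verify for the largest eigenvalue: (S S^H)(S v_0) = lambda_0 (S v_0)
u0 = eigenimages[:, 0]
print(np.allclose(S @ (S.conj().T @ u0), lam[0] * u0))
```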

Example: eigenfaces

- The first 8 eigenfaces obtained from a training set of 100 male and 100 female training images (shown with the mean face).
- Can be used for face recognition by nearest-neighbor search in the 8-d face space.

Gender recognition using eigenfaces

[Figure: nearest-neighbor search in face space]

Fisher linear discriminant analysis

- The eigenimage method maximizes the scatter within the linear subspace over the entire image set, regardless of the classification task:

      W_opt = arg max_W det(W R_ff W^H)

- Fisher linear discriminant analysis (1936): maximize the between-class scatter, while minimizing the within-class scatter.
- Between-class scatter matrix (N_i samples in class i, with class mean μ_i and overall mean μ):

      R_B = Σ_{i=1}^{c} N_i (μ_i − μ)(μ_i − μ)^H

- Within-class scatter matrix:

      R_W = Σ_{i=1}^{c} Σ_{Γ_l ∈ Class(i)} (Γ_l − μ_i)(Γ_l − μ_i)^H

- Optimal projection:

      W_opt = arg max_W  det(W R_B W^H) / det(W R_W W^H)

Fisher linear discriminant analysis (cont.)

- Solution: the generalized eigenvectors w_i corresponding to the J largest eigenvalues {λ_i | i = 1, 2, ..., J}, i.e.,

      R_B w_i = λ_i R_W w_i,   i = 1, 2, ..., J

- Equivalently, solve the eigenproblem

      R_W^{-1} R_B w_i = λ_i w_i,   i = 1, 2, ..., J

- Problem: the within-class scatter matrix R_W is at most of rank L − c (for L images total in c classes combined), hence usually singular.
- Remedy: apply a KLT first to reduce the dimensionality of the feature space to L − c (or less), then proceed with Fisher LDA in the lower-dimensional space.
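A minimal NumPy sketch of Fisher LDA on 2-d toy data (all samples are made up; for image data one would first apply the KLT as described above, so R_W here is nonsingular by construction):

```python
# Sketch: Fisher LDA via the eigenproblem on R_W^{-1} R_B.
# Two classes separated along f1, with large within-class spread along f2.
import numpy as np

rng = np.random.default_rng(3)
class1 = rng.standard_normal((100, 2)) * [0.3, 2.0] + [-2, 0]
class2 = rng.standard_normal((100, 2)) * [0.3, 2.0] + [+2, 0]

mu1, mu2 = class1.mean(axis=0), class2.mean(axis=0)
mu = np.concatenate([class1, class2]).mean(axis=0)

# Between-class and within-class scatter matrices (N_1 = N_2 = 100)
RB = 100 * np.outer(mu1 - mu, mu1 - mu) + 100 * np.outer(mu2 - mu, mu2 - mu)
RW = (class1 - mu1).T @ (class1 - mu1) + (class2 - mu2).T @ (class2 - mu2)

lam, V = np.linalg.eig(np.linalg.inv(RW) @ RB)
w = np.real(V[:, np.argmax(np.real(lam))])   # largest generalized eigenvalue

# The discriminant direction is (close to) the f1 axis,
# not the maximum-energy f2 axis the KLT would pick.
print(w / np.linalg.norm(w))
```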

Eigenimages vs. Fisherimages

2-d example: project the samples (f1, f2) onto a 1-d subspace, then perform classification.
- The KLT picks the direction of maximum energy, but after this projection the 2 classes are no longer distinguishable.
- Fisher LDA separates the classes by choosing a better 1-d subspace.

Fisherimages and varying illumination

Differences due to varying illumination can be much larger than the differences among faces!

- All images of the same Lambertian surface with different illumination (without shadows) lie in a 3-d linear subspace.
- Single point source at infinity:

      f(x, y) = a(x, y) · l^T n(x, y)

  where a(x, y) is the surface albedo, n(x, y) the surface normal, and l the light source direction and intensity.
- Multiple light sources: the images still lie in the same 3-d linear subspace, due to the linear superposition of each source's contribution to the image.
- Fisherimages can eliminate this within-class scatter.

Side Note: Photometric Stereo [Woodham 1980]

- Observed intensity I, albedo ρ (constant), normalized lighting direction L, normalized surface normal N:

      I = ρ L · N

- Written out in components:

      I = ρ [ L_x  L_y  L_z ] [ N_x  N_y  N_z ]^T

- With three light sources L^(1), L^(2), L^(3):

      [ I^(1) ]       [ L_x^(1)  L_y^(1)  L_z^(1) ] [ N_x ]
      [ I^(2) ]  = ρ  [ L_x^(2)  L_y^(2)  L_z^(2) ] [ N_y ]
      [ I^(3) ]       [ L_x^(3)  L_y^(3)  L_z^(3) ] [ N_z ]

  i.e., I = ρ L N
- Assume the albedo is constant and invert the matrix:  N = ρ^{-1} L^{-1} I
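The inversion step can be sketched as follows (the light directions, albedo, and ground-truth normal are made-up values; since the recovered vector L^{-1} I equals ρN, its length gives the albedo and its direction the unit normal):

```python
# Sketch of photometric stereo [Woodham 1980]: recover the surface normal
# from three intensity measurements I = rho * L @ N.
import numpy as np

rho = 0.8                                  # constant albedo (ground truth)
N_true = np.array([0.0, 0.6, 0.8])         # unit surface normal (ground truth)

# Three normalized lighting directions, stacked as rows of L
L = np.array([[0.0, 0.0, 1.0],
              [0.8, 0.0, 0.6],
              [0.0, 0.8, 0.6]])

I = rho * L @ N_true                       # observed intensities

G = np.linalg.inv(L) @ I                   # G = rho * N
rho_est = np.linalg.norm(G)                # albedo = length of G
N_est = G / rho_est                        # unit normal = direction of G

print(rho_est, N_est)
```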

Fisherface trained to recognize gender

[Figure: male face samples (class 1), female face samples (class 2), and the resulting Fisherface]

Gender recognition using 1st Fisherface

Gender recognition using 1st eigenface

Person identification with Fisherfaces and eigenfaces

40 classes, 10 images per class
