
Eigenimages

n Unitary transforms
n Karhunen-Loève transform and eigenimages
n Sirovich and Kirby method
n Eigenfaces for gender recognition
n Fisher linear discriminant analysis
n Fisherimages and varying illumination
n Fisherfaces vs. eigenfaces

Digital Image Processing: Bernd Girod, 2013 Stanford University -- Eigenimages 1


Unitary transforms
n Sort pixels f[x,y] of an image into a column vector f of length N
n Calculate N transform coefficients

      c = A f

  where A is a matrix of size N x N
n The transform A is unitary iff

      A^{-1} = A^{*T} = A^H      (Hermitian conjugate)

n If A is real-valued, i.e., A = A*, the transform is orthonormal
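
As a quick numerical illustration (not part of the original slides), the sketch below builds the unitary 1-D DFT matrix in NumPy and checks that its Hermitian conjugate is its inverse; the matrix size and the random vector are placeholders.

```python
import numpy as np

# Unitary 1-D DFT matrix: A[k, m] = exp(-2*pi*j*k*m/N) / sqrt(N)
N = 8
n = np.arange(N)
A = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Unitary: the Hermitian conjugate A^H is the inverse, A^H A = I
assert np.allclose(A.conj().T @ A, np.eye(N))

# Forward and inverse transform of a random "image" vector f
f = np.random.rand(N)
c = A @ f                 # transform coefficients c = A f
f_rec = A.conj().T @ c    # f = A^{-1} c = A^H c
assert np.allclose(f_rec, f)
```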



Energy conservation with unitary transforms

n For any unitary transform c = Af we obtain

      ||c||^2 = c^H c = f^H A^H A f = ||f||^2

n Interpretation: every unitary transform is simply a rotation of the coordinate system (and, possibly, sign flips)
n Vector length is conserved.
n Energy (mean squared vector length) is conserved.
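
A minimal sketch of the energy-conservation (Parseval) relation, again using the unitary DFT matrix as the example transform; the random vector simply stands in for an image.

```python
import numpy as np

N = 64
n = np.arange(N)
A = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)   # unitary DFT matrix

f = np.random.randn(N)
c = A @ f

# ||c||^2 = c^H c = f^H A^H A f = ||f||^2  (equal up to round-off)
print(np.sum(np.abs(c) ** 2), np.sum(np.abs(f) ** 2))
```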



Energy distribution for unitary transforms
n Energy is conserved, but, in general, unevenly distributed among coefficients.
n Autocorrelation matrix

      R_cc = E[ c c^H ] = E[ A f f^H A^H ] = A R_ff A^H

n Diagonal of R_cc comprises mean squared values (energies) of the coefficients c_i

      E[ |c_i|^2 ] = [ R_cc ]_{i,i} = [ A R_ff A^H ]_{i,i}

n For now: assume R_ff is known or can be computed



Eigenmatrix of the autocorrelation matrix
Definition: eigenmatrix Φ of autocorrelation matrix R_ff
l Φ is unitary
l The columns of Φ form a set of eigenvectors of R_ff, i.e.,

      R_ff Φ = Φ Λ

  where Λ is a diagonal matrix of eigenvalues λ_i

      Λ = diag( λ_0, λ_1, ..., λ_{N-1} )

l A unitary eigenmatrix always exists for the autocorrelation matrix
l R_ff is symmetric positive (semi-)definite, hence λ_i ≥ 0 for all i



Karhunen-Loève transform
n Unitary transform with matrix

      A = Φ^H

n Transform coefficients are pairwise uncorrelated

      R_cc = A R_ff A^H = Φ^H R_ff Φ = Φ^H Φ Λ = Λ

n Columns of Φ are ordered according to decreasing eigenvalues.


n Energy concentration property:
l No other unitary transform packs as much energy into the first J coefficients.
l Mean squared approximation error by keeping only first J coefficients is minimized.
l Holds for any J.
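
The following sketch illustrates the KLT on a synthetic autocorrelation matrix (an AR(1)-style Toeplitz model, chosen here purely for illustration): the eigenmatrix of R_ff is computed, the coefficients are checked to be uncorrelated, and their energies come out in decreasing order.

```python
import numpy as np

# Illustrative autocorrelation model: R_ff[i, j] = rho^|i-j| (Toeplitz, positive definite)
N, rho = 16, 0.95
Rff = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))

# Eigenmatrix Phi of R_ff; np.linalg.eigh returns ascending eigenvalues, so reorder to decreasing
eigvals, Phi = np.linalg.eigh(Rff)
order = np.argsort(eigvals)[::-1]
eigvals, Phi = eigvals[order], Phi[:, order]

A = Phi.conj().T                  # KLT matrix A = Phi^H
Rcc = A @ Rff @ A.conj().T        # autocorrelation of the coefficients

# Coefficients are pairwise uncorrelated: R_cc is (numerically) diagonal ...
assert np.allclose(Rcc, np.diag(eigvals), atol=1e-10)
# ... with energies in decreasing order: most energy in the first coefficients
print(np.diag(Rcc))
```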



Illustration of energy concentration

      A = [  cos θ   sin θ ]
          [ -sin θ   cos θ ]

Before KLT: strongly correlated samples (f1, f2), equal energies.
After KLT: uncorrelated samples (c1, c2), most of the energy in the first coefficient.



Basis images and eigenimages
n For any transform, the inverse transform

      f = A^{-1} c

  can be interpreted in terms of the superposition of columns of A^{-1} (basis images)
n For the KL transform, the basis images are the eigenvectors of the
autocorrelation matrix Rff and are called eigenimages.
n If energy concentration works well, only a limited number of eigenimages is
needed to approximate a set of images with small error. These eigenimages
span an optimal linear subspace of dimensionality J.
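
A sketch of this approximation property, reusing the synthetic autocorrelation model from the previous sketch: samples are represented by only the first J eigenimages, and the mean squared error per pixel matches the sum of the discarded eigenvalues divided by N. The data model is illustrative only.

```python
import numpy as np

# Same synthetic R_ff as above; keep only the first J KLT coefficients
N, J, rho = 16, 4, 0.95
Rff = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))

eigvals, Phi = np.linalg.eigh(Rff)
order = np.argsort(eigvals)[::-1]
eigvals, Phi = eigvals[order], Phi[:, order]

# Draw zero-mean samples with covariance R_ff (one column per "image")
f = np.random.multivariate_normal(np.zeros(N), Rff, size=10000).T   # N x 10000
c = Phi.T @ f                     # full set of KLT coefficients
f_hat = Phi[:, :J] @ c[:J, :]     # superposition of the first J eigenimages only

# Mean squared approximation error per pixel ~ sum of the discarded eigenvalues / N
print(np.mean((f - f_hat) ** 2), eigvals[J:].sum() / N)
```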



Eigenimages for recognition
n To recognize complex patterns (e.g., faces), large portions of an image have to
be considered
n High dimensionality of image space means high computational burden for
many recognition techniques
Example: nearest-neighbor search requires pairwise comparison with every image in a database

n Transform c = W f can reduce dimensionality from N to J by representing the image by J coefficients
n Idea: tailor a KLT to a specific set of training images representative of the recognition task to preserve the salient features



Eigenimages for recognition

[Figure: face recognition pipeline. A new face image f is normalized by subtracting the
mean face, projected onto the eigenfaces (c = W f), and compared against a database of
eigenface coefficient vectors p_1, ..., p_K with a similarity measure (e.g., c^T p_k).
The class of the most similar p_k* is the recognition result; insufficient similarity
leads to rejection.]



Computing eigenimages from a training set
n How to obtain NxN covariance matrix?
l Use training set f_1, f_2, ..., f_{L+1}
  (each column vector represents one image)
l Let μ be the mean image of all L+1 training images
l Define training set matrix

      S = ( f_1 - μ ,  f_2 - μ ,  ... ,  f_{L+1} - μ )

  and calculate scatter matrix

      R = Σ_l ( f_l - μ )( f_l - μ )^H = S S^H

Problem 1: Training set size should be L + 1 >> N


If L < N, scatter matrix R is rank-deficient
Problem 2: Finding eigenvectors of an NxN matrix.

n Can we find a small set of the most important eigenimages from a small training set L << N ?
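
A sketch of the construction above with placeholder data: random vectors stand in for the vectorized training images, the mean image is removed, and the rank deficiency of the resulting scatter matrix is confirmed.

```python
import numpy as np

# Placeholder training set: random vectors stand in for vectorized face images
N, num_images = 1024, 20                   # e.g., 32x32 pixels, L + 1 = 20 images
faces = np.random.rand(N, num_images)      # one column per training image

mu = faces.mean(axis=1, keepdims=True)     # mean image
S = faces - mu                             # training set matrix (mean removed), N x (L+1)
R = S @ S.T                                # N x N scatter matrix

# Rank of R equals rank of S: at most L, far below N -> R is rank-deficient
print(np.linalg.matrix_rank(S), "<<", N)
```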



Sirovich and Kirby algorithm
n Instead of eigenvectors of S S^H, consider the eigenvectors of S^H S, i.e.,

      S^H S v_i = λ_i v_i

n Premultiply both sides by S:

      S S^H ( S v_i ) = λ_i ( S v_i )

n By inspection, we find that S v_i are eigenvectors of S S^H

Sirovich and Kirby Algorithm (for L << N )


l Compute the LxL matrix S^H S
l Compute L eigenvectors v_i of S^H S
l Compute eigenimages corresponding to the L0 ≤ L largest eigenvalues
  as a linear combination of training images: S v_i

L. Sirovich and M. Kirby, "Low-dimensional procedure for the characterization of human faces,"
Journal of the Optical Society of America A, 4(3), pp. 519-524, 1987.
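
A sketch of the Sirovich and Kirby procedure on the same kind of placeholder data: the small LxL matrix S^H S is eigendecomposed and its eigenvectors are mapped back to eigenimages S v_i; the direct NxN eigendecomposition is computed here only as a sanity check and would normally be avoided.

```python
import numpy as np

N, num_images = 1024, 20
faces = np.random.rand(N, num_images)                      # placeholder training images
S = faces - faces.mean(axis=1, keepdims=True)              # centered training matrix, N x L

# Small eigenproblem: S^H S v_i = lambda_i v_i  (L x L instead of N x N)
lam, V = np.linalg.eigh(S.T @ S)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]

# Keep the L0 <= L dominant eigenvalues; S v_i are eigenvectors of S S^H (eigenimages)
keep = lam > 1e-10 * lam[0]
eigenimages = S @ V[:, keep]
eigenimages /= np.linalg.norm(eigenimages, axis=0, keepdims=True)   # unit-length columns

# Sanity check against the direct (expensive) N x N eigendecomposition
lam_direct = np.sort(np.linalg.eigvalsh(S @ S.T))[::-1][:keep.sum()]
print(np.allclose(lam[keep], lam_direct))
```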



Example: eigenfaces
n The first 8 eigenfaces obtained from a training set of 100 male and 100 female
training images

[Figure: mean face and eigenfaces 1-8]

n Can be used to generate faces by adjusting 8 coefficients.


n Can be used for face recognition by nearest-neighbor search in 8-d face space.
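
A sketch of nearest-neighbor recognition in the reduced face space (here J = 8 coefficients, as on the slide); the mean face, eigenface basis, database coefficients, and labels are all placeholders, and the helper names are made up for illustration.

```python
import numpy as np

def project(face, mean_face, eigenfaces):
    """Project a vectorized face onto the eigenfaces (columns of `eigenfaces`)."""
    return eigenfaces.T @ (face - mean_face)

def nearest_neighbor(query_coeffs, db_coeffs, db_labels):
    """Return the label of the database entry with the smallest Euclidean distance."""
    d = np.linalg.norm(db_coeffs - query_coeffs[:, None], axis=0)
    return db_labels[np.argmin(d)]

# Placeholder data: N-pixel faces, J = 8 eigenfaces, K database entries
N, J, K = 1024, 8, 200
mean_face = np.random.rand(N)
eigenfaces, _ = np.linalg.qr(np.random.randn(N, J))   # stand-in orthonormal N x J basis
db_coeffs = np.random.randn(J, K)                     # stored eigenface coefficients
db_labels = np.array(["female", "male"] * (K // 2))

query = np.random.rand(N)                             # new face image
print(nearest_neighbor(project(query, mean_face, eigenfaces), db_coeffs, db_labels))
```

A rejection step, as in the recognition pipeline shown earlier, could be added by thresholding the smallest distance.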



Gender recognition using eigenfaces
n Nearest-neighbor search in face space
[Figure: female and male face samples projected into face space]



Fisher linear discriminant analysis
n Eigenimage method maximizes scatter within the linear subspace over the
  entire image set, regardless of classification task

      W_opt = arg max_W ( det( W R W^H ) )

n Fisher linear discriminant analysis (1936): maximize between-class scatter,
  while minimizing within-class scatter

      Between-class scatter   R_B = Σ_{i=1}^{c} N_i ( μ_i - μ )( μ_i - μ )^H
                              (N_i samples in class i, with class mean μ_i; μ is the overall mean)

      Within-class scatter    R_W = Σ_{i=1}^{c} Σ_{l ∈ Class(i)} ( f_l - μ_i )( f_l - μ_i )^H

      W_opt = arg max_W ( det( W R_B W^H ) / det( W R_W W^H ) )



Fisher linear discriminant analysis (cont.)
n Solution: generalized eigenvectors w_i corresponding to the
  J largest eigenvalues {λ_i | i = 1, 2, ..., J}, i.e.,

      R_B w_i = λ_i R_W w_i ,   i = 1, 2, ..., J

n Equivalently (if R_W is nonsingular), solve the eigenproblem

      R_W^{-1} R_B w_i = λ_i w_i ,   i = 1, 2, ..., J

n Problem: within-class scatter matrix R_W has rank at most L-1 (for L images total
  in all classes combined), hence is usually singular.
n Apply KLT first to reduce dimensionality of the feature space to L-1 (or less),
  then proceed with Fisher LDA in the lower-dimensional space (see the sketch below).
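
A sketch of Fisher LDA on low-dimensional features, assuming the within-class scatter has already been made nonsingular (e.g., by the KLT step above). It builds R_B and R_W as defined on the previous slide and solves the generalized eigenproblem with scipy.linalg.eigh; the toy two-class data mirrors the 2-d example that follows.

```python
import numpy as np
from scipy.linalg import eigh   # symmetric-definite generalized eigensolver

def fisher_lda(X, labels, J=1):
    """X: d x L feature matrix (one column per sample); returns d x J matrix of Fisher directions."""
    d = X.shape[0]
    mu = X.mean(axis=1, keepdims=True)
    RB = np.zeros((d, d))
    RW = np.zeros((d, d))
    for cls in np.unique(labels):
        Xc = X[:, labels == cls]
        mu_c = Xc.mean(axis=1, keepdims=True)
        RB += Xc.shape[1] * (mu_c - mu) @ (mu_c - mu).T   # between-class scatter
        RW += (Xc - mu_c) @ (Xc - mu_c).T                 # within-class scatter
    # Generalized eigenproblem R_B w = lambda R_W w; keep the J largest eigenvalues
    lam, W = eigh(RB, RW)
    return W[:, np.argsort(lam)[::-1][:J]]

# Toy 2-class, 2-d example (compare with the Eigenimages vs. Fisherimages figure)
rng = np.random.default_rng(0)
X = np.hstack([rng.normal([0, 0], [3.0, 0.3], size=(100, 2)).T,
               rng.normal([1, 1], [3.0, 0.3], size=(100, 2)).T])
labels = np.array([0] * 100 + [1] * 100)
print(fisher_lda(X, labels, J=1).ravel())   # 1-d subspace that best separates the classes
```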




Eigenimages vs. Fisherimages
n 2-d example. Goal: project samples on a 1-d subspace, then perform classification.
n The KLT preserves maximum energy, but the 2 classes are no longer distinguishable.
n Fisher LDA separates the classes by choosing a better 1-d subspace.

[Figure: two classes of samples in the (f1, f2) plane, with the 1-d subspaces chosen by the KLT and by Fisher LDA]



Fisherimages and varying illumination
Differences due to varying illumination can be much larger than differences among faces!



Fisherimages and varying illumination
n All images of same Lambertian surface with different
  illumination (without shadows) lie in a 3d linear subspace
n Single point source at infinity:

      f(x,y) = a(x,y) ( l^T n(x,y) ) L

  with surface albedo a(x,y), surface normal n(x,y),
  light source direction l, and light source intensity L
n Superposition of an arbitrary number of point sources at infinity: still in the same 3d linear subspace, due to linear superposition of each contribution to the image (see the sketch below)
n Fisherimages can eliminate within-class scatter
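
A numerical sketch of the 3d-subspace claim: a made-up Lambertian scene (random albedo and normals, shadows ignored) is rendered under many distant point sources, and the rank of the stacked images comes out as 3.

```python
import numpy as np

rng = np.random.default_rng(1)
num_pixels, num_lights = 500, 50

albedo = rng.uniform(0.2, 1.0, num_pixels)                    # a(x, y)
normals = rng.normal(size=(num_pixels, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)     # unit surface normals n(x, y)
lights = rng.normal(size=(3, num_lights))                     # point sources at infinity (direction * intensity)

# One column per image: f = a(x,y) * (l^T n(x,y)); shadows ignored (no clipping at zero)
images = (albedo[:, None] * normals) @ lights

# All images lie in a 3-dimensional linear subspace
print(np.linalg.matrix_rank(images))   # -> 3
```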



Side Note: Photometric Stereo
n Observed intensity of a diffuse (Lambertian) surface:

      I = ρ ( L · N )

  with observed intensity I, normalized lighting direction L, normalized surface
  normal N, and (constant) albedo ρ. Diffuse (Lambertian) surfaces are viewpoint independent.
n Written out:

      I = ρ [ L_x  L_y  L_z ] [ N_x  N_y  N_z ]^T

n With three light sources L^(1), L^(2), L^(3) and observed intensities I^(1), I^(2), I^(3):

      [ I^(1) ]       [ L^(1)_x  L^(1)_y  L^(1)_z ] [ N_x ]
      [ I^(2) ]  =  ρ [ L^(2)_x  L^(2)_y  L^(2)_z ] [ N_y ]
      [ I^(3) ]       [ L^(3)_x  L^(3)_y  L^(3)_z ] [ N_z ]

  i.e.,  I = ρ L N

n Assume the albedo is constant and invert the matrix:  N = L^{-1} I
n Input: images of the surface under the different lighting directions; output: recovered normals

[Woodham 1980]
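
A sketch of this inversion for a single pixel, with synthetic data: given three known lighting directions, solve I = ρ L N and split the result into albedo and unit normal. With more than three lights, np.linalg.lstsq would replace the exact inverse.

```python
import numpy as np

rng = np.random.default_rng(2)

# Three known, normalized lighting directions (one per row) and a ground-truth pixel
L = rng.normal(size=(3, 3))
L /= np.linalg.norm(L, axis=1, keepdims=True)
n_true = np.array([0.2, -0.3, 0.93])
n_true /= np.linalg.norm(n_true)
albedo_true = 0.7

# Observed intensities for this pixel: I = albedo * L n
I = albedo_true * (L @ n_true)

# Invert: g = L^{-1} I = albedo * n, then split into albedo (length) and unit normal
g = np.linalg.solve(L, I)
albedo = np.linalg.norm(g)
n = g / albedo
print(albedo, n)   # recovers albedo_true and n_true
```
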
Fisherface trained to recognize gender

[Figure: mean image, female mean, and male mean; female and male face samples; the resulting Fisherface]



Gender recognition using 1st Fisherface

Error rate = 6.5%



Gender recognition using 1st eigenface

Error rate = 19.0%



Person identification with Fisherfaces and eigenfaces

AT&T Database of Faces


40 classes
10 images per class

