Biometrics
http://scgwww.epfl.ch/courses
Dr. Andrzej Drygajlo, Speech Processing and Biometrics Group, Signal Processing Institute, École Polytechnique Fédérale de Lausanne (EPFL)
Face Recognition
The transformed PCA parameters are orthogonal. The PCA diagonalizes the covariance matrix, and the resulting diagonal elements are the variances of the transformed PCA parameters.
PCA
Advantages
- It completely decorrelates any data in the transform domain
- It packs the most energy (variance) into the fewest number of transform coefficients
- It minimizes the MSE (mean square error) between the reconstructed and original data for any specified level of data compression
- It minimizes the total entropy of the data
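These properties can be illustrated numerically. The following is a minimal NumPy sketch (not part of the original slides; the data and sizes are made up): it transforms correlated data, checks that the transform-domain covariance is diagonal, and shows that the variance concentrates in the first coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up correlated 5-D data, stored as a D x N matrix.
A = rng.normal(size=(5, 5))
x = A @ rng.normal(size=(5, 2000))
x0 = x - x.mean(axis=1, keepdims=True)       # mean-center

S = np.cov(x0)                               # covariance (scatter) matrix
v, W = np.linalg.eigh(S)                     # eigenvalues in ascending order
v, W = v[::-1], W[:, ::-1]                   # largest variance first

x_pca = W.T @ x0                             # decorrelated PCA parameters
S_pca = np.cov(x_pca)                        # diagonal: data are decorrelated

# Energy packing: fraction of total variance per PCA coefficient,
# in decreasing order.
energy = np.diag(S_pca) / np.diag(S_pca).sum()

# Keeping only the first K coefficients gives the reconstruction
# with the smallest MSE for that compression level.
K = 2
x_rec = W[:, :K] @ x_pca[:K]
mse = np.mean((x0 - x_rec) ** 2)
```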
Disadvantages
- There is no fast algorithm for its implementation
- The PCA is not a fixed transform, but has to be generated for each type of data statistics
- Considerable computational effort is involved in generating the eigenvalues and eigenvectors of the covariance matrices
PCA
The covariance (scatter) matrix of the data x, which encodes the variance and covariance of the data, is used in PCA to find the optimal rotation of the parameter space. PCA finds the eigenvectors and eigenvalues of the covariance matrix. These have the property that

W^T S W = V

where:
S - covariance (scatter) matrix
W = [w_1, w_2, ..., w_D] - matrix whose columns are the eigenvectors
V - transformed covariance matrix (diagonal scatter matrix of eigenvalues), with diag(V) = v = [v_1, v_2, ..., v_D] - the eigenvalues
Example:

    [  0.26   0.96 ] [ 14492.28  20760.14 ] [ 0.26  -0.96 ]   [ 302.84     0    ]
    [ -0.96   0.26 ] [ 20760.14  14492.28 ] [ 0.96   0.26 ] = [    0     94.40  ]
           W^T                  S                   W                  V
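A quick numerical check of W^T S W = V with NumPy (a sketch with illustrative numbers, since the matrix entries on the slide did not survive extraction cleanly):

```python
import numpy as np

# Illustrative symmetric scatter matrix (not the slide's exact numbers).
S = np.array([[9.0, 4.0],
              [4.0, 3.0]])

v, W = np.linalg.eigh(S)     # columns of W are the eigenvectors w_i
V = W.T @ S @ W              # should equal diag(v)

print(np.round(V, 6))
```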
Having found the eigenvectors and eigenvalues, the principal components are found by the following transformation:

x_PCA = W^T x
Example:

x_PCA,1 =  0.26 x_1 + 0.96 x_2
x_PCA,2 = -0.96 x_1 + 0.26 x_2

i.e.

    [ x_PCA,1 ]   [  0.26   0.96 ] [ x_1 ]
    [ x_PCA,2 ] = [ -0.96   0.26 ] [ x_2 ]
The eigenvectors give an idea of the importance of each of the original parameters in accounting for the variance in the data
A face image defines a point in the high-dimensional image space. Different face images share a number of similarities with each other:
- They can be described by a relatively low-dimensional subspace
- They can be projected into an appropriately chosen subspace of eigenfaces, and classification can be performed by similarity computation (distance)
[Figure: 2-D data scatter with the original axes x_1, x_2 and the rotated principal axes x_PCA,1, x_PCA,2]
Graphs from: C. Sanderson, On Local Features for Face Verification, IDIAP-RR 04-36
Suppose the data consist of M faces with D feature values each.

1) Place the data in a D x M matrix x (one face per column, with entries x_ij).
2) Mean-center the data: compute the D-dimensional mean m and set x_0 = x - m.
3) Compute the D x D covariance matrix C = x_0 x_0^T.
4) Compute the eigenvectors and eigenvalues of the covariance matrix.
5) Choose the K largest eigenvalues (K << D).
6) Form a D x K matrix W with K columns of eigenvectors.
7) The new coordinates x_PCA of the data (in PCA space) are obtained by projecting the data into the K-dimensional subspace: x_PCA = W^T (x - m).
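Steps 1)-7) can be sketched in NumPy (a sketch, not the course's code; the sizes and data below are made up):

```python
import numpy as np

def pca_train(x, K):
    """PCA following steps 1)-7): x is a D x M data matrix
    (one face per column); returns the mean, the D x K matrix W
    of eigenvectors, and the K x M PCA coordinates."""
    mu = x.mean(axis=1, keepdims=True)     # 2) D-dimensional mean
    x0 = x - mu                            #    mean-centered data
    C = x0 @ x0.T                          # 3) D x D covariance matrix
    vals, vecs = np.linalg.eigh(C)         # 4) eigenvectors and eigenvalues
    order = np.argsort(vals)[::-1][:K]     # 5) K largest eigenvalues (K << D)
    W = vecs[:, order]                     # 6) D x K eigenvector matrix
    x_pca = W.T @ x0                       # 7) projection into the K-dim subspace
    return mu, W, x_pca

# Tiny synthetic example: M = 8 "faces" with D = 50 features each.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 8))
mu, W, x_pca = pca_train(x, K=3)
```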
PCA seeks directions that are efficient for representing the data.
[Figure: data from Class A and Class B projected onto two directions — one direction is efficient for representing the data, the other is not]
The database
Each N x N face image in the database is flattened into an N²-dimensional column vector:

a = [a_1, a_2, ..., a_N²]^T,  b = [b_1, b_2, ..., b_N²]^T,  ...,  h = [h_1, h_2, ..., h_N²]^T
The mean face is computed over the database:

m = [m_1, m_2, ..., m_N²]^T = (1/M) [a_1 + b_1 + ... + h_1,  a_2 + b_2 + ... + h_2,  ...,  a_N² + b_N² + ... + h_N²]^T,  where M = 8
The mean-centered face vectors form the data matrix:

x = [a - m,  b - m,  c - m,  d - m,  e - m,  f - m,  g - m,  h - m]

The covariance matrix C = x x^T is of size N² x N²: the matrix is very large, and the computational effort is very big. Instead, one can work with the much smaller M x M matrix

S = x^T x
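The usual way to exploit this (a sketch of the standard small-matrix trick, which I assume is what the slide intends): if u is an eigenvector of the M x M matrix x^T x with eigenvalue lam, then x u is an eigenvector of the huge N² x N² matrix x x^T with the same eigenvalue, so the large matrix never has to be formed.

```python
import numpy as np

rng = np.random.default_rng(0)
D, M = 10000, 8                       # e.g. 100 x 100 images, M = 8 faces
x = rng.normal(size=(D, M))           # mean-centered face matrix (made up)

S = x.T @ x                           # small M x M matrix: cheap to build
lam, U = np.linalg.eigh(S)            # eigenvectors u_i of x^T x

# (x x^T)(x u) = x (x^T x u) = lam (x u): map each u_i up to the big space.
W = x @ U                             # D x M candidate eigenvectors
W /= np.linalg.norm(W, axis=0)        # normalize the columns

# Verify one eigenpair without ever forming the D x D matrix:
check = x @ (x.T @ W[:, -1])          # (x x^T) w, computed as two products
```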
Each entry of S is the inner product of two mean-centered face vectors, e.g.

S_ij = (d - m)^T (h - m),  for i, j = 1, 2, ..., M
To recognize a face, the new image is flattened into an N²-dimensional vector in the same way:

r = [r_1, r_2, ..., r_N²]^T
The vector is mean-centered and projected into the eigenface subspace:

x_PCA = W^T (r - m)

and compared with the stored projections of the database faces:

eps_i² = || x_PCA - x_PCA,i ||²,  for i = 1, 2, ..., M

The face is recognized as the database face i with the smallest distance eps_i.
The face can also be reconstructed from its PCA coefficients:

r_PCA = W x_PCA

with reconstruction error

eps² = || (r - m) - r_PCA ||²
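The recognition steps above can be sketched end to end in NumPy (the sizes, data, and variable names are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
D, M, K = 400, 8, 4                        # made-up sizes: 20x20 images, 8 faces

gallery = rng.normal(size=(D, M))          # the database of M face vectors
m = gallery.mean(axis=1, keepdims=True)    # mean face
x0 = gallery - m

# Eigenfaces: the K eigenvectors with the largest eigenvalues.
vals, vecs = np.linalg.eigh(x0 @ x0.T)
W = vecs[:, np.argsort(vals)[::-1][:K]]    # D x K eigenface matrix
gallery_pca = W.T @ x0                     # stored K-dim codes x_PCA,i

# Probe image r: a slightly noisy copy of database face number 3.
r = gallery[:, 3] + 0.01 * rng.normal(size=D)
x_pca = W.T @ (r - m[:, 0])                # x_PCA = W^T (r - m)

eps2 = np.sum((gallery_pca - x_pca[:, None]) ** 2, axis=0)   # eps_i^2
best = int(np.argmin(eps2))                # closest database face

# Reconstruction and its error (distance from the eigenface subspace).
r_pca = W @ x_pca                          # r_PCA = W x_PCA
eps2_rec = np.sum(((r - m[:, 0]) - r_pca) ** 2)
```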
Eigenfaces
[Figure: a face image expressed as a weighted sum of eigenfaces (e.g. 0.4 times the first eigenface + 0.2 times the second + ... + 0.6 times the last), and its approximation from only the first few terms (0.4 times the first + 0.2 times the second)]
Shortcomings:
- Eigenfaces do not distinguish between shape and appearance.
- PCA does not use class information: PCA projections are optimal for reconstruction from a low-dimensional basis, but they may not be optimal from a discrimination standpoint.
- Much of the variation from one image to the next is due to illumination changes. [Moses, Adini, Ullman]
- Different illumination
- Different head pose
- Different alignment
- Different facial expression
[Figure: database samples]

[Figure: the average face, the major (principal) eigenfaces, and the minor eigenfaces]