
Face recognition using PCA
DANG THE HUONG
VINH UNIVERSITY


CONTENTS
IDEA
OPERATIONS
MERITS
DEMERITS
APPLICATIONS


IDEA
PCA
Eigenfaces: the idea
Eigenvectors and eigenvalues
Learning eigenfaces from training sets of faces
Covariance
Recognition and reconstruction




PCA
PCA stands for Principal Component Analysis.

PCA was invented in 1901 by Karl Pearson.

PCA involves computing the eigenvalue decomposition of a data covariance matrix, or the singular value decomposition of a data matrix, usually after mean-centering the data for each attribute.
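The procedure just described can be sketched in Python with NumPy. The slides include no code, so the toy data below is a hypothetical stand-in; only the steps (mean-centering, covariance, eigendecomposition, projection) come from the text.

```python
import numpy as np

# Hypothetical toy data: 10 samples, 3 attributes.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))

# Mean-center each attribute (column), as the slide describes.
Xc = X - X.mean(axis=0)

# Eigenvalue decomposition of the data covariance matrix.
cov = Xc.T @ Xc / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort components by decreasing eigenvalue (variance explained).
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]

# Project the centered data onto the principal components.
scores = Xc @ components
print(scores.shape)  # (10, 3)
```

The same result can be obtained from a singular value decomposition of `Xc` directly, which is numerically preferable for large, thin data matrices.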


Algorithm
Three basic steps involved in PCA are:
Identification (by eigenfaces)
Recognition (matching eigenfaces)
Categorization (by grouping)


EIGENVECTORS
In digital image processing, we convert 2-D images into matrix form for analysis.
Every matrix can be represented with the help of its eigenvectors.
An eigenvector is a vector v that obeys the following rule:

Av = μv

where A is a matrix and μ is a scalar (called the eigenvalue).

e.g. one eigenvector of A = [2 3; 2 1] is v = [3; 2], since

[2 3; 2 1] [3; 2] = [12; 8] = 4 × [3; 2]

so for this eigenvector of this matrix the eigenvalue is μ = 4.
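The slide's worked example can be checked in a few lines of NumPy:

```python
import numpy as np

# The matrix and eigenvector from the slide's example.
A = np.array([[2.0, 3.0],
              [2.0, 1.0]])
v = np.array([3.0, 2.0])

# A v equals mu * v with mu = 4.
Av = A @ v
print(Av / v)  # [4. 4.]

# numpy finds all eigenpairs directly.
eigvals, eigvecs = np.linalg.eig(A)
print(np.sort(eigvals))  # eigenvalues -1 and 4
```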



EIGENFACES
Think of a face as being a weighted combination of some "component" or "basis" faces.

These basis faces are called eigenfaces.

[Figure: a face expressed as a weighted sum of basis faces, with example weights −8029, 2900, 1751, 1445, 4238, 6193.]


Eigenfaces: representing faces
Each training face is unrolled into a column vector of its N² pixel values:

a = [a1; a2; …; aN²],  b = [b1; b2; …; bN²],  c = [c1; c2; …; cN²],  d = [d1; d2; …; dN²],  e = [e1; e2; …; eN²],  f = [f1; f2; …; fN²], …


We compute the average face

m = (1/M) [a1 + b1 + … + h1; a2 + b2 + … + h2; …; aN² + bN² + … + hN²], where M = 8.


Then subtract it from the training faces:

am = [a1 − m1; a2 − m2; …; aN² − mN²],  bm = [b1 − m1; b2 − m2; …; bN² − mN²],  cm = [c1 − m1; c2 − m2; …; cN² − mN²],  dm = [d1 − m1; d2 − m2; …; dN² − mN²],
em = [e1 − m1; e2 − m2; …; eN² − mN²],  fm = [f1 − m1; f2 − m2; …; fN² − mN²],  gm = [g1 − m1; g2 − m2; …; gN² − mN²],  hm = [h1 − m1; h2 − m2; …; hN² − mN²]
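The averaging and subtraction steps above can be sketched in NumPy. The random "faces" here are hypothetical placeholders for the M = 8 training images:

```python
import numpy as np

# Hypothetical training set: M = 8 face images of n x n pixels,
# each unrolled into a column vector of N2 = n*n values.
rng = np.random.default_rng(1)
n, M = 16, 8
N2 = n * n
faces = rng.random((N2, M))  # columns a, b, ..., h from the slides

# The average face m (a length-N2 vector).
m = faces.mean(axis=1)

# Subtract the average face from every training face.
A = faces - m[:, None]  # columns am, bm, ..., hm

print(A.shape)  # (256, 8)
```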



Now we build the matrix A, which is N² by M:

A = [am bm cm dm em fm gm hm]

The covariance matrix, which is N² by N², is

Cov = A Aᵀ
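Building A and Cov = A Aᵀ is direct in NumPy. One practical note, not stated on the slides: A Aᵀ has at most M nonzero eigenvalues, so implementations usually eigendecompose the small M × M matrix Aᵀ A and map its eigenvectors back with u = A v, avoiding the huge N² × N² matrix.

```python
import numpy as np

# A is the N2-by-M matrix of mean-subtracted training faces
# (hypothetical values; in practice A comes from the previous step).
rng = np.random.default_rng(2)
N2, M = 256, 8
A = rng.standard_normal((N2, M))
A = A - A.mean(axis=1, keepdims=True)

# The N2-by-N2 covariance matrix Cov = A A^T.
Cov = A @ A.T
print(Cov.shape)  # (256, 256)

# Small-matrix trick (assumption, not from the slides): eigenvectors
# of A^T A (M-by-M) map to eigenvectors of A A^T via u = A v.
vals, vecs = np.linalg.eigh(A.T @ A)
U = A @ vecs                        # eigenvectors of A A^T
U = U / np.linalg.norm(U, axis=0)   # normalize each column
```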


The covariance matrix has eigenvectors and eigenvalues.

Eigenvectors with larger eigenvalues correspond to directions in which the data varies more.

Finding the eigenvectors and eigenvalues of the covariance matrix for a set of data is termed principal component analysis.


.617 .615 
C=

.615 .717 
 −.735
ν1 = 

 .678 

µ1 = 0.049

The covariance of two variables is:

n

.678
ν2 = 

.735

µ 2 = 1.284

cov( x1 , x2 ) =

i
i
(
x

x

)(
x
∑ 1 1 2 −x2 )
i =1

n −1
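The eigenpairs of the example matrix C can be verified numerically:

```python
import numpy as np

# The 2x2 covariance matrix from the slide.
C = np.array([[0.617, 0.615],
              [0.615, 0.717]])

# eigh returns eigenvalues in ascending order for symmetric matrices,
# with normalized eigenvectors in the columns of v.
mu, v = np.linalg.eigh(C)
print(mu)  # approx. 0.050 and 1.284, matching the slide up to rounding
print(v[:, 1])  # approx. [.678, .735] up to sign
```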


Recognition
A face image xk can be projected into this face space by

pk = Uᵀ(xk − m), where k = 1, …, M

To recognize a face r = [r1; r2; …; rN²]:

Subtract the average face from it:

rm = [r1 − m1; r2 − m2; …; rN² − mN²]

Compute its projection onto the face space U:

Ω = Uᵀ rm

Compute the distance in face space between the face and all known faces:

εi² = ‖Ω − Ωi‖²,  for i = 1…M

Compute the threshold:

θ = ½ max { ‖Ωi − Ωj‖² },  for i, j = 1…M
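The projection, distances εi², and threshold θ can be sketched as follows. Everything here (the random faces, the choice of K eigenfaces via SVD) is a hypothetical stand-in for the quantities defined on the slides:

```python
import numpy as np

# Hypothetical setup: M training faces of N2 pixels, K eigenfaces.
rng = np.random.default_rng(3)
N2, M, K = 256, 8, 4
faces = rng.random((N2, M))
m = faces.mean(axis=1)
A = faces - m[:, None]

# Left singular vectors of A are the eigenvectors of A A^T.
U = np.linalg.svd(A, full_matrices=False)[0][:, :K]

# Projections Omega_i of the known faces into face space.
Omegas = U.T @ A                      # K-by-M

# A query face r: subtract the mean, then project.
r = faces[:, 0] + 0.01 * rng.standard_normal(N2)
Omega = U.T @ (r - m)

# Squared distances eps_i^2 to all known faces, and the threshold
# theta = 1/2 * max_{i,j} ||Omega_i - Omega_j||^2.
eps2 = np.sum((Omegas - Omega[:, None]) ** 2, axis=0)
diffs = Omegas[:, :, None] - Omegas[:, None, :]
theta = 0.5 * np.max(np.sum(diffs ** 2, axis=0))

# Index of the nearest known face (face 0, which we perturbed).
print(int(np.argmin(eps2)))
```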


Distinguish between three cases:

If ξ ≥ θ, then it is not a face: the distance between the face and its reconstruction is larger than the threshold.

If ξ < θ and εi ≥ θ for all i = 1…M, then it is a new face: the distance in face space between it and all known faces is larger than the threshold.

If ξ < θ and min{εi} < θ, then it is a known face.


RECONSTRUCTION
The image is reconstructed in the third case, when ξ < θ and εi ≥ θ for i = 1…M.

Using MATLAB code, the original image and the reconstructed image are shown.

Ex:
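The slides show MATLAB output without the code; a minimal sketch of the reconstruction step in NumPy, assuming the mean face m and eigenface matrix U from the earlier steps, is:

```python
import numpy as np

# Hypothetical stand-ins for m, U, and a query face r.
rng = np.random.default_rng(4)
N2, M, K = 256, 8, 4
faces = rng.random((N2, M))
m = faces.mean(axis=1)
A = faces - m[:, None]
U = np.linalg.svd(A, full_matrices=False)[0][:, :K]

r = faces[:, 2]
Omega = U.T @ (r - m)    # project into face space
r_rec = m + U @ Omega    # reconstruct from K eigenfaces

# xi is the reconstruction error used in the decision rule.
xi = np.linalg.norm((r - m) - U @ Omega)
print(r_rec.shape)  # (256,)
```

Reshaping `r_rec` back to n × n gives the reconstructed image for display.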


MERITS
Relatively simple
Fast
Robust to expression (changes in feature location and shape)


DEMERITS
Variations in lighting conditions:
- Different lighting conditions for enrolment and query.
- Bright light causing image saturation.


APPLICATIONS
Various potential applications, such as:
- Person identification
- Human-computer interaction
- Security systems


Thank You


