Face recognition and detection using Principal Component Analysis
PCA
KH Wong
Face recognition & detection using PCA v.4a 1
Overview
• PCA (Principal Component Analysis)
• Application to face detection and recognition
• Reference: [Bebis]
• Applications:
– 1) Face detection
– 2) Face recognition
PCA: Principal Component Analysis [1]
• A method of data compression
• Use less data in "b" to represent "a" but retain the important information
• E.g. N=10: 10 dimensions reduced to K=5 dimensions
• The task is to identify the K important parameters
• So recognition is easier.
b = [b_1; b_2; ...; b_K],  a = [a_1; a_2; ...; a_N],  where K < N
A detailed Christmas tree: N parameters.
A rough Christmas tree with K important parameters, where N > K.
Data reduction: N data → K data (N > K). How?
b = T a, where T is a K x N matrix and K < N. Written out, b, a and T are the simultaneous equations:

b_1 = t_11 a_1 + t_12 a_2 + ... + t_1N a_N
b_2 = t_21 a_1 + t_22 a_2 + ... + t_2N a_N
 :
b_K = t_K1 a_1 + t_K2 a_2 + ... + t_KN a_N

In matrix form:

[b_1]   [t_11 t_12 ... t_1N] [a_1]
[b_2] = [t_21 t_22 ... t_2N] [a_2]
[ : ]   [ :    :        :  ] [ : ]
[b_K]   [t_K1 t_K2 ... t_KN] [a_N]

i.e. b_(Kx1) = T_(KxN) a_(Nx1), with K < N.
Dimensionality basis
• The higher-dimensional space has basis u_1, u_2, ..., u_N (i = 1, 2, ..., N), the basis of the N-dimensional space:
x = a_1 u_1 + a_2 u_2 + ... + a_N u_N
• The lower-dimensional space has basis v_1, v_2, ..., v_K (i = 1, 2, ..., K), the basis of the K-dimensional space, K < N:
x̂ = b_1 v_1 + b_2 v_2 + ... + b_K v_K
• Note: if K = N then x̂ = x. We can make the vector x̂ ≈ x.
Condition for data reduction: data must not be random
• Pca.pdf
[Figure, left: axes x1, x2] Data are spread all over the 2D space, so the redundancy of using two axes (x1, x2) is low. Data reduction (compression) is difficult for random data.
[Figure, right: axes x1, x2 and axis u1 along the data] Data are spread along one line, so the redundancy of using two axes is high. We can consider using one axis (u1) along the spread of the data to represent it, although some error may be introduced. Compression is easy for non-random data.
The concept
• In this diagram, the data are not entirely random.
• Transform the data from (u1, u2) to (v1, v2).
• Approximation is done by ignoring the axis u2, because the variation of the data along that axis is small.
• We can use a one-dimensional space (basis v1) to represent the dots.
But some information may be lost, because x̂_j = x_j + error_j.
[Figure: axes u1, u2 and new axis v1] This point x_j is plotted according to the space with basis {u1, u2}; or x̂_j, according to the space with basis {v1}.
How to compress data?
• The method is to find a transformation T of the data from (u1, u2) space to (v1, v2) space and remove the v2 coordinate.
• The whole method is called Principal Component Analysis (PCA).
• The transformation T depends on the data; its rows are eigenvectors (of the data's covariance matrix).
PCA will enable information loss to be minimized
• Use the covariance matrix method to find the relation between the axes (u1, u2).
• Use the eigenvalue method to find the new axes.
PCA Algorithm: tutorial in [smith 2002]; proof is in the Appendix, from [Shlens 2005]
• Step 1: get the data
• Step 2: subtract the mean
• Step 3: find the covariance matrix C
• Step 4: find the eigenvectors and eigenvalues of C
• Step 5: choose the large feature components (the main axes).
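The five steps above can be sketched in NumPy, using the deck's running example data from [smith 2002] (this Python translation and its variable names are illustrative, not the original MATLAB):

```python
import numpy as np

# Step 1: get data (the running example from [smith 2002])
x = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])
y = np.array([2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9])

# Step 2: subtract the mean
X = np.column_stack([x - x.mean(), y - y.mean()])  # 10 x 2, zero-mean columns

# Step 3: find the covariance matrix C (denominator n-1, like MATLAB's cov)
C = np.cov(X, rowvar=False)

# Step 4: find the eigenvectors and eigenvalues of C (eigh: C is symmetric)
eigval, eigvec = np.linalg.eigh(C)

# Step 5: choose the large feature component, i.e. the eigenvector
# with the largest eigenvalue (the principal axis)
principal = eigvec[:, np.argmax(eigval)]
```

With this data, C reproduces the deck's cov_x = [0.6166 0.6154; 0.6154 0.7166] and the largest eigenvalue 1.2840.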
Some math background
• Mean
• Variance / standard deviation
• Covariance
• Covariance matrix
Mean, variance (var) and standard_deviation (std)
• x = [2.5 0.5 2.2 1.9 3.1 2.3 2.0 1.0 1.5 1.1]'
• mean_x = 1.8100, var_x = 0.6166, std_x = 0.7852
mean(x) = (1/n) Σ_{i=1..n} x_i
var(x) = (1/(n-1)) Σ_{i=1..n} (x_i - mean(x))^2
std(x) = sqrt(var(x))
%matlab code
x=[2.5 0.5 2.2 1.9 3.1 2.3 2 1 1.5 1.1]'
mean_x=mean(x)
var_x=var(x)
std_x=std(x)
N or N-1 as denominator? See
http://stackoverflow.com/questions/3256798/why-does-matlab-native-function-cov-covariance-matrix-computation-use-a-differe
• "n-1 is the correct denominator to use in computation of variance. It is what's known as Bessel's correction" (http://en.wikipedia.org/wiki/Bessel%27s_correction). Simply put, 1/(n-1) produces a more accurate estimate of the expected variance than 1/n.
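The two denominators can be compared directly in NumPy, which (unlike MATLAB's var) defaults to the 1/n estimate (an illustrative sketch, not from the deck):

```python
import numpy as np

x = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])

var_n  = np.var(x)          # divides by n (population variance)
var_n1 = np.var(x, ddof=1)  # divides by n-1 (Bessel's correction; matches MATLAB's var)
```

Here var_n1 reproduces the slide's var_x = 0.6166, while var_n gives the smaller 1/n value (0.5549).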
Class exercise 1
By computer (MATLAB):
• x=[1 3 5 10 12]'
• mean(x), var(x), std(x)
• Mean(x) = 6.2000
• Variance(x) = 21.7000
• Standard deviation = 4.6583
By hand:
• x=[1 3 5 10 12]'
• mean = ?
• Variance = ?
• Standard deviation = ?
%class exercise1
x=[1 3 5 10 12]'
mean(x)
var(x)
std(x)
Answer 1:
By computer (MATLAB):
• x=[1 3 5 10 12]'
• mean(x), var(x), std(x)
• Mean(x) = 6.2000
• Variance(x) = 21.7000
• Standard deviation = 4.6583
By hand:
• x=[1 3 5 10 12]'
• mean = (1+3+5+10+12)/5 = 6.2
• Variance = ((1-6.2)^2+(3-6.2)^2+(5-6.2)^2+(10-6.2)^2+(12-6.2)^2)/(5-1) = 21.7
• Standard deviation = sqrt(21.7) = 4.6583
Covariance [see wolfram mathworld]
• “Covariance is a measure of the extent to which corresponding elements from two sets of ordered data move in the same direction.”
• http://stattrek.com/matrix-algebra/variance.aspx
X: x_1, ..., x_n;  Y: y_1, ..., y_n
covariance(X,Y) = (1/(N-1)) Σ_{i=1..N} (x_i - x̄)(y_i - ȳ)
Covariance (variance-covariance) matrix
"Variance-Covariance Matrix: Variance and covariance are often displayed together in a variance-covariance matrix. The variances appear along the diagonal and covariances appear in the off-diagonal elements", http://stattrek.com/matrix-algebra/variance.aspx
Assume you have C sets of data X_1, X_2, ..., X_C (columns c = 1, 2, ..., C). Each X_c = [x_{1,c}, x_{2,c}, ..., x_{N,c}]' has N entries, with mean X̄_c = mean(X_c).

covariance_matrix(X_1, ..., X_C) = (1/(N-1)) *
[ Σ_i (x_{i,1}-X̄_1)(x_{i,1}-X̄_1)   Σ_i (x_{i,1}-X̄_1)(x_{i,2}-X̄_2)   ...   Σ_i (x_{i,1}-X̄_1)(x_{i,C}-X̄_C) ]
[ Σ_i (x_{i,2}-X̄_2)(x_{i,1}-X̄_1)   Σ_i (x_{i,2}-X̄_2)(x_{i,2}-X̄_2)   ...   Σ_i (x_{i,2}-X̄_2)(x_{i,C}-X̄_C) ]
[ :                                   :                                         :                                 ]
[ Σ_i (x_{i,C}-X̄_C)(x_{i,1}-X̄_1)   Σ_i (x_{i,C}-X̄_C)(x_{i,2}-X̄_2)   ...   Σ_i (x_{i,C}-X̄_C)(x_{i,C}-X̄_C) ]

where each sum runs over i = 1..N.
Covariance matrix example 1: A is 4x3
• From MATLAB >help cov. Consider A = [-1 1 2; -2 3 1; 4 0 3; 1 2 0].
• To obtain a vector of variances for each column of A: v = diag(cov(A))' gives v = 7.0000 1.6667 1.6667
• Compare vector v with the covariance matrix C = cov(A) = [7.0000 -2.6667 1.6667; -2.6667 1.6667 -1.3333; 1.6667 -1.3333 1.6667]
• i.e. take the first column of A: a=[-1,-2,4,1]'; a2=a-mean(a) = [-1,-2,4,1]'-0.5 = [-1.5000, -2.5000, 3.5000, 0.5000]'
• cov([-1,-2,4,1]') = cov(a) = a2'*a2/(N-1) = [-1.5000,-2.5000,3.5000,0.5000]*[-1.5000,-2.5000,3.5000,0.5000]'/(4-1) = 7
• The diagonals are the variances of the columns.
• Covariance of the first and second columns: cov([-1,-2,4,1]',[1,3,0,2]') = [7.0000 -2.6667; -2.6667 1.6667]
• Also cov([1,3,0,2]',[2,1,3,0]') = [1.6667 -1.3333; -1.3333 1.6667]
Covariance matrix example 2: A is 3x3
• From MATLAB >help cov. Consider A = [-1 1 2; -2 3 1; 4 0 3].
• To obtain a vector of variances for each column of A: v = diag(cov(A))' gives v = 10.3333 2.3333 1.0000
• Compare vector v with the covariance matrix C = [10.3333 -4.1667 3.0000; -4.1667 2.3333 -1.5000; 3.0000 -1.5000 1.0000]
• i.e. take the first column of A: a=[-1,-2,4]'; a2=a-mean(a) = [-1,-2,4]'-0.333 = [-1.3333 -2.3333 3.6667]'
• cov([-1,-2,4]') = cov(a) = a2'*a2/(N-1) = [-1.3333 -2.3333 3.6667]'*[-1.3333 -2.3333 3.6667]/(3-1) = 10.3333
• The diagonals are the variances of the columns.
• Covariance of the first and second columns: cov([-1 -2 4]',[1 3 0]') = [10.3333 -4.1667; -4.1667 2.3333]
• Also cov([1 3 0]',[2 1 3]') = [2.3333 -1.5000; -1.5000 1.0000]
Covariance matrix example (continued)
• From MATLAB >help cov. Consider A = [-1 1 2; -2 3 1; 4 0 3]. v = diag(cov(A))' gives v = 10.3333 2.3333 1.0000, and C = [10.3333 -4.1667 3.0000; -4.1667 2.3333 -1.5000; 3.0000 -1.5000 1.0000]. N = 3, because A is 3x3.
• i.e. take the first column of A: a=[-1,-2,4]'; a2=a-mean(a) = [-1.3333 -2.3333 3.6667]'
• b=[1 3 0]'; b2=b-mean(b) = [-0.3333, 1.6667, -1.3333]'
• a2'*b2/(N-1) = [-1.3333 -2.3333 3.6667]*[-0.3333, 1.6667, -1.3333]'/(3-1) = -4.1667
• c=[2 1 3]'; c2=c-mean(c) = [2 1 3]'-2 = [0 -1 1]'
• a2'*c2/(N-1) = [-1.3333 -2.3333 3.6667]*[0 -1 1]'/(3-1) = 3
• b2'*b2/(N-1) = [-0.3333, 1.6667, -1.3333]*[-0.3333, 1.6667, -1.3333]'/(3-1) = 2.3333
• b2'*c2/(N-1) = [-0.3333, 1.6667, -1.3333]*[0 -1 1]'/(3-1) = -1.5
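The same checks can be reproduced in NumPy, where np.cov with rowvar=False treats each column as a variable, like MATLAB's cov (an illustrative translation, not the deck's code):

```python
import numpy as np

A = np.array([[-1., 1., 2.],
              [-2., 3., 1.],
              [ 4., 0., 3.]])

C = np.cov(A, rowvar=False)   # 3x3 covariance matrix, denominator N-1
v = np.diag(C)                # per-column variances (the diagonal of C)

# Cross-covariance of columns 1 and 2, computed by hand as in the slide
a2 = A[:, 0] - A[:, 0].mean()
b2 = A[:, 1] - A[:, 1].mean()
c12 = a2 @ b2 / (len(a2) - 1)
```

c12 matches C[0, 1] = -4.1667, confirming the hand computation above.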
Eigen vector of a square matrix
covariance_matrix of X = cov_x = [0.6166 0.6154; 0.6154 0.7166]
eigvect of cov_x = [-0.7352 0.6779; 0.6779 0.7352]
eigval of cov_x = [0.0492 0; 0 1.2840]
Because cov_x is 2x2 and of rank 2, cov_x * X = λX has 2 eigenvalues and 2 eigenvectors. In MATLAB: [eigvec,eigval]=eig(cov_x)
So eigenvalue 1 = 0.0492, and its eigenvector is [-0.7352 0.6779]'; eigenvalue 2 = 1.2840, and its eigenvector is [0.6779 0.7352]'.
To find the eigenvalues of a square matrix
For A = [a b; c d], solve det(A - λI) = 0, i.e. the quadratic equation
λ^2 - (a+d)λ + (ad - bc) = 0
The solution to this quadratic equation is
λ = [ (a+d) ± sqrt( (a+d)^2 - 4(ad - bc) ) ] / 2
So if A = [0.6166 0.6154; 0.6154 0.7166]:
λ = [ (0.6166+0.7166) ± sqrt( (0.6166+0.7166)^2 - 4*(0.6166*0.7166 - 0.6154*0.6154) ) ] / 2
The eigenvalues are λ1 = 0.0492 and λ2 = 1.2840.
A [x_1; x_2] = λ [x_1; x_2], with A = [a b; c d]:
a x_1 + b x_2 = λ x_1  →  x_2 = (λ - a) x_1 / b   ...(i)
c x_1 + d x_2 = λ x_2  →  c x_1 = (λ - d) x_2    ...(ii)
Putting (i) into (ii): c x_1 = (λ - d)(λ - a) x_1 / b
→ (λ - a)(λ - d) - bc = 0
→ λ^2 - (a + d)λ + (ad - bc) = 0
What is an eigenvector?
• AX = λX (by definition), with A = [a b; c d]
• λ is the eigenvalue, a scalar. X = [x1; x2]
• The direction of the eigenvectors of A is not changed by the transformation A.
• If A is 2 by 2, there are 2 eigenvalues and 2 eigenvectors.
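The definition can be checked numerically; a small NumPy sketch using the slide's 2x2 covariance matrix (illustrative, not part of the deck):

```python
import numpy as np

A = np.array([[0.6166, 0.6154],
              [0.6154, 0.7166]])

eigval, eigvec = np.linalg.eig(A)   # columns of eigvec are the eigenvectors

# Verify the defining property A X = lambda X for each eigenpair
for lam, X in zip(eigval, eigvec.T):
    assert np.allclose(A @ X, lam * X)
```

The two eigenvalues come out as 0.0492 and 1.2840, matching the slides.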
Find eigenvectors from eigenvalues: λ1 = 0.0492, λ2 = 1.2840. For λ1:
The eigenvector [x_1; x_2] for λ1 satisfies
[0.6166 0.6154; 0.6154 0.7166] [x_1; x_2] = 0.0492 [x_1; x_2]
Solving the above equation, the eigenvector for eigenvalue λ1 = 0.0492 is [x_1; x_2] = [-0.7352; 0.6779].
That means [0.6166 0.6154; 0.6154 0.7166] [-0.7352; 0.6779] = 0.0492 [-0.7352; 0.6779]:
the direction of [-0.7352; 0.6779] will not be changed by [0.6166 0.6154; 0.6154 0.7166].
Find eigenvectors from eigenvalues: λ1 = 0.0492, λ2 = 1.2840. For λ2:
The eigenvector [x̃_1; x̃_2] for λ2 satisfies
[0.6166 0.6154; 0.6154 0.7166] [x̃_1; x̃_2] = 1.2840 [x̃_1; x̃_2]
Solving the above equation, the eigenvector for eigenvalue λ2 = 1.2840 is [x̃_1; x̃_2] = [0.6779; 0.7352].
That means [0.6166 0.6154; 0.6154 0.7166] [0.6779; 0.7352] = 1.2840 [0.6779; 0.7352]:
the direction of [0.6779; 0.7352] will not be changed by [0.6166 0.6154; 0.6154 0.7166].
Cov numerical example (pca_test1.m, in appendix)
Step 2: X_data_adj = X = Xo - mean(Xo) = [x1 x2] =
[0.6900 0.4900; -1.3100 -1.2100; 0.3900 0.9900; 0.0900 0.2900; 1.2900 1.0900; 0.4900 0.7900; 0.1900 -0.3100; -0.8100 -0.8100; -0.3100 -0.3100; -0.7100 -1.0100]
Mean is (0, 0).
Step 1: original data Xo = [xo1 xo2] =
[2.5000 2.4000; 0.5000 0.7000; 2.2000 2.9000; 1.9000 2.2000; 3.1000 3.0000; 2.3000 2.7000; 2.0000 1.6000; 1.0000 1.1000; 1.5000 1.6000; 1.1000 0.9000]
Mean = (1.81, 1.91), not (0, 0).
Step 3: covariance matrix of X = cov_x = [0.6166 0.6154; 0.6154 0.7166]
Step 4: eigvects of cov_x = [-0.7352 0.6779; 0.6779 0.7352]; eigvals of cov_x = [0.0492 0; 0 1.2840] (small eigenvalue, large eigenvalue)
[Figure: data in the (x1, x2) plane with new axes (x'1, x'2); the eigenvector with the large eigenvalue lies along the spread of the data, the eigenvector with the small eigenvalue across it.]
The data is biased in this 2D space (not random), so PCA for data reduction will work. We will show that X can be approximated in a 1-D space with a small data loss.
Step 5: choose the eigenvector (large feature component) with the large eigenvalue for the transformation to reduce data.
X' = T X  (X = original mean-adjusted data; X' = transposed new data; each row of T is an eigenvector of cov(X)).
You have two choices:
T_fully_rec = [0.6779 0.7352; -0.7352 0.6779]  (row 1: eigenvector with the biggest eigenvalue of the covariance of transposed X; row 2: eigenvector with the second biggest eigenvalue)
T_approx_rec = [0.6779 0.7352; 0 0]  (keep only the eigenvector with the biggest eigenvalue)
eigvect of cov_x = -0.7352 0.6779 0.6779 0.7352
eigval of cov_x = 0.0492 0 0 1.2840 Small eigen value
Large eigen value
Eigen vector with small eigen value
Eigen vector with Large eigen value
Covariance matrix of Xcov_x =
0.6166 0.6154 0.6154 0.7166
Full reconstruction case: for comparison only, no data lost.
The PCA algorithm will select this approximate transform P_approx_rec for data reduction.
• X'_Fully_reconstructed (use 2 eigenvectors): X'_full (two columns are filled) =
[0.8280 -0.1751; -1.7776 0.1429; 0.9922 0.3844; 0.2742 0.1304; 1.6758 -0.2095; 0.9129 0.1753; -0.0991 -0.3498; -1.1446 0.0464; -0.4380 0.0178; -1.2238 -0.1627]
{No data lost; for comparison only.}
• X'_Approximate_reconstructed (use 1 eigenvector): X'_approx (the second column is 0) =
[0.8280 0; -1.7776 0; 0.9922 0; 0.2742 0; 1.6758 0; 0.9129 0; -0.0991 0; -1.1446 0; -0.4380 0; -1.2238 0]
{Data reduction 2D → 1D; some data loss exists.}
X' = T X, where
T_fully_rec = [0.6779 0.7352; -0.7352 0.6779]  (row 1: eigenvector with the biggest eigenvalue of the covariance of transposed X; row 2: second biggest)
T_approx_rec = [0.6779 0.7352; 0 0]
What is the meaning of reconstruction?
[Figure: reconstruction in the (x1, x2) plane with axes (x'1, x'2).]
'+' = full reconstruction using all components (x'1, x'2) of X': X_full = [0.6779 -0.7352; 0.7352 0.6779] X'
[x'1 x'2] = [0.8280 -0.1751; -1.7776 0.1429; 0.9922 0.3844; 0.2742 0.1304; 1.6758 -0.2095; 0.9129 0.1753; -0.0991 -0.3498; -1.1446 0.0464; -0.4380 0.0178; -1.2238 -0.1627]
'o' are the original true values; 'o' and '+' overlap 100%.
Green squares = approximate reconstruction using only component (x'1) of X': X_approx = T_approx_rec' X'
[x'1 x'2] = [0.8280 0; -1.7776 0; 0.9922 0; 0.2742 0; 1.6758 0; 0.9129 0; -0.0991 0; -1.1446 0; -0.4380 0; -1.2238 0]
X_data_adj = X = Xo - mean(Xo) = [x1 x2] = [0.6900 0.4900; -1.3100 -1.2100; 0.3900 0.9900; 0.0900 0.2900; 1.2900 1.0900; 0.4900 0.7900; 0.1900 -0.3100; -0.8100 -0.8100; -0.3100 -0.3100; -0.7100 -1.0100]; mean is (0, 0).
X_reconstructed_approx = P_approx_rec' * Y_approx, because
X = P' X': since X' = P X and the rows of P are orthonormal eigenvectors, P' = P^{-1}, so X = P' Y = P^{-1} Y.
T_fully_rec = [0.6779 0.7352; -0.7352 0.6779]  (rows: eigenvectors of the covariance of transposed X, biggest eigenvalue first)
T_approx_rec = [0.6779 0.7352; 0 0]
Eigenvector with small eigenvalue (blue, too small to be seen).
'+' = recovered using all eigenvectors: same as the original, so no loss of information.
'□' = recovered using the one eigenvector that has the biggest eigenvalue (the principal component): some loss of information.
'o' = original data.
Eigenvector with large eigenvalue (red).
X_reconstructed_full = P_full_rec' * Y_full
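The full-versus-approximate reconstruction can be verified numerically; a NumPy sketch of the same experiment (names are illustrative, not the deck's MATLAB):

```python
import numpy as np

x = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])
y = np.array([2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9])
X = np.column_stack([x - x.mean(), y - y.mean()]).T   # 2 x 10, zero-mean rows

eigval, eigvec = np.linalg.eigh(np.cov(X))  # ascending eigenvalues
P_full = eigvec[:, ::-1].T                  # rows = eigenvectors, largest first
P_approx = P_full.copy()
P_approx[1, :] = 0                          # drop the minor component

Y = P_full @ X                              # transformed data X'
X_full = P_full.T @ Y                       # full reconstruction (P' P = I)
X_approx = P_full.T @ (P_approx @ X)        # 1-D approximate reconstruction

err_full = np.abs(X_full - X).max()         # ~0: no information lost
err_approx = np.abs(X_approx - X).max()     # small but non-zero: some loss
```

The full reconstruction recovers X exactly (up to floating point), while the one-component approximation leaves a small residual, as the plots show.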
Some other test results using pca_test1.m (see appendix)
Left: when x and y change together, the first eigenvector is longer than the second one.
• x=[1.0 3.0 5.0 7.0 9.0 10.0]'; y=[1.1 3.2 5.8 6.8 9.3 10.3]'
Right: similar to the left case, but a slight difference at (x=5.0, y=7.8) makes the second eigenvector a little bigger.
• x=[1.0 3.0 5.0 7.0 9.0 10.0]'; y=[1.1 3.2 7.8 6.8 9.3 10.3]'
x=rand(6,1); y=rand(6,1);
Random data: the two eigenvectors have similar lengths.
Correlated data: one eigenvector is much larger than the second one (the second one is too small to be seen).
Correlated data with some noise: one eigenvector is larger than the second one.
PCA algorithm
Input data: x_1, x_2, ..., x_M are N x 1 vectors.
Step 1: mean x̄ = (1/M) Σ_{i=1..M} x_i
Step 2: subtract the mean: Φ_i = x_i - x̄
Step 3: form the N x M matrix A = [Φ_1 Φ_2 ... Φ_M], then find the covariance matrix
C = (1/(M-1)) Σ_{j=1..M} Φ_j Φ_j^T = (1/(M-1)) A A^T
Step 4: find the eigenvalues of C: λ_1 > λ_2 > ... > λ_N
Step 5: find the eigenvectors of C: u_1, u_2, ..., u_N
Continue
Since C is symmetric, u_1, u_2, ..., u_N form a basis, i.e. any vector x can actually be written as
x - x̄ = b_1 u_1 + b_2 u_2 + ... + b_N u_N = Σ_{i=1..N} b_i u_i
Step 6 (dimensionality reduction step): keep only the terms corresponding to the K largest eigenvalues:
x̂ - x̄ = Σ_{i=1..K} b_i u_i, where K << N
The representation of x̂ - x̄ in the basis u_1, u_2, ..., u_K is thus [b_1; b_2; ...; b_K].
PCA
• Space dimension N reduced to dimension K• Ui are normalized unit vectors
[b_1; b_2; ...; b_K]_(Kx1) = [u_1^T; u_2^T; ...; u_K^T]_(KxN) (x - x̄)_(Nx1),  i.e.  b = U^T (x - x̄)
Geometric interpretation
– PCA transforms the coordinates along the spread of the data. Here the axes are u1 and u2.
– The coordinates are determined by the eigenvectors of the covariance matrix corresponding to the largest eigenvalues.
– The magnitude of the eigenvalues corresponds to the variance of the data along the eigenvector directions.
Choose K
• Choose K such that Σ_{i=1..K} λ_i / Σ_{i=1..N} λ_i > Threshold (e.g. 0.95). A threshold of 0.95 will preserve 95% of the information.
• The error is e = (1/2) Σ_{i=K+1..N} λ_i
• If K = N, 100% will be preserved (no data reduction).
• Before you use PCA, data standardization is needed:
x_i ← (x_i - mean(x_i)) / std(x_i), where std = standard deviation.
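A small sketch of the K-selection rule above; the eigenvalue spectrum here is made up for illustration (not data from the deck):

```python
import numpy as np

def choose_k(eigvals, threshold=0.95):
    """Smallest K whose top-K eigenvalues keep `threshold` of the total variance.
    `eigvals` must be sorted in descending order."""
    ratio = np.cumsum(eigvals) / np.sum(eigvals)
    return int(np.searchsorted(ratio, threshold) + 1)

eigvals = np.array([5.0, 2.5, 1.0, 0.3, 0.1, 0.05, 0.03, 0.02])  # hypothetical
K = choose_k(eigvals)   # here the first 4 eigenvalues keep >= 95% of the variance
```

Setting threshold=1.0 would return K = N, i.e. no data reduction, matching the slide.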
Application1 for face detection
• Step 1: obtain the training face images I_1, I_2, ..., I_M (centered and of the same size).
• Each image is represented by a vector: each image (an Nr x Nc matrix) becomes an (Nr·Nc) x 1 vector.
Vector length: Nr x Nc = 112 x 92 = 10304.
Nc = Ncolumn = 92, Nr = Nrow = 112; scanning starts at pixel (1,1).
Linearization example
An input data vector Γ(i) is a vector of Ntotal x 1 = 10304 x 1 elements, where Ntotal = Nr x Nc = 112 x 92 = 10304.
[Figure: a 2D array of Nr x Nc = 112 x 92 pixels, pixel = I(x,y), with X = 92 and Y = 112, stacked into the vector column by column: I(x=1,y=1), I(x=2,y=1), I(x=3,y=1), ...]
Collect many faces, for example M=300
• Each image has Ntotal = Nr x Nc = 112 x 92 pixels.
• Linearize each image: it becomes an input data vector Γ_i of size Ntotal x 1 = 10304 x 1.
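The linearization step (and its inverse) is just a reshape; a NumPy sketch (MATLAB reshapes column-major while NumPy defaults to row-major, so order='F' mimics MATLAB's behavior; the image here is a stand-in, not real face data):

```python
import numpy as np

Nr, Nc = 112, 92                          # rows x columns of each face image
img = np.arange(Nr * Nc).reshape(Nr, Nc)  # stand-in for a face image

# Linearize: stack pixels column by column into an Ntotal x 1 vector
vec = img.reshape(-1, 1, order='F')       # 10304 x 1, like MATLAB's img(:)

# Reshape back to an image (MATLAB: reshape(vec, Nr, Nc))
img_back = vec.reshape(Nr, Nc, order='F')
```

The round trip is lossless: img_back is identical to img.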
http://www.cedar.buffalo.edu/~govind/CSE666/fall2007/biometrics_face_detection.pdf
Continue (a special trick to make it efficient)
• Collect training data (M = 300 faces; each face image is 92 x 112 = 10304 pixels).
• Linearize each image: it becomes an input data vector Γ_i of size Ntotal x 1 = 10304 x 1.
• Find the covariance matrix C from the Γ_i as below.
Input data: M vectors Γ_1, Γ_2, ..., Γ_M, each N^2 x 1 (because each face is N^2 x 1 and there are M faces).
Step 3: the mean Ψ = (1/M) Σ_{i=1..M} Γ_i
Step 4: subtract the mean: Φ_i = Γ_i - Ψ
Step 5: construct the N^2 x M matrix A = [Φ_1 Φ_2 ... Φ_M], and find the covariance matrix of the Φ_i:
C = (1/(M-1)) Σ_{j=1..M} Φ_j Φ_j^T = (1/(M-1)) A A^T   (size N^2 x N^2)
A is Ntotal x M, e.g. 10,304 x 300 (Ntotal = 10,304 rows, M = 300 columns; column i comes from face Γ_i, i = 1..300).
C = covariance of A is Ntotal x Ntotal, e.g. 10304 x 10304: too large!
Continue: but C (size Ntotal x Ntotal = 10304 x 10304) is too large to be calculated, since Ntotal = Nr x Nc = 112 x 92 = 10304.
Step 6: find the eigenvectors u_i of C = (1/(M-1)) A A^T. Its size, Ntotal x Ntotal = 10304 x 10304 (Ntotal = 92 x 112), is too large to be calculated in limited time (e.g. with 300 training images, A is Ntotal x M = 10304 x 300).
---The trick is as follows:---
Step 6.1: find the eigenvectors of A^T A (size M x M = 300 x 300) instead of the eigenvectors of C = A A^T (size Ntotal x Ntotal = 10304 x 10304).
Step 6.2: what is the relation between u_i = eigen_vectors(A A^T), size Ntotal x Ntotal (10304 x 10304), and v_i = eigen_vectors(A^T A), size M x M (300 x 300)?
Continue
Answer (relation of u_i and v_i):
What is the relation between u_i = eigen_vectors(A A^T) (10304 x 10304) and v_i = eigen_vectors(A^T A) (300 x 300)?
From (i): A^T A v_i = μ_i v_i, so μ_i is an eigenvalue (a scalar) of A^T A.
Multiply each side of (i) by A: (ii) A (A^T A) v_i = A μ_i v_i, i.e. (A A^T)(A v_i) = μ_i (A v_i), since μ_i is a scalar.
Compare with (iii): (A A^T) u_i = λ_i u_i.
So A A^T and A^T A have the same eigenvalues, and their eigenvectors are related by u_i = A v_i. (Important result)
Important results (e.g Ntotal=10304, M=300)
• (A A^T), size 10304 x 10304, has Ntotal = 10304 eigenvectors and eigenvalues.
• (A^T A), size 300 x 300, has M = 300 eigenvectors and eigenvalues.
• The M eigenvalues of (A^T A) are the same as the M largest eigenvalues of (A A^T).
Important result: u_i = A v_i
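The trick is easy to verify on a small random matrix; tiny sizes stand in for 10304 and 300 here (an illustrative sketch, not the deck's code):

```python
import numpy as np

rng = np.random.default_rng(0)
Ntotal, M = 50, 5                     # tiny stand-ins for 10304 and 300
A = rng.standard_normal((Ntotal, M))

# Solve the small M x M problem instead of the huge Ntotal x Ntotal one
mu, V = np.linalg.eigh(A.T @ A)       # the M eigenpairs of A'A

# u_i = A v_i is an eigenvector of A A' with the same eigenvalue
for lam, v in zip(mu, V.T):
    u = A @ v
    assert np.allclose((A @ A.T) @ u, lam * u)
```

The remaining Ntotal - M eigenvalues of A A' are zero, which is why only the M vectors from the small problem matter.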
Continue
Step 6.3: find the best M = 300 eigenvectors u_{i=1,2,...,300} and the M = 300 eigenvalues of (A^T A), size M x M (300 x 300).
---The first M eigenvalues of A A^T and the eigenvalues of A^T A are the same.---
Also u_i = A v_i for i = 1, 2, ..., M. Normalize each u_i so that ||u_i|| = 1.
Step 7: only the K (e.g. K = 5) biggest eigenvectors (eigenvalues) are useful in most cases.
Steps for training
• Train from M = 300 face images.
• Find the largest K eigenvectors, e.g. {u_1, u_2, ..., u_{K=5}}.
• For each u_i (a vector of size 10304 x 1), re-shape it back to a 112 x 92 image, i.e. convert it back to an image. (In MATLAB use the function reshape.)
u_i = A v_i
Eigenfaces for face recognition
• For each face, find the K (e.g. K = 5) face images (called eigenfaces) corresponding to the first K eigenvectors (u_j) with the largest eigenvalues.
• Use the eigenfaces as parameters for face detection.
Φ̂_i = x̂_i - mean = Σ_{j=1..K} w_j u_j, where w_j = u_j^T Φ_i and the u_j are the eigenfaces.
http://onionesquereality.files.wordpress.com/2009/02/eigenfaces-reconstruction.jpg
Application 1: Face detection using PCA
• Use a face database to form Ω (the trained eigenface representation) as described in the last slide.
• Scan the input picture at different scales and positions, pick up windows, and rescale each to some convenient size, e.g. 112 x 92 pixels = W(i).
– W(i) → eigenface (biggest 5 eigenvectors) representation Ω(i).
– If ||Ω(i) - Ω|| < threshold, it is a face.
Application 2: Face recognition using PCA
• For a database of K persons:
– For each person j: use many samples of that person's face to train up an eigenface representation Ω_j. E.g. use M = 30 samples of person j to train up Ω_j.
– Same for j = 1, 2, ..., K persons.
• Unknown face → eigenface (biggest 5 eigenvectors) representation Ω_un.
• Test loop for j' = 1, ..., K:
– Select the smallest ||Ω_un - Ω_j'||; then this is face j'.
References• Matlab code
– “Eigen Face Recognition” by Vinay kumar Reddy http://www.mathworks.com/matlabcentral/fileexchange/38268-eigen-face-recognition
– Eigenface Tutorial– http://www.pages.drexel.edu/~sis26/Eigenface%20Tutorial.htm
• Reading list– [Bebis] Face Recognition Using Eigenfaces www.cse.unr.edu/~bebis/CS485/Lectures/Eigenfaces.ppt
– [Turk 91] Turk and Pentland, "Eigenfaces for recognition", Journal of Cognitive Neuroscience, 3(1), pp. 71-86, 1991.
– [smith 2002] LI Smith , "A tutorial on Principal Components Analysis”, http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
– [Shlens 2005] Jonathon Shlens , “ A tutorial on Principal Component Analysis”, http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
– [AI Access] http://www.aiaccess.net/English/Glossaries/GlosMod/e_gm_covariance_matrix.htm– http://www.pages.drexel.edu/~sis26/Eigenface%20Tutorial.htm
Appendix: pca_test1.m (cut and paste to matlab to run)
%pca_test1.m, example using data in [smith 2002] LI Smith, %matlab by khwong
%"A tutorial on Principal Components Analysis"
%www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
%---------Step1--get some data------------------
function test
x=[2.5 0.5 2.2 1.9 3.1 2.3 2 1 1.5 1.1]' %column vector
y=[2.4 0.7 2.9 2.2 3.0 2.7 1.6 1.1 1.6 0.9]' %column vector
N=length(x)
%---------Step2--subtract the mean------------------
mean_x=mean(x), mean_y=mean(y)
x_adj=x-mean_x, y_adj=y-mean_y %data adjust for x,y
%---------Step3---cal. covariance matrix----------------
data_adj=[x_adj,y_adj]
cov_x=cov(data_adj)
%---------Step4---cal. eigenvectors and eigenvalues of cov_x-------------
[eigvect,eigval]=eig(cov_x)
eigval_1=eigval(1,1), eigval_2=eigval(2,2)
eigvect_1=eigvect(:,1), eigvect_2=eigvect(:,2)
%eigvector1_length is 1, so the eigenvector is a unit vector
eigvector1_length=sqrt(eigvect_1(1)^2+eigvect_1(2)^2)
eigvector2_length=sqrt(eigvect_2(1)^2+eigvect_2(2)^2)
%sorted, big eigen_vect (big eigval first)
P_full=[eigvect_2';eigvect_1'] %1st eigenvector is small, 2nd is large
P_approx=[eigvect_2';[0,0]] %keep (2nd) big eig vec only, small gone
figure(1), clf, hold on
plot(-1,-1) %create the same diagram as in fig.3.1 of [smith 2002]
plot(4,4), plot([-1,4],[0,0],'-'), plot([0,0],[-1,4],'-')
hold on
title('PCA demo')
%step5: select feature
%eigenvectors: length of each eigenvector is proportional to its eigenvalue
plot([0,eigvect(1,1)*eigval_1],[0,eigvect(2,1)*eigval_1],'b-') %1stVec
plot([0,eigvect(1,2)*eigval_2],[0,eigvect(2,2)*eigval_2],'r-') %2ndVec
title('eigenvector 2 (red) is much longer (bigger eigenvalue), so keep it')
plot(x,y,'bo') %original data
%%full %%%%%%%%%%%%%%%%%%%%%%%%
final_data_full=P_full*data_adj'
recovered_data_full=P_full'*final_data_full+repmat([mean_x;mean_y],1,N)
plot(recovered_data_full(1,:),recovered_data_full(2,:),'r+')
%%approx %%%%%%%%%%%%%%%%
final_data_approx=P_full*data_adj'
recovered_data_approx=P_approx'*final_data_approx+repmat([mean_x;mean_y],1,N)
plot(recovered_data_approx(1,:),recovered_data_approx(2,:),'gs')
Result of pca_test1.m
Green squares are the compressed data, using eigenvector 2 as the only basis (axis).
A short proof of PCA (principal component analysis)
• This proof is not rigorous; the detailed proof can be found in [Shlens 2005].
• Objective:
– We have an input data set X with zero mean and would like to transform X into Y (Y = PX, where P is the transformation) in a coordinate system in which Y varies more along its principal (or major) components than along the other components.
• E.g. X is in a 2-dimensional space (x1, x2); after transforming X into Y (coordinates y1, y2), the data in Y vary mainly along the y1-axis and little along the y2-axis.
• That is to say, we want to find P so that the covariance of Y (cov_Y = [1/(n-1)]YY^T) is a diagonal matrix: a diagonal matrix has elements only on its diagonal, and it shows that the coordinates of Y have no correlation. n-1 is used for normalization.
Continue
• Given X (with zero mean), we want to show that for Y = PX, if each row p_i of P is an eigenvector of XX^T, then the covariance of Y is a diagonal matrix.
• Proof:
• For the covariance matrix (cov_Y) of Y:
• cov_Y = [1/(n-1)]YY^T (n is a normalization factor, the length of X or Y); put Y = PX:
• cov_Y = [1/(n-1)](PX)(PX)^T = [1/(n-1)](PX)X^TP^T = [1/(n-1)]P(XX^T)P^T ------(1)
• From theorems 3 and 4 in the appendix of [Shlens 2005]:
– For XX^T = P^T D P, if each row p_i of P is an eigenvector of XX^T, then D is a diagonal matrix. Put this into (1):
• cov_Y = [1/(n-1)]P(P^T D P)P^T = [1/(n-1)]D, since PP^T = I.
• So the covariance of Y is a diagonal matrix, meaning that the coordinates of Y have no correlation.
• We showed that if each row p_i of P is an eigenvector of XX^T, the covariance of Y is a diagonal matrix. (Done!)
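The diagonal-covariance claim can be confirmed numerically with the deck's 2-D example data (a sketch with illustrative names; cov(X) = XX^T/(n-1) has the same eigenvectors as XX^T):

```python
import numpy as np

x = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])
y = np.array([2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9])
X = np.column_stack([x - x.mean(), y - y.mean()]).T   # zero-mean, 2 x n

eigval, eigvec = np.linalg.eigh(np.cov(X))
P = eigvec.T            # each row p_i of P is an eigenvector of cov(X)

Y = P @ X
cov_Y = np.cov(Y)       # should be diagonal, with the eigenvalues on the diagonal

off_diag = abs(cov_Y[0, 1])   # ~0: the coordinates of Y are uncorrelated
```

The off-diagonal entry vanishes (up to floating point), and the diagonal of cov_Y equals the eigenvalues, exactly as the proof predicts.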
Appendix
Covariance i,j and covariance matrix (reference: http://en.wikipedia.org/wiki/Covariance_matrix)
X = [X_1, X_2, ..., X_n]^T
Σ_ij = cov(X_i, X_j) = E[(X_i - μ_i)(X_j - μ_j)], where μ_i = E(X_i) (expected value = mean)
Σ = [ E[(X_1-μ_1)(X_1-μ_1)]   E[(X_1-μ_1)(X_2-μ_2)]   ...   E[(X_1-μ_1)(X_n-μ_n)]
      E[(X_2-μ_2)(X_1-μ_1)]   E[(X_2-μ_2)(X_2-μ_2)]   ...   E[(X_2-μ_2)(X_n-μ_n)]
      :                        :                              :
      E[(X_n-μ_n)(X_1-μ_1)]   E[(X_n-μ_n)(X_2-μ_2)]   ...   E[(X_n-μ_n)(X_n-μ_n)] ]
Appendix: test_cov.m
%test_cov
clear
x=[2.5 0.5 2.2 1.9 3.1 2.3 2 1 1.5 1.1]'
y=[2.4 0.7 2.9 2.2 3.0 2.7 1.6 1.1 1.6 0.9]'
x_adj=x-mean(x), y_adj=y-mean(y)
cov_xy=0, cov_xx=0, cov_yy=0, N=length(x)
for i=1:N
  cov_xx=x_adj(i)*x_adj(i)+cov_xx;
  cov_xy=x_adj(i)*y_adj(i)+cov_xy;
  cov_yy=y_adj(i)*y_adj(i)+cov_yy;
end
'direct calculate'
c_N_minus_1_as_denon=[cov_xx/(N-1) cov_xy/(N-1); cov_xy/(N-1) cov_yy/(N-1)]
c_N_as_denon=[cov_xx/N cov_xy/N; cov_xy/N cov_yy/N]
'using matlab cov function to calculate'
matlab_cov_xy=cov(x,y)
matlab_cov_xy1=cov(x,y,1)
Result:
matlab_cov_xy =
    0.6166    0.6154
    0.6154    0.7166
matlab_cov_xy1 =
    0.5549    0.5539
    0.5539    0.6449
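The same numbers can be reproduced outside MATLAB. A sketch in Python/NumPy: `np.cov(..., ddof=1)` divides by N-1 like MATLAB's `cov(x,y)`, and `ddof=0` divides by N like `cov(x,y,1)`:

```python
import numpy as np

x = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])
y = np.array([2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9])

c_n_minus_1 = np.cov(x, y, ddof=1)  # denominator N-1, like cov(x,y)
c_n = np.cov(x, y, ddof=0)          # denominator N, like cov(x,y,1)
print(np.round(c_n_minus_1, 4))     # [[0.6166 0.6154] [0.6154 0.7166]]
print(np.round(c_n, 4))             # [[0.5549 0.5539] [0.5539 0.6449]]
```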
Appendix: Eigenvalue and eigenvector
• A_{N×N} X_{N×1} = λ_{1×1} X_{N×1}
• A is a square matrix, X is a vector, λ is a scalar (the eigenvalue)
• A is a transformation that does not change the direction of X but only its length
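A minimal check of the definition A·X = λ·X (a Python/NumPy sketch; the matrix A here is an arbitrary example, not from the slides):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, vecs = np.linalg.eig(A)   # eigenvalues, eigenvectors (as columns)
for lam, v in zip(lams, vecs.T):
    # A transforms each eigenvector v without changing its direction,
    # only scaling its length by the eigenvalue lam
    assert np.allclose(A @ v, lam * v)
print(np.sort(lams))            # the eigenvalues of this A are 1 and 3
```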
Exercise
• Find the PCA of
• Original data x01=[2.5 0.5 2.2 1.9]'
• Original data x02=[2.4 0.7 2.9 2.2]'
• Answer:
• Number of samples N=4
• Mean of x01 = (2.5+0.5+2.2+1.9)/4 = 1.775
• Mean of x02 = (2.4+0.7+2.9+2.2)/4 = 2.05
• Mean-adjusted data:
• x1=[2.5-1.775 0.5-1.775 2.2-1.775 1.9-1.775]'
• x2=[2.4-2.05 0.7-2.05 2.9-2.05 2.2-2.05]'
• x1=[0.7250 -1.2750 0.4250 0.1250]'
• x2=[0.3500 -1.3500 0.8500 0.1500]'
Continue
• Mean-adjusted data:
• Data4x2 = [x1 x2], so Data4x2' = [0.7250 -1.2750 0.4250 0.1250
                                    0.3500 -1.3500 0.8500 0.1500]
• and N-1 = 3
• Covariance matrix of the mean-adjusted data:
• c11 = (0.725^2+1.275^2+0.425^2+0.125^2)/3 = 0.7825
• c12 = (0.725*0.35+1.275*1.35+0.425*0.85+0.125*0.15)/3 = 0.7850
• c21 = (0.725*0.35+1.275*1.35+0.425*0.85+0.125*0.15)/3 = 0.7850
• c22 = (0.35^2+1.35^2+0.85^2+0.15^2)/3 = 0.8967
• C = COV[Data] = [0.7825 0.7850
                   0.7850 0.8967]
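The hand calculation above can be checked mechanically. A Python/NumPy sketch (the variable names are my own):

```python
import numpy as np

x01 = np.array([2.5, 0.5, 2.2, 1.9])
x02 = np.array([2.4, 0.7, 2.9, 2.2])
x1 = x01 - x01.mean()          # mean-adjusted data
x2 = x02 - x02.mean()
N = len(x01)

data = np.stack([x1, x2])      # 2 x N, rows are the variables
C = data @ data.T / (N - 1)    # covariance with denominator N-1 = 3
print(np.round(C, 4))          # [[0.7825 0.785 ] [0.785  0.8967]]
```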
$$\text{covariance matrix } C = \mathrm{COV}[Data] = \begin{bmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \end{bmatrix} = \begin{bmatrix} \dfrac{\sum_{i=1}^{N}(x_{1,i}-\bar X_1)(x_{1,i}-\bar X_1)}{N-1} & \dfrac{\sum_{i=1}^{N}(x_{1,i}-\bar X_1)(x_{2,i}-\bar X_2)}{N-1} \\ \dfrac{\sum_{i=1}^{N}(x_{2,i}-\bar X_2)(x_{1,i}-\bar X_1)}{N-1} & \dfrac{\sum_{i=1}^{N}(x_{2,i}-\bar X_2)(x_{2,i}-\bar X_2)}{N-1} \end{bmatrix}$$
Continue
• Eigen(C): [v,e] = eig(C)
• v =
•   -0.7323   0.6810
•    0.6810   0.7323
• e =
•    0.0525        0
•         0   1.6267
• Which eigenvector is the principal axis?
• The second one, [0.6810 0.7323]', since the second eigenvalue 1.6267 is bigger.
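The eigen-decomposition and the selection of the principal axis can be done automatically. A Python/NumPy sketch (`np.linalg.eigh` returns eigenvalues in ascending order, and the eigenvector sign may differ from MATLAB's):

```python
import numpy as np

C = np.array([[0.7825, 0.7850],
              [0.7850, 0.8967]])
eigvals, eigvecs = np.linalg.eigh(C)  # ascending eigenvalues
principal_val = eigvals[-1]           # largest eigenvalue, ~1.6267
principal_axis = eigvecs[:, -1]       # its eigenvector, ~[0.6810 0.7323] up to sign
print(round(float(principal_val), 4))
print(np.round(principal_axis, 4))
```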
Test_ex1.m

%pca_test1.m, example using data in [Smith 2002] LI Smith, matlab by khwong
%"A tutorial on Principal Components Analysis",
%www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
%---------Step1--get some data------------------
function test
x=[2.5 0.5 2.2 1.9]' %column vector
y=[2.4 0.7 2.9 2.2]' %column vector
N=length(x)
%---------Step2--subtract the mean------------------
mean_x=mean(x), mean_y=mean(y)
x_adj=x-mean_x, y_adj=y-mean_y %data adjusted for x,y
%---------Step3---cal. covariance matrix----------------
data_adj=[x_adj,y_adj]
cov_x=cov(data_adj)
%---------Step4---cal. eigenvectors and eigenvalues of cov_x-------------
[eigvect,eigval]=eig(cov_x)
eigval_1=eigval(1,1), eigval_2=eigval(2,2)
eigvect_1=eigvect(:,1), eigvect_2=eigvect(:,2)

%eigvector1_length is 1, so the eigenvector is a unit vector
eigvector1_length=sqrt(eigvect_1(1)^2+eigvect_1(2)^2)
eigvector2_length=sqrt(eigvect_2(1)^2+eigvect_2(2)^2)

%sorted, big eigenvector (big eigenvalue) first
%P_full=[eigvect(1,2),eigvect(2,2);eigvect(1,1),eigvect(2,1)]
P_full=[eigvect_2';eigvect_1'] %1st eigenvector is small, 2nd is large
P_approx=[eigvect_2';[0,0]] %keep (2nd) big eig vec only, small gone
figure(1), clf, hold on
plot(-1,-1) %create the same diagram as in fig.3.1 of [Smith 2002]
plot(4,4), plot([-1,4],[0,0],'-'), plot([0,0],[-1,4],'-')
hold on
title('PCA demo')
%step5: select feature
%eigenvectors: length of the eigenvector is proportional to its eigenvalue
plot([0,eigvect(1,1)*eigval_1],[0,eigvect(2,1)*eigval_1],'b-') %1stVec
plot([0,eigvect(1,2)*eigval_2],[0,eigvect(2,2)*eigval_2],'r-') %2ndVec
title('eigenvector 2 (red) is much longer (bigger eigenvalue), so keep it')

plot(x,y,'bo') %original data
%%full %%%%%%%%%%%%%%%%
%recovered_data_full=P_full*data_adj+repmat([mean_x;mean_y],1,N)
final_data_full=P_full*data_adj'
recovered_data_full=P_full'*final_data_full+repmat([mean_x;mean_y],1,N)
%recovered_data_full=P_full*data_adj'+repmat([mean_x;mean_y],1,N)
plot(recovered_data_full(1,:),recovered_data_full(2,:),'r+')

%%approx %%%%%%%%%%%%%%%%
%recovered_data_approx=P_approx*data_adj+repmat([mean_x;mean_y],1,N)
final_data_approx=P_full*data_adj'
recovered_data_approx=P_approx'*final_data_approx+repmat([mean_x;mean_y],1,N)
plot(recovered_data_approx(1,:),recovered_data_approx(2,:),'gs')

'test--final_data_full*final_data_full^T'
final_data_full
final_data_full*final_data_full'
'test--final_data_approx*final_data_approx^T'
final_data_approx
final_data_approx*final_data_approx'
'test--data_adj^T*data_adj'
data_adj'
data_adj'*data_adj
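The core of the script (project the mean-adjusted data onto the eigenvector basis, then reconstruct, optionally discarding the minor axis) can be sketched in Python/NumPy; the variable names mirror the MATLAB but are my own:

```python
import numpy as np

x = np.array([2.5, 0.5, 2.2, 1.9])
y = np.array([2.4, 0.7, 2.9, 2.2])
mean = np.array([[x.mean()], [y.mean()]])
data_adj = np.stack([x, y]) - mean         # 2 x N, mean removed

eigvals, eigvecs = np.linalg.eigh(np.cov(data_adj, ddof=1))
P_full = eigvecs[:, ::-1].T                # rows = eigenvectors, largest eigenvalue first

final_full = P_full @ data_adj             # project onto both axes
recovered_full = P_full.T @ final_full + mean
assert np.allclose(recovered_full, np.stack([x, y]))  # full basis: exact recovery

P_approx = P_full.copy()
P_approx[1, :] = 0                         # drop the minor axis (small eigenvalue)
recovered_approx = P_approx.T @ (P_approx @ data_adj) + mean
print(np.round(recovered_approx, 4))       # approximation along the principal axis only
```

With the full orthonormal basis the reconstruction is exact; zeroing the minor eigenvector gives the approximate (green-square) reconstruction the script plots.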
PCA numerical example (pca_test1.m, in appendix): Step 1
• Create the testing data
Xo1 = [2.5 0.5 2.2 1.9 3.1 2.3 2.0 1.0 1.5 1.1]'
xo2 = [2.4 0.7 2.9 2.2 3.0 2.7 1.6 1.1 1.6 0.9]'
[Scatter plot of the data with axes x1 and x2: the mean is NOT (0,0)]
Example step 2a (C=2 dimensions): adjust the data, find the covariance matrix
Step 2:
X_data_adj = X = Xo - mean(Xo) =
[ 0.6900  0.4900
 -1.3100 -1.2100
  0.3900  0.9900
  0.0900  0.2900
  1.2900  1.0900
  0.4900  0.7900
  0.1900 -0.3100
 -0.8100 -0.8100
 -0.3100 -0.3100
 -0.7100 -1.0100]
$$\text{covariance matrix} = \begin{bmatrix} \dfrac{\sum_{i=1}^{N}(x_{1,i}-\bar X_1)(x_{1,i}-\bar X_1)}{N-1} & \dfrac{\sum_{i=1}^{N}(x_{1,i}-\bar X_1)(x_{2,i}-\bar X_2)}{N-1} \\ \dfrac{\sum_{i=1}^{N}(x_{2,i}-\bar X_2)(x_{1,i}-\bar X_1)}{N-1} & \dfrac{\sum_{i=1}^{N}(x_{2,i}-\bar X_2)(x_{2,i}-\bar X_2)}{N-1} \end{bmatrix}$$

The entry being computed here is $c_{11} = \sum_{i=1}^{N}(x_{1,i}-\bar X_1)(x_{1,i}-\bar X_1)/(N-1)$.
c11 = (0.69*0.69+(-1.31)*(-1.31)+0.39*0.39+0.09*0.09+1.29*1.29+0.49*0.49+0.19*0.19+(-0.81)*(-0.81)+(-0.31)*(-0.31)+(-0.71)*(-0.71))/(10-1) = 0.6166
(columns X1, X2; the mean is adjusted to (0,0))
Example step 2b
Step 2: X_data_adj = X = Xo - mean(Xo), as listed in step 2a.
The same covariance matrix formula, now with the off-diagonal entry highlighted: $c_{12} = c_{21} = \sum_{i=1}^{N}(x_{1,i}-\bar X_1)(x_{2,i}-\bar X_2)/(N-1)$.
c12 = (0.69*0.49+(-1.31)*(-1.21)+0.39*0.99+0.09*0.29+1.29*1.09+0.49*0.79+0.19*(-0.31)+(-0.81)*(-0.81)+(-0.31)*(-0.31)+(-0.71)*(-1.01))/(10-1) = 0.6154
(columns X1, X2; the mean is adjusted to (0,0))
Example step 2c
Step 2: X_data_adj = X = Xo - mean(Xo), as listed in step 2a.
The same covariance matrix formula, now with the second diagonal entry highlighted: $c_{22} = \sum_{i=1}^{N}(x_{2,i}-\bar X_2)(x_{2,i}-\bar X_2)/(N-1)$.
c22 = (0.49*0.49+(-1.21)*(-1.21)+0.99*0.99+0.29*0.29+1.09*1.09+0.79*0.79+(-0.31)*(-0.31)+(-0.81)*(-0.81)+(-0.31)*(-0.31)+(-1.01)*(-1.01))/(10-1) = 0.7166
(columns X1, X2; the mean is adjusted to (0,0))
Hence the covariance matrix of X is cov_x (2 rows by 2 columns) =
[0.6166 0.6154
 0.6154 0.7166]
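With cov_x assembled, the next step in the script is its eigen-decomposition. A Python/NumPy sketch; for this data set the values agree with [Smith 2002]: eigenvalues ≈ 0.0491 and 1.2840, principal axis ≈ [0.6779 0.7352] (up to sign):

```python
import numpy as np

cov_x = np.array([[0.6166, 0.6154],
                  [0.6154, 0.7166]])
eigvals, eigvecs = np.linalg.eigh(cov_x)  # eigenvalues in ascending order
print(np.round(eigvals, 4))               # minor then principal eigenvalue
print(np.round(eigvecs[:, -1], 4))        # principal axis (up to sign)
```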
Class Exercise 2
• %covariance test
• x=[1 3 -2 -2]'
• y=[4 6 -3 -7]'
• mean(x)=0 and mean(y)=0, to simplify the calculation. Find c = cov(x,y).
Answer: Exercise 2
• x=[1 3 -2 -2]'
• y=[4 6 -3 -7]'
• mean(x)=0 and mean(y)=0, so no mean adjustment is needed. With N-1 = 3:
• Cov(x,y) = (1/3) *
• [(1^2+3^2+(-2)^2+(-2)^2)          (1*4+3*6+(-2)*(-3)+(-2)*(-7))
•  (1*4+3*6+(-2)*(-3)+(-2)*(-7))   (4^2+6^2+(-3)^2+(-7)^2)]
• = (1/3)*[18  42
•          42 110]
• = [ 6      14
•    14  36.6667]
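The answer can be verified in one line. A Python/NumPy sketch (`ddof=1` gives the N-1 = 3 denominator used above):

```python
import numpy as np

x = np.array([1.0, 3.0, -2.0, -2.0])   # already zero-mean
y = np.array([4.0, 6.0, -3.0, -7.0])   # already zero-mean
C = np.cov(x, y, ddof=1)               # divide by N-1 = 3
print(np.round(C, 4))                  # [[ 6. 14.] [14. 36.6667]]
```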