Shape Analysis and Retrieval
Statistical Shape Descriptors
Notes courtesy of Funk et al., SIGGRAPH 2004
Outline:
• Shape Descriptors
• Statistical Shape Descriptors
• Singular Value Decomposition (SVD)
Shape Matching
General approach:
Define a function D that takes in two models and returns a measure of their proximity:

    D(M1, M2) < D(M1, M3)

means that M1 is closer to M2 than it is to M3.
Shape Descriptors
Shape Descriptor:
A structured abstraction of a 3D model that is well suited to the challenges of shape matching.
[Figure: 3D models mapped to descriptors; proximity D(·,·) is computed between the descriptors]
Matching with Descriptors
Preprocessing:
– Compute database descriptors
Run-Time:
– Compute query descriptor
– Compare query descriptor to database descriptors
– Return best match(es)
[Figure: 3D query shape → query descriptor, compared against the 3D database descriptors to find the best matches]
Shape Matching Challenge
Need a shape descriptor that is:
– Concise to store
– Quick to compute
– Efficient to match
– Discriminating
– Invariant to transformations (translation, scale, rotation, mirror)
– Invariant to deformations (different articulated poses)
– Insensitive to noise (e.g., scanned surfaces; image courtesy of Ramamoorthi et al.)
– Insensitive to topology (different tessellations, different genus; images courtesy of Viewpoint & Stanford)
– Robust to degeneracies (e.g., a model with no bottom, or degenerate geometry; images courtesy of Utah & De Espona)
Outline:
• Shape Descriptors
• Statistical Shape Descriptors
• Singular Value Decomposition (SVD)
Statistical Shape Descriptors
Challenge:
Want a simple shape descriptor that is easy to compare and gives a continuous measure of the similarity between two models.
Solution:
Represent each model by a vector and define the distance between models as the distance between the corresponding vectors.
Statistical Shape Descriptors
Properties:
– Structured representation
– Easy to compare
– Generalizes the matching problem
Models are represented as points in a fixed-dimensional vector space.
Statistical Shape Descriptors
General Approaches:
– Retrieval
– Clustering
– Compression
– Hierarchical representation
Models are represented as points in a fixed-dimensional vector space.
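The vector-space idea above can be sketched in a few lines of numpy. The descriptor values and model names here are hypothetical, purely for illustration; any fixed-dimensional feature vector would do.

```python
import numpy as np

# Hypothetical fixed-dimensional descriptors, one vector per model.
descriptors = {
    "chair": np.array([0.9, 0.1, 0.4]),
    "stool": np.array([0.8, 0.2, 0.5]),
    "plane": np.array([0.1, 0.9, 0.2]),
}

def descriptor_distance(a, b):
    """Distance between two models = Euclidean distance between their vectors."""
    return float(np.linalg.norm(descriptors[a] - descriptors[b]))

# The chair is closer to the stool than to the plane.
assert descriptor_distance("chair", "stool") < descriptor_distance("chair", "plane")
```

Because all models live in the same vector space, the same representation supports retrieval, clustering, and compression without further machinery.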
Outline:
• Shape Descriptors
• Statistical Shape Descriptors
• Singular Value Decomposition (SVD)
Complexity of Retrieval
Given a query Q:
• Compute the distance D(Q, Mi) to each database model
• Sort the database models by proximity
• Return the closest matches
[Figure: query Q compared against database models M1,…,Mk; after sorting, D(Q, M̃i) ≤ D(Q, M̃j) whenever i ≤ j, and the best matches are returned]
Complexity of Retrieval
If there are k models in the database and each model is represented by an n-dimensional vector:
• Computing the distance to each database model: O(k·n) time
• Sorting the database models by proximity: O(k·log k) time
If n is large, retrieval will be prohibitively slow.
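The brute-force retrieval described above can be sketched as follows; the toy database values are made up for illustration.

```python
import numpy as np

def retrieve(query, database, num_matches=2):
    """Brute-force retrieval: O(k*n) distance computation, O(k log k) sort.

    query:    (n,) descriptor vector
    database: (k, n) matrix, one descriptor per row
    Returns the indices of the closest models, nearest first.
    """
    dists = np.linalg.norm(database - query, axis=1)  # O(k*n)
    order = np.argsort(dists)                         # O(k log k)
    return order[:num_matches]

# Toy database of k=4 models with n=3-dimensional descriptors.
db = np.array([[1.0, 0.0, 0.0],
               [0.9, 0.1, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])
q = np.array([1.0, 0.05, 0.0])
assert list(retrieve(q, db)) == [0, 1]  # models 0 and 1 are the best matches
```

The O(k·n) term dominates for high-dimensional descriptors, which is what motivates the SVD-based compression later in these notes.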
Algebra
Definition: Given a vector space V and a subspace W ⊆ V, the projection onto W, written π_W, is the map that sends v ∈ V to the nearest vector in W.
If {w1,…,wm} is an orthonormal basis for W, then:

    π_W(v) = Σ_{i=1}^m ⟨v, wi⟩·wi
Tensor Algebra
Definition: The inner product of two n-dimensional vectors v = {v1,…,vn} and w = {w1,…,wn}, written ⟨v, w⟩, is the scalar value defined by:

    ⟨v, w⟩ = Σ_{i=1}^n vi·wi = vᵗ·w
Tensor Algebra
Definition: The outer product of two n-dimensional vectors v = {v1,…,vn} and w = {w1,…,wn}, written v⊗w, is the matrix defined by:

    v⊗w = v·wᵗ    (the n×n matrix with entries (v⊗w)_{ij} = vi·wj)
Tensor Algebra
Definition: The transpose of an m×n matrix M, written Mᵗ, is the n×m matrix with:

    (Mᵗ)_{i,j} = M_{j,i}

Property: For any two vectors v and w, the transpose has the property:

    ⟨M·v, w⟩ = ⟨v, Mᵗ·w⟩
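The three definitions above map directly onto numpy operations; a small sanity check (with arbitrary example vectors and matrix):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Inner product <v, w> = sum_i v_i * w_i = v^t w  (a scalar).
assert np.inner(v, w) == 1*4 + 2*5 + 3*6  # 32

# Outer product v (x) w = v w^t  (an n x n matrix with entries v_i * w_j).
outer = np.outer(v, w)
assert outer[1, 2] == v[1] * w[2]  # 2 * 6 = 12

# Transpose property: <Mv, w> = <v, M^t w> for any matrix M.
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
assert np.isclose(np.inner(M @ v, w), np.inner(v, M.T @ w))
```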
SVD Compression
Key Idea:
Given a collection of vectors in n-dimensional space, find a good m-dimensional subspace (m ≪ n) in which to represent the vectors.
Specifically:
If P = {p1,…,pk} is the initial n-dimensional point set, and {w1,…,wm} is an orthonormal basis for the m-dimensional subspace, we will compress the point set by sending:

    p ↦ (⟨p, w1⟩, …, ⟨p, wm⟩)
SVD Compression
Challenge:
Find the m-dimensional subspace that best captures the information in the initial point set.
Variance of the Point Set
Given a collection of points P = {p1,…,pk} in an n-dimensional vector space, determine how the vectors are distributed across different directions.
Define Var_P as the function:

    Var_P(v) = Σ_{i=1}^k ‖π_v(pi)‖² = Σ_{i=1}^k ⟨v, pi⟩²

giving the variance of the point set P in direction v (assume ‖v‖ = 1).
[Figure: points p1, p2, …, pk scattered about the origin]
Variance of the Point Set
More generally, for a subspace W ⊆ V, define the variance of P in the subspace W as:

    Var_P(W) = Σ_{i=1}^k ‖π_W(pi)‖²

If {w1,…,wm} is an orthonormal basis for W, then:

    Var_P(W) = Σ_{i=1}^k Σ_{j=1}^m ⟨pi, wj⟩² = Σ_{j=1}^m Var_P(wj)
Variance of the Point Set
Example:
The variance in the direction v1 is large, while the variance in the direction v2 is small.
If we want to compress down to one dimension, we should project the points onto v1.
[Figure: points p1,…,pk forming an elongated cloud, with v1 along the long axis and v2 perpendicular to it]
Covariance Matrix
Definition: The covariance matrix M_P of a point set P = {p1,…,pk} is the symmetric matrix which is the sum of the outer products of the pi:

    M_P = Σ_{i=1}^k pi·piᵗ
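In numpy, the sum of outer products collapses to a single matrix product, since Σᵢ pi·piᵗ = Pᵗ·P when the points are stored as rows of P. A quick check with an arbitrary toy point set:

```python
import numpy as np

def covariance_matrix(P):
    """Covariance matrix M_P = sum_i p_i p_i^t for points stored as rows of P."""
    # The sum of outer products equals P^t P.
    return P.T @ P

# k=3 points in n=2 dimensions (one point per row).
P = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 1.0]])
M_P = covariance_matrix(P)

# Check against the explicit sum of outer products.
explicit = sum(np.outer(p, p) for p in P)
assert np.allclose(M_P, explicit)
assert np.allclose(M_P, M_P.T)  # symmetric by construction
```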
Covariance Matrix
Theorem: The variance of the point set P in a direction v is equal to:

    Var_P(v) = Σ_{i=1}^k ⟨pi, v⟩² = vᵗ·M_P·v
Covariance Matrix
Theorem: The variance of the point set P in a direction v is equal to:

    Var_P(v) = Σ_{i=1}^k ⟨pi, v⟩² = vᵗ·M_P·v

Proof:

    vᵗ·M_P·v = vᵗ·(Σ_{i=1}^k pi·piᵗ)·v
             = Σ_{i=1}^k vᵗ·pi·piᵗ·v
             = Σ_{i=1}^k (piᵗ·v)ᵗ·(piᵗ·v)
             = Σ_{i=1}^k ⟨pi, v⟩²
             = Var_P(v)
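The identity Var_P(v) = vᵗ·M_P·v is easy to verify numerically on random points (the point set and direction below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((50, 3))   # k=50 points in n=3 dimensions, one per row
M_P = P.T @ P                      # covariance matrix = sum of outer products
v = np.array([1.0, 0.0, 0.0])      # a unit direction

# Var_P(v) computed directly as sum_i <p_i, v>^2 ...
var_direct = np.sum((P @ v) ** 2)
# ... equals the quadratic form v^t M_P v.
var_quadratic = v @ M_P @ v
assert np.isclose(var_direct, var_quadratic)
```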
Singular Value Decomposition
Theorem: Every symmetric matrix M can be written out as the product:

    M = O·D·Oᵗ

where O is a rotation/reflection matrix (O·Oᵗ = Id) and D is a diagonal matrix with the property:

    D = diag(λ1, λ2, …, λn)    with    λ1 ≥ λ2 ≥ … ≥ λn
Singular Value Decomposition
Implication:
Given a point set P, we can compute the covariance matrix of the point set, M_P, and express the matrix in terms of its SVD factorization:

    M_P = [v1 … vn]·diag(λ1, …, λn)·[v1 … vn]ᵗ    with    λ1 ≥ λ2 ≥ … ≥ λn

where {v1,…,vn} is an orthonormal basis and λi is the variance of the point set in direction vi.
Singular Value Decomposition
Compression:
The vector subspace spanned by {v1,…,vm} is the m-dimensional subspace that maximizes the variance of the initial point set P.
If m is too small, then too much information is discarded and there will be a loss in retrieval accuracy.
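The compression step can be sketched with numpy's symmetric eigensolver (the synthetic point set below, with most of its variance in the first two coordinates, is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# k=200 points in n=10 dimensions; most variance lies in the first 2 coordinates.
P = rng.standard_normal((200, 10)) * np.array([5.0, 3.0] + [0.1] * 8)

M_P = P.T @ P                       # covariance matrix
lam, V = np.linalg.eigh(M_P)        # eigh returns ascending eigenvalues
lam, V = lam[::-1], V[:, ::-1]      # reorder so lambda_1 >= ... >= lambda_n

m = 2
W = V[:, :m]            # top-m eigenvectors span the max-variance subspace
compressed = P @ W      # p -> (<p, w_1>, ..., <p, w_m>)

# The retained subspace captures nearly all of the variance.
assert lam[:m].sum() / lam.sum() > 0.95
assert compressed.shape == (200, m)
```

Because M_P is symmetric, `eigh` gives exactly the orthonormal basis {v1,…,vn} and variances λi from the factorization above.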
Singular Value Decomposition
Hierarchical Matching:
First coarsely compare the query to the database vectors.
• If the query is coarsely similar to the target:
  – Refine the comparison
• Else:
  – Do not refine
O(k·n) matching becomes O(k·m) with m ≪ n and no loss of retrieval accuracy.
Singular Value Decomposition
Hierarchical Matching:
SVD expresses the initial vectors in terms of the eigenbasis:

    p = ⟨p, v1⟩·v1 + … + ⟨p, vn⟩·vn

Because there is more variance in v1 than in v2, more variance in v2 than in v3, etc., this gives a hierarchical representation of the data, so that coarse comparisons can be performed by comparing only the first m coefficients.
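A minimal sketch of this coarse-to-fine comparison, assuming descriptors already rotated into the eigenbasis (the vectors and pruning threshold below are illustrative assumptions):

```python
import numpy as np

def hierarchical_distance(q, p, threshold, m=4):
    """Coarse-to-fine squared-distance comparison in the eigenbasis.

    q, p are descriptors whose coefficients are ordered by decreasing
    variance. Compare the first m coefficients; refine with the remaining
    ones only if the coarse distance is below the pruning threshold.
    The coarse distance is a lower bound on the full squared distance,
    so pruning loses no retrieval accuracy.
    """
    coarse = np.sum((q[:m] - p[:m]) ** 2)
    if coarse >= threshold:
        return coarse, False            # pruned after only m coefficients
    fine = coarse + np.sum((q[m:] - p[m:]) ** 2)
    return fine, True

q      = np.array([1.0, 0.5, 0.2, 0.1, 0.05, 0.01])
p_far  = np.array([-1.0, -0.5, 0.2, 0.1, 0.0, 0.0])
p_near = np.array([0.9, 0.5, 0.2, 0.1, 0.0, 0.0])

d_far, refined_far = hierarchical_distance(q, p_far, threshold=1.0)
d_near, refined_near = hierarchical_distance(q, p_near, threshold=1.0)
assert not refined_far   # the distant model is rejected coarsely
assert refined_near      # the nearby model gets the full comparison
```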
Efficient to match?
Preprocessing:
1. Compute SVD factorization
2. Transform database descriptors into the eigenbasis
Run-Time:
3. Transform query into the eigenbasis
Efficient to match?
Run-Time (continued):
4. Low-resolution sort: compute coarse distances from the query to every database descriptor and sort
5. Update closest matches: refine the distance to the current front-runner using more coefficients
6. Resort, and repeat steps 5–6 until the closest matches are resolved
[Figure: animation in which coarse distances to the query (e.g., 0.040, 0.052, 0.103, 0.430, 0.661) are progressively refined and the sorted order updated]
Singular Value Decomposition
Theorem: Every symmetric matrix M can be written out as the product:

    M = O·D·Oᵗ

where O is a rotation/reflection matrix (O·Oᵗ = Id) and D is a diagonal matrix with the property:

    D = diag(λ1, λ2, …, λn)    with    λ1 ≥ λ2 ≥ … ≥ λn
Singular Value Decomposition
Proof:
1. Every symmetric matrix has at least one real eigenvector v.
2. If v is an eigenvector and w is perpendicular to v, then M·w is also perpendicular to v.
Since M maps the subspace of vectors perpendicular to v back into itself, we can look at the restriction of M to that subspace and iterate to get the next eigenvector.
Singular Value Decomposition
Proof (Step 1):
Let F(v) be the function on the unit sphere (‖v‖ = 1) defined by:

    F(v) = vᵗ·M·v

Then F must have a maximum at some point v0, and at that maximum ∇F(v0) = λ·v0.
Singular Value Decomposition
If F has a maximum at some point v0, then ∇F(v0) = λ·v0.
If w0 is on the sphere, next to v0, then w0 − v0 is nearly perpendicular to v0. And for any small vector w1 perpendicular to v0, the point v0 + w1 is nearly on the sphere.
For values of w0 close to v0, we have:

    F(w0) ≈ F(v0) + ⟨∇F(v0), w0 − v0⟩

For v0 to be a maximum, we must have:

    ⟨∇F(v0), w0 − v0⟩ ≤ 0

for all w0 near v0. Since w0 − v0 can point in either of two opposite directions perpendicular to v0, the inner product must in fact vanish.
Thus, ∇F(v0) must be perpendicular to all vectors that are perpendicular to v0, and hence must itself be a multiple of v0.
Singular Value Decomposition
Proof (Step 1, continued):
F has a maximum at some point v0, where ∇F(v0) = λ·v0.
But for F(v) = vᵗ·M·v with M symmetric, ∇F(v) = 2·M·v, so:

    M·v0 = (λ/2)·v0

⇒ v0 is an eigenvector of M.
Singular Value Decomposition
Proof:
1. Every symmetric matrix has at least one eigenvector v.
2. If v is an eigenvector and w is perpendicular to v then Mw is also perpendicular to v.
Singular Value Decomposition
Proof (Step 2):
If w is perpendicular to v, then ⟨v, w⟩ = 0.
Since M is symmetric and M·v = λ·v:

    ⟨v, M·w⟩ = ⟨M·v, w⟩ = λ·⟨v, w⟩ = 0

so that M·w is also perpendicular to v.
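Step 2 is easy to check numerically on a random symmetric matrix (the matrix and test vector below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
M = A + A.T                          # a symmetric matrix

lam, V = np.linalg.eigh(M)
v = V[:, 0]                          # a unit eigenvector of M

w = rng.standard_normal(4)
w -= np.inner(w, v) * v              # make w perpendicular to v

# <v, Mw> = <Mv, w> = lambda <v, w> = 0, so Mw is also perpendicular to v.
assert np.isclose(np.inner(v, w), 0.0)
assert np.isclose(np.inner(v, M @ w), 0.0)
```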