Spectral Algorithms I
TRANSCRIPT
![Page 1: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/1.jpg)
Spectral Algorithms I
Slides based on “Spectral Mesh Processing” Siggraph 2010 course
![Page 2: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/2.jpg)
Why Spectral?
A different way to look at functions on a domain
![Page 3: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/3.jpg)
Why Spectral?
Better representations lead to simpler solutions
![Page 4: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/4.jpg)
A Motivating Application: Shape Correspondence
• Rigid alignment is easy; what about different poses?
• Spectral transform normalizes shape pose
Rigid alignment
![Page 5: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/5.jpg)
Spectral Geometry Processing
Use eigen-structure of "well-behaved" linear operators for geometry processing
![Page 6: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/6.jpg)
Eigen-structure
• Eigenvectors and eigenvalues: Au = λu, u ≠ 0
• Diagonalization or eigen-decomposition: A = UΛU^T
• Projection into eigen-subspace: y′ = U_(k) U_(k)^T y
• DFT-like spectral transform: ŷ = U^T y
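A minimal NumPy sketch of these three operations (toy symmetric matrix; all variable names are ours):

```python
import numpy as np

# Toy symmetric positive semi-definite operator (illustrative only)
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T

# Eigen-decomposition A = U Λ U^T (eigh returns eigenvalues in ascending order)
lam, U = np.linalg.eigh(A)
assert np.allclose(U @ np.diag(lam) @ U.T, A)

# DFT-like spectral transform of a signal y
y = rng.standard_normal(5)
y_hat = U.T @ y

# Projection into the eigen-subspace of the k leading eigenvectors
k = 3
Uk = U[:, -k:]              # columns for the k largest eigenvalues
y_proj = Uk @ (Uk.T @ y)
```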
![Page 7: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/7.jpg)
Eigen-structure
eigen-decomposition: A = UΛU^T, where A is a positive definite matrix
subspace projection: y′ = U_(3) U_(3)^T y, where y is the mesh geometry
![Page 8: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/8.jpg)
Eigen-structure
eigen-decomposition: A = UΛU^T, where A is a positive definite matrix
spectral transform: ŷ = U^T y, where y is the mesh geometry
![Page 9: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/9.jpg)
Classification of Applications
• Eigenstructure(s) used
– Eigenvalues: signature for shape characterization
– Eigenvectors: form spectral embedding (a transform)
– Eigenprojection: also a transform, DFT-like
• Dimensionality of spectral embeddings
– 1D: mesh sequencing
– 2D or 3D: graph drawing or mesh parameterization
– Higher D: clustering, segmentation, correspondence
• Mesh operator used
– Laplace-Beltrami, distance matrix, other
– Combinatorial vs. geometric
– 1st-order vs. higher order
– Normalized vs. un-normalized
![Page 10: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/10.jpg)
Operators?
• Best
– Symmetric positive definite operator: x^T A x > 0 for any x ≠ 0
• Can live with
– Semi-positive definite (x^T A x ≥ 0 for any x)
– Non-symmetric, as long as the eigenvalues are real and positive, e.g., L = DW, where W is SPD and D is diagonal
• Beware of
– Non-square operators
– Complex eigenvalues
– Negative eigenvalues
![Page 11: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/11.jpg)
Spectral Processing: Perspectives
• Signal processing
– Filtering and compression
– Relation to discrete Fourier transform (DFT)
• Geometric
– Global and intrinsic
• Machine learning
– Dimensionality reduction
![Page 12: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/12.jpg)
The smoothing problem
Smooth out rough features of a contour (2D shape)
![Page 13: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/13.jpg)
Laplacian smoothing
Move each vertex towards the centroid of its neighbours
Here:
• Centroid = midpoint
• Move half way
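One way to sketch this step in code (assuming a closed contour stored as an n×2 array; the function name and the parameter t are ours):

```python
import numpy as np

def laplacian_smooth(P, steps=1, t=0.5):
    # Move each vertex a fraction t toward the midpoint of its two
    # neighbours on a closed contour (periodic boundary via np.roll).
    P = np.asarray(P, dtype=float).copy()
    for _ in range(steps):
        mid = 0.5 * (np.roll(P, 1, axis=0) + np.roll(P, -1, axis=0))
        P = (1 - t) * P + t * mid
    return P
```

With t = 0.5 each vertex moves half way to the midpoint of its neighbours, as on the slide.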
![Page 14: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/14.jpg)
Laplacian smoothing and the Laplacian
• Local averaging
• 1D discrete Laplacian
![Page 15: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/15.jpg)
Smoothing result
• Obtained by 10 steps of Laplacian smoothing
![Page 16: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/16.jpg)
Signal representation
• Represent a contour using a discrete periodic 2D signal
x-coordinates of the seahorse contour
![Page 17: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/17.jpg)
Laplacian smoothing in matrix form
x component only; y treated the same way. Smoothing operator S.
![Page 18: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/18.jpg)
1D discrete Laplacian operator
Smoothing and Laplacian operator
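The two operators are related by S = I − L/2; a quick numerical check with an explicitly built periodic Laplacian (toy size, our construction):

```python
import numpy as np

n = 8
I = np.eye(n)
# 1D periodic discrete Laplacian: (L x)_i = x_i - (x_{i-1} + x_{i+1}) / 2
L = I - 0.5 * (np.roll(I, 1, axis=0) + np.roll(I, -1, axis=0))
# Smoothing operator: move each vertex half way toward the neighbour midpoint
S = I - 0.5 * L

x = np.random.default_rng(1).standard_normal(n)
# One smoothing step is a weighted local average
assert np.allclose(S @ x, 0.25 * np.roll(x, 1) + 0.5 * x + 0.25 * np.roll(x, -1))
```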
![Page 19: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/19.jpg)
Spectral analysis of signal/geometry
Express the signal X as a linear sum of eigenvectors: X = E X̂
DFT-like spectral transform: X̂ = E^T X (project X along each eigenvector)
Spatial domain ↔ spectral domain
![Page 20: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/20.jpg)
Plot of eigenvectors
First 8 eigenvectors of the 1D periodic Laplacian
More oscillation as eigenvalues (frequencies) increase
![Page 21: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/21.jpg)
Relation to Discrete Fourier Transform
• Smallest eigenvalue of L is zero
• Each remaining eigenvalue (except for the last one when n is even) has multiplicity 2
• The plotted real eigenvectors are not unique to L
• One particular set of eigenvectors of L is the DFT basis
• Both sets exhibit similar oscillatory behaviour w.r.t. frequencies
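These properties can be verified numerically; here the columns of the unitary DFT matrix diagonalize the periodic Laplacian L, whose eigenvalues are 1 − cos(2πk/n) (a sketch, with L built explicitly):

```python
import numpy as np

n = 8
I = np.eye(n)
# Circulant periodic Laplacian: (L x)_i = x_i - (x_{i-1} + x_{i+1}) / 2
L = I - 0.5 * (np.roll(I, 1, axis=0) + np.roll(I, -1, axis=0))

# Columns of the (unitary) DFT matrix are eigenvectors of L,
# with eigenvalues 1 - cos(2 pi k / n)
F = np.fft.fft(np.eye(n)) / np.sqrt(n)
lam = 1 - np.cos(2 * np.pi * np.arange(n) / n)
assert np.allclose(L @ F, F * lam)   # L f_k = lambda_k f_k for every column
```

Note lam[0] is zero and the interior eigenvalues come in equal pairs (lam[k] = lam[n−k]), matching the multiplicity-2 statement above.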
![Page 22: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/22.jpg)
Reconstruction and compression
• Reconstruction using the k leading coefficients: X ≈ E_(k) X̂_(k)
• A form of spectral compression, with the information loss given by the L2 norm of the discarded coefficients
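A sketch of such a reconstruction on a toy periodic signal (the signal and the choice k = 16 are ours); by Parseval, the L2 reconstruction error is exactly the norm of the discarded coefficients:

```python
import numpy as np

# Toy periodic contour coordinate (assumed setup)
n = 64
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
x = np.cos(t) + 0.1 * np.cos(7 * t)

I = np.eye(n)
L = I - 0.5 * (np.roll(I, 1, axis=0) + np.roll(I, -1, axis=0))
lam, E = np.linalg.eigh(L)        # ascending: low frequencies first

x_hat = E.T @ x                   # spectral transform
k = 16
x_rec = E[:, :k] @ x_hat[:k]      # keep the k leading (low-frequency) coefficients

# L2 error equals the energy in the dropped coefficients
err = np.linalg.norm(x - x_rec)
assert np.isclose(err, np.linalg.norm(x_hat[k:]))
```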
![Page 23: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/23.jpg)
Plot of spectral transform coefficients
• Fairly fast decay as the eigenvalue increases
![Page 24: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/24.jpg)
Reconstruction examples
n = 401
n = 75
![Page 25: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/25.jpg)
Laplacian smoothing as filtering
• Recall the Laplacian smoothing operator S
• Repeated application of S is a filter applied to the spectral coefficients
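Since S = I − L/2 shares eigenvectors with L, applying S m times multiplies each spectral coefficient by (1 − λ/2)^m; a numerical check on a toy Laplacian:

```python
import numpy as np

n = 32
I = np.eye(n)
L = I - 0.5 * (np.roll(I, 1, axis=0) + np.roll(I, -1, axis=0))
S = I - 0.5 * L
lam, E = np.linalg.eigh(L)

x = np.random.default_rng(2).standard_normal(n)
m = 10
x_smoothed = np.linalg.matrix_power(S, m) @ x

# Equivalent: apply the filter f(lambda) = (1 - lambda/2)^m to the
# spectral coefficients, then transform back
x_filtered = E @ ((1 - lam / 2) ** m * (E.T @ x))
assert np.allclose(x_smoothed, x_filtered)
```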
![Page 26: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/26.jpg)
Examples
Filter:
m = 1 m = 5 m = 10 m = 50
![Page 27: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/27.jpg)
Computational issues
• No need to compute spectral coefficients for filtering
– Polynomial filters (e.g., Laplacian): matrix-vector multiplications
• Spectral compression needs the explicit spectral transform
• Efficient computation [Levy et al. 08]
![Page 28: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/28.jpg)
Towards spectral mesh transform
• Signal representation
– Vectors of x, y, z vertex coordinates
• Laplacian operator for meshes
– Encodes connectivity and geometry
– Combinatorial: graph Laplacians and variants
– Discretization of the continuous Laplace‐Beltrami operator
• The same kind of spectral transform and analysis
![Page 29: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/29.jpg)
Spectral Mesh Compression
![Page 30: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/30.jpg)
Spectral Processing: Perspectives
• Signal processing
– Filtering and compression
– Relation to discrete Fourier transform (DFT)
• Geometric
– Global and intrinsic
• Machine learning
– Dimensionality reduction
![Page 31: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/31.jpg)
A geometric perspective: classical
Classical Euclidean geometry
– Primitives not represented in coordinates
– Geometric relationships deduced in a pure and self-contained manner
– Use of axioms
![Page 32: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/32.jpg)
A geometric perspective: analytic
Descartes' analytic geometry
– Algebraic analysis tools introduced
– Primitives referenced in a global frame: an extrinsic approach
![Page 33: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/33.jpg)
Intrinsic approach
Riemann's intrinsic view of geometry
– Geometry viewed purely from the surface perspective
– Metric: "distance" between points on the surface
– Many spaces (shapes) can be treated simultaneously: isometry
![Page 34: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/34.jpg)
Spectral methods: intrinsic view
The spectral approach takes the intrinsic view
– Intrinsic geometric/mesh information captured via a linear mesh operator
– Eigenstructures of the operator present the intrinsic geometric information in an organized manner
– Rarely need all eigenstructures; the dominant ones often suffice
![Page 35: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/35.jpg)
Capture of global information
(Courant–Fischer) Let S ∈ ℝ^(n×n) be a symmetric matrix. Then its eigenvalues λ_1 ≤ λ_2 ≤ … ≤ λ_n must satisfy

λ_i = min (v^T S v) / (v^T v), over v ≠ 0 with v^T v_k = 0 for all 1 ≤ k ≤ i−1,

where v_1, v_2, …, v_{i−1} are eigenvectors of S corresponding to the smallest eigenvalues λ_1, λ_2, …, λ_{i−1}, respectively.
![Page 36: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/36.jpg)
Interpretation
The Rayleigh quotient of v is (v^T S v) / (v^T v), and

λ_i = min { (v^T S v) / (v^T v) : v ≠ 0, v^T v_k = 0 for 1 ≤ k ≤ i−1 }
• The eigenvector of the smallest eigenvalue minimizes the Rayleigh quotient
• The k-th smallest eigenvector minimizes the Rayleigh quotient among the vectors orthogonal to all previous eigenvectors
• Solutions to global optimization problems
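A randomized sanity check of these extremal properties on a toy SPD matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((6, 6))
S = B @ B.T                       # toy symmetric positive definite matrix
lam, V = np.linalg.eigh(S)        # ascending eigenvalues

def rayleigh(S, v):
    return (v @ S @ v) / (v @ v)

# The smallest eigenvalue lower-bounds the Rayleigh quotient everywhere
for _ in range(100):
    v = rng.standard_normal(6)
    assert rayleigh(S, v) >= lam[0] - 1e-9

# Among vectors orthogonal to v1, the bound tightens to lam[1]
v1 = V[:, 0]
for _ in range(100):
    v = rng.standard_normal(6)
    v -= (v @ v1) * v1            # project out v1
    assert rayleigh(S, v) >= lam[1] - 1e-9
```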
![Page 37: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/37.jpg)
Use of eigenstructures
• Eigenvalues
– Spectral graph theory: graph eigenvalues are closely related to almost all major global graph invariants
– Have been adopted as compact global shape descriptors
• Eigenvectors
– Useful extremal properties, e.g., heuristics for NP-hard problems: normalized cuts and sequencing
– Spectral embeddings capture global information, e.g., clustering
![Page 38: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/38.jpg)
Example: clustering problem
![Page 39: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/39.jpg)
Example: clustering problem
![Page 40: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/40.jpg)
Spectral clustering
Encode information about pairwise point affinities:

A_ij = exp( −||p_i − p_j||² / (2σ²) )

Input data → operator A → leading eigenvectors → spectral embedding
![Page 41: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/41.jpg)
Spectral clustering
Perform any clustering (e.g., k-means) in the spectral domain
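A compact sketch of the whole pipeline on toy data (two Gaussian blobs; the value of σ and the nearest-seed assignment, standing in for a full k-means, are our choices):

```python
import numpy as np

rng = np.random.default_rng(4)
# Two well-separated 2D blobs (toy input data)
P = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])

# Affinity matrix A_ij = exp(-||p_i - p_j||^2 / (2 sigma^2))
sigma = 1.0
D2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)
A = np.exp(-D2 / (2 * sigma ** 2))

# Spectral embedding: rows formed from the two leading eigenvectors of A
lam, U = np.linalg.eigh(A)
Z = U[:, -2:]

# Stand-in for k-means: assign each point to the nearer of two seed
# points (the first and last sample) in the spectral domain
d0 = np.linalg.norm(Z - Z[0], axis=1)
d1 = np.linalg.norm(Z - Z[-1], axis=1)
labels = (d1 < d0).astype(int)
```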
![Page 42: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/42.jpg)
Why does it work this way?
Linkage-based clustering (local information) vs. spectral clustering (spectral domain)
![Page 43: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/43.jpg)
Local vs. global distances
• A good distance: points in the same cluster are closer in the transformed domain
• Look at the set of shortest paths: more global
• Commute-time distance: c_ij = expected time for a random walk to go from i to j and then back to i
• It would be nice to cluster according to c_ij
![Page 44: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/44.jpg)
Local vs. global distances
In spectral domain
![Page 45: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/45.jpg)
Commute time and spectral
• Eigen-decompose the graph Laplacian K: K = UΛU^T
• Let K′ be the generalized inverse of K: K′ = UΛ′U^T, where Λ′_ii = 1/Λ_ii if Λ_ii ≠ 0, otherwise Λ′_ii = 0
• Note: the Laplacian is singular
![Page 46: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/46.jpg)
Commute time and spectral
• Let z_i be the i-th row of UΛ′^(1/2): the spectral embedding
– Scales each eigenvector by the inverse square root of its eigenvalue
• Then ||z_i − z_j||² = c_ij, the commute-time distance [Klein & Randic 93, Fouss et al. 06]
• The full set of eigenvectors is used, but select the first k in practice
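A numerical check on a small graph (our toy adjacency); the squared embedding distance reproduces K′_ii + K′_jj − 2K′_ij, which gives the commute-time distance up to the convention-dependent graph-volume factor:

```python
import numpy as np

W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)      # toy adjacency matrix
K = np.diag(W.sum(1)) - W                      # graph Laplacian (singular)

lam, U = np.linalg.eigh(K)
# Generalized inverse of the eigenvalues: 1/lambda where nonzero, else 0
lam_inv = np.where(lam > 1e-10, 1.0 / np.maximum(lam, 1e-10), 0.0)
Kp = U @ np.diag(lam_inv) @ U.T                # generalized inverse of K

Z = U @ np.diag(np.sqrt(lam_inv))              # rows z_i of U Lambda'^(1/2)
i, j = 0, 3
d2 = np.sum((Z[i] - Z[j]) ** 2)
assert np.isclose(d2, Kp[i, i] + Kp[j, j] - 2 * Kp[i, j])
```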
![Page 47: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/47.jpg)
Example: intrinsic geometry
Our first example: correspondence. Spectral transform to handle shape pose
Rigid alignment
![Page 48: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/48.jpg)
Spectral Processing: Perspectives
• Signal processing
– Filtering and compression
– Relation to discrete Fourier transform (DFT)
• Geometric
– Global and intrinsic
• Machine learning
– Dimensionality reduction
![Page 49: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/49.jpg)
Spectral embedding
• Spectral decomposition: A = UΛU^T
• The full spectral embedding, given by the scaled eigenvectors (each scaled by the square root of its eigenvalue), completely captures the operator:

W = UΛ^(1/2), with W W^T = A
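A quick check that the full scaled embedding reproduces the operator (toy PSD matrix):

```python
import numpy as np

rng = np.random.default_rng(6)
B = rng.standard_normal((5, 5))
A = B @ B.T                          # toy symmetric PSD operator
lam, U = np.linalg.eigh(A)
lam = np.clip(lam, 0.0, None)        # clip tiny negative round-off

W = U * np.sqrt(lam)                 # scale each eigenvector by sqrt(eigenvalue)
assert np.allclose(W @ W.T, A)       # the full embedding captures A exactly
```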
![Page 50: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/50.jpg)
Dimensionality reduction
• Full spectral embedding is high-dimensional
• Use a few dominant eigenvectors: dimensionality reduction
– Information-preserving
– Structure enhancement (Polarization Theorem)
– Low-D representation: simplifying solutions
![Page 51: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/51.jpg)
Eckart & Young: info-preserving
• A ∈ ℝ^(n×n): symmetric and positive semi-definite
• U_(k) ∈ ℝ^(n×k): the leading eigenvectors of A, scaled by the square roots of the eigenvalues
• Then U_(k)U_(k)^T is the best rank-k approximation of A in the Frobenius norm
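A numerical check of the theorem on a toy PSD matrix; the Frobenius error of the rank-k approximation equals the norm of the dropped eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((6, 6))
A = B @ B.T                                   # symmetric PSD
lam, U = np.linalg.eigh(A)                    # ascending eigenvalues
lam = np.clip(lam, 0.0, None)                 # clip round-off negatives

k = 2
Uk = U[:, -k:] * np.sqrt(lam[-k:])            # leading eigenvectors, scaled
Ak = Uk @ Uk.T                                # rank-k approximation of A

# Frobenius error equals the norm of the dropped eigenvalues
err = np.linalg.norm(A - Ak)
assert np.isclose(err, np.linalg.norm(lam[:-k]))
```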
![Page 52: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/52.jpg)
Brand & Huang: Polarization Theorem
![Page 53: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/53.jpg)
Low-dim → simpler problems
• Mesh projected into the eigenspace formed by the first two eigenvectors of a mesh Laplacian
• Reduce 3D analysis to contour analysis [Liu & Zhang 07]
![Page 54: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/54.jpg)
Challenges: not quite DFT
• The basis for DFT is fixed given n: regular and easy to compare (Fourier descriptors)
• The spectral mesh transform is operator-dependent
• Different behavior of eigenfunctions on the same sphere
• Which operator to use?
![Page 55: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/55.jpg)
Challenges: no free lunch
• No mesh Laplacian on general meshes can satisfy a list of all desirable properties
• Remedy: use nice meshes (Delaunay or non-obtuse)
Delaunay but obtuse vs. non-obtuse
![Page 56: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/56.jpg)
Additional issues
• Computational issues: FFT vs. eigen-decomposition
• Regularity of vibration patterns lost
– Difficult to characterize eigenvectors; the eigenvalue alone is not enough
– Non-trivial to compare two sets of eigenvectors: how to pair up?
![Page 57: Spectral Algorithms I](https://reader031.vdocument.in/reader031/viewer/2022012105/61dc0755927dcb226e0af43e/html5/thumbnails/57.jpg)
Conclusion
Use eigen-structure of "well-behaved" linear operators for geometry processing
Solve problem in a different domain via a spectral transform
Fourier analysis on meshes
Dimensionality reduction: effective and simplifying
Captures global and intrinsic shape characteristics