
Statistical Estimation in the Presence of Group Actions

Alex Wein, MIT Mathematics

In memoriam

Amelia Perry, 1991 – 2018

My research interests

- Statistical and computational limits of average-case inference problems (signal planted in random noise)
  - Community detection (stochastic block model)
  - Spiked matrix/tensor problems
  - Synchronization / group actions (today)
- Connections to...
  - Statistical physics: phase transitions (easy, hard, impossible)
  - Algebra: group theory, representation theory, invariant theory
- Today: problems involving group actions
  - A meeting point of statistics, algebra, signal processing, computer science, statistical physics, ...


Motivation: cryo-electron microscopy (cryo-EM)

[Image credit: Singer, Shkolnisky '11]

- Biological imaging method: determine the structure of a molecule
- 2017 Nobel Prize in Chemistry
- Given many noisy 2D images of a 3D molecule, taken from different unknown angles
- Goal: reconstruct the 3D structure of the molecule
- Group action by SO(3) (rotations in 3D)


Other examples

Other problems involving random group actions:

- Image registration (group: SO(2), 2D rotations)
  [Image credit: Bandeira, PhD thesis '15]
- Multi-reference alignment (group: Z/p, cyclic shifts)
  [Image credit: Jonathan Weed]
- Applications: computer vision, radar, structural biology, robotics, geology, paleontology, ...
- Methods used in practice often lack provable guarantees...


Part I: Synchronization

Synchronization problems

The synchronization approach [1]: learn the group elements

- Fix a group G (e.g. SO(3))
- g ∈ G^n: vector of unknown group elements (e.g. the rotation of each image)
- Given pairwise information: for each i < j, a noisy measurement of g_i g_j^{-1} (e.g. by comparing two images)
- Goal: recover g up to global right-multiplication
  - can't distinguish (g_1, ..., g_n) from (g_1 h, ..., g_n h)

In cryo-EM: once you learn the rotations, it is possible to reconstruct a de-noised model of the molecule [2]

[1] Singer '11
[2] Singer, Shkolnisky '11

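To make the setup and the global ambiguity concrete, here is a minimal numpy sketch (not from the slides) for the cyclic group G = Z/p written additively, where the pairwise quantity g_i g_j^{-1} becomes a difference mod p; noise is omitted.

```python
import numpy as np

# Sketch of the synchronization setup for the cyclic group G = Z/p:
# group elements are integers mod p, and the pairwise quantity g_i g_j^{-1}
# is the difference (g_i - g_j) mod p.  The global ambiguity is visible
# directly: right-multiplying every g_i by the same h (here: adding h mod p)
# leaves all pairwise measurements unchanged.
rng = np.random.default_rng(0)
p, n = 12, 6
g = rng.integers(0, p, size=n)            # unknown group elements
h = rng.integers(0, p)                    # arbitrary global shift

pairwise = (g[:, None] - g[None, :]) % p                     # noiseless g_i g_j^{-1}
pairwise_shifted = ((g + h)[:, None] - (g + h)[None, :]) % p
print(np.array_equal(pairwise, pairwise_shifted))            # True: g and g*h look identical
```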

A simple model: Gaussian Z/2 synchronization

- G = Z/2 = {±1}
- True signal x ∈ {±1}^n (vector of group elements)
- For each i, j, observe x_i x_j + N(0, σ²)
- Specifically, observe the n × n matrix Y = (λ/n) x x^T + (1/√n) W (signal + noise)
  - λ ≥ 0: signal-to-noise parameter
  - W: random noise matrix, symmetric with N(0, 1) entries
  - Y_ij is a noisy measurement of x_i x_j (same/diff)
  - Normalization: MMSE is a constant (depending on λ)

This is a spiked Wigner model: in general x_i ∼ P (some prior)

Statistical physics makes extremely precise (non-rigorous) predictions about this type of problem
- Often later proved correct


A simple model: Gaussian Z/2 synchronization

- G = Z/2 = {±1}
- True signal x ∈ {±1}^n (vector of group elements)
- Observe the n × n matrix Y = (λ/n) x x^T + (1/√n) W

[Image credit: Deshpande, Abbe, Montanari '15]
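As a quick numerical illustration (not from the slides), a minimal numpy sketch generating Y = (λ/n) x x^T + (1/√n) W and checking that each off-diagonal entry is a weakly informative "same/diff" measurement of x_i x_j; the symmetric-noise convention used for W below is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 1000, 1.5
x = rng.choice([-1.0, 1.0], size=n)               # true signal in {+1, -1}^n
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2)                        # symmetric noise, N(0,1) off-diagonal entries
Y = (lam / n) * np.outer(x, x) + W / np.sqrt(n)   # observed matrix: rank-one signal + noise

# Each Y_ij is a very noisy measurement of x_i x_j ("same/diff"):
iu = np.triu_indices(n, k=1)
print(np.corrcoef(Y[iu], np.outer(x, x)[iu])[0, 1])   # small but clearly positive correlation
```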

Statistical physics and inference

What does statistical physics have to do with Bayesian inference?

In inference, observe Y = (λ/n) x x^T + (1/√n) W and want to infer x

Posterior distribution: Pr[x | Y] ∝ exp(λ x^T Y x)

In physics, this is called a Boltzmann/Gibbs distribution:

Pr[x] ∝ exp(−β H(x))

- Energy ("Hamiltonian"): H(x) = −x^T Y x
- Inverse temperature: β = λ

So the posterior distribution of Bayesian inference obeys the same equations as a disordered physical system (e.g. magnet, spin glass)

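For completeness, a short derivation sketch of this Gibbs form under the Gaussian observation model above with a uniform prior on {±1}^n; the precise constant in the exponent (λ vs. λ/2) depends on how the symmetric entries are counted.

```latex
\Pr[x \mid Y] \;\propto\; \Pr[Y \mid x]
 \;\propto\; \exp\!\Big(-\tfrac{n}{2}\sum_{i<j}\big(Y_{ij}-\tfrac{\lambda}{n}\,x_i x_j\big)^{2}\Big)
 \;\propto\; \exp\!\Big(\lambda \sum_{i<j} Y_{ij}\, x_i x_j\Big)
 \;=\; \exp\!\Big(\tfrac{\lambda}{2}\, x^{\top} Y x + \mathrm{const}\Big)
```

using that (λ x_i x_j / n)² is constant because x_i² = 1; this is exp(−β H(x)) with H(x) = −x^T Y x and β proportional to λ.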

BP and AMP

"Axiom" from statistical physics: the best algorithm for every* problem is BP (belief propagation) [1]

- Each unknown x_i is a "node"
- Each observation ("interaction") Y_ij is an "edge"
  - In our case, a complete graph
- Nodes iteratively pass "messages" or "beliefs" to each other along edges, and then update their own beliefs
- Hard to analyze

In our case (since interactions are "dense"), we can use a simplification of BP called AMP (approximate message passing) [2]

- Easy/possible to analyze
- Provably optimal mean squared error for many problems

[1] Pearl '82
[2] Donoho, Maleki, Montanari '09


AMP for Z/2 synchronization

Y = (λ/n) x x^T + (1/√n) W,   x ∈ {±1}^n

AMP algorithm:

- State v ∈ R^n: estimate for x
- Initialize v to a small random vector
- Repeat:
  1. Power iteration: v ← Y v
  2. Onsager correction: v ← v + [Onsager term]
  3. Entrywise soft projection: v_i ← tanh(λ v_i) (for all i)
     - Resulting values in [−1, 1]

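A minimal numpy sketch of this loop (not from the slides): the Onsager coefficient below uses the standard form for the tanh denoiser, b_t = (λ/n) Σ_i (1 − v_i²), which is an assumption about the bracketed [Onsager term] above.

```python
import numpy as np

def amp_z2_sync(Y, lam, n_iter=50, rng=None):
    """AMP sketch for Z/2 synchronization on Y = (lam/n) x x^T + (1/sqrt(n)) W.

    Follows the three steps on the slide: power iteration, Onsager correction,
    entrywise tanh soft projection.  The Onsager coefficient is the standard
    one for the tanh denoiser (an assumption about the bracketed term above).
    """
    rng = np.random.default_rng() if rng is None else rng
    n = Y.shape[0]
    v = 0.01 * rng.standard_normal(n)        # small random initialization
    v_prev = np.zeros(n)
    for _ in range(n_iter):
        b = lam * np.mean(1.0 - v ** 2)      # Onsager coefficient b_t (assumed form)
        z = Y @ v - b * v_prev               # power iteration + Onsager correction
        v_prev = v
        v = np.tanh(lam * z)                 # entrywise soft projection into [-1, 1]
    return v

# usage on a synthetic instance (lam well above the threshold lam = 1)
rng = np.random.default_rng(1)
n, lam = 2000, 2.0
x = rng.choice([-1.0, 1.0], size=n)
G = rng.standard_normal((n, n))
Y = (lam / n) * np.outer(x, x) + (G + G.T) / np.sqrt(2) / np.sqrt(n)
v = amp_z2_sync(Y, lam)
print(abs(v @ x) / n)                        # overlap with the truth (up to global sign), close to 1
```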

AMP is optimal

Y = (λ/n) x x^T + (1/√n) W,   x ∈ {±1}^n

For Z/2 synchronization, AMP is provably optimal.

[Deshpande, Abbe, Montanari '15]

Free energy landscapes

What do physics predictions look like?

f(γ) = (1/λ) [ −(λ²/4)(γ²/λ⁴ + 1) + (γ/2)(γ/λ² + 1) − E_{z∼N(0,1)} log(2 cosh(γ + √γ z)) ]

- x-axis γ: correlation with true signal (related to MSE)
- y-axis f: free energy, AMP's "objective function" (to be minimized)
- AMP: gradient descent starting from γ = 0 (left side)
- STAT (statistical): global minimum

So this yields the computational and statistical MSE for each λ

[Lesieur, Krzakala, Zdeborová '15]

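For the Z/2 model, the stationarity condition of this free energy is (in the parameterization used here) the state evolution recursion γ ← λ² E_{z∼N(0,1)}[tanh(γ + √γ z)]. A minimal numerical sketch (not from the slides) that runs this recursion from a tiny γ, mimicking AMP's uninformative start:

```python
import numpy as np

def z2_state_evolution(lam, gamma0=1e-6, n_iter=500, n_nodes=101):
    """Iterate gamma <- lam^2 * E_z[tanh(gamma + sqrt(gamma) * z)], z ~ N(0,1).

    The Gaussian expectation is computed by Gauss-Hermite quadrature.
    Returns the limiting correlation m = gamma / lam^2 with the true signal.
    """
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    z = np.sqrt(2.0) * nodes              # change of variables so that z ~ N(0,1)
    w = weights / np.sqrt(np.pi)
    gamma = gamma0
    for _ in range(n_iter):
        gamma = lam ** 2 * np.sum(w * np.tanh(gamma + np.sqrt(gamma) * z))
    return gamma / lam ** 2

for lam in [0.5, 1.1, 2.0]:
    print(lam, round(z2_state_evolution(lam), 4))   # ~0 below lam = 1, positive above
```

Below λ = 1 the recursion collapses to γ = 0 (the uninformative minimum); above λ = 1 it converges to a positive fixed point.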

Our contributions

Joint work with Amelia Perry, Afonso Bandeira, Ankur Moitra

- Using representation theory, we define a very general Gaussian observation model for synchronization over any compact group
  - Significantly generalizes the Z/2 case
- We give a precise analysis of the statistical and computational limits of this model
  - Uses non-rigorous (but well-established) ideas from statistical physics
  - Methods proven correct in related settings
- Includes an AMP algorithm which we believe is optimal among all polynomial-time algorithms
- Also some rigorous statistical lower and upper bounds

Perry, W., Bandeira, Moitra, "Message-passing algorithms for synchronization problems over compact groups", to appear in CPAM
Perry, W., Bandeira, Moitra, "Optimality and Sub-optimality of PCA for Spiked Random Matrices and Synchronization", part I to appear in Ann. Stat.


Multi-frequency U(1) synchronization

- G = U(1) = {z ∈ C : |z| = 1} (angles)
- True signal x ∈ U(1)^n
- W^(k): complex Gaussian noise (GUE)
- Observe

  Y^(1) = (λ_1/n) x x* + (1/√n) W^(1)
  Y^(2) = (λ_2/n) x^2 (x^2)* + (1/√n) W^(2)
  ...
  Y^(K) = (λ_K/n) x^K (x^K)* + (1/√n) W^(K)

  where x^k means the entry-wise kth power
- This model has information on different frequencies
- Challenge: how to synthesize information across frequencies?

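A minimal numpy sketch (not from the slides) generating these K observations; the Hermitian-noise normalization below is an assumption, and only the overall scaling matters for illustration.

```python
import numpy as np

def u1_multifrequency_observations(n, lams, rng=None):
    """Generate Y^(k) = (lam_k/n) x^k (x^k)* + (1/sqrt(n)) W^(k) for k = 1..K.

    A minimal sketch: x has i.i.d. uniform entries on the unit circle, and
    each W^(k) is an independent Hermitian complex Gaussian matrix (the exact
    GUE normalization is an assumption).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.exp(2j * np.pi * rng.random(n))           # true signal in U(1)^n
    Ys = []
    for k, lam in enumerate(lams, start=1):
        xk = x ** k                                   # entrywise k-th power
        A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        W = (A + A.conj().T) / 2                      # Hermitian noise
        Ys.append((lam / n) * np.outer(xk, xk.conj()) + W / np.sqrt(n))
    return x, Ys

x, Ys = u1_multifrequency_observations(n=500, lams=[2.0, 1.5, 0.8], rng=np.random.default_rng(1))
```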

AMP for U(1) synchronization

Y^(k) = (λ_k/n) x^k (x^k)* + (1/√n) W^(k)   for k = 1, ..., K

Algorithm's state: v^(k) ∈ C^n for each frequency k

- v^(k) is an estimate of (x_1^k, ..., x_n^k)

AMP algorithm:

- Power iteration (separately on each frequency): v^(k) ← Y^(k) v^(k)
- "Soft projection" (separately on each index i): v_i^(·) ← F(v_i^(·))
  - This synthesizes the frequencies in a non-trivial way
- Onsager correction term

Analysis of AMP:

- Exact expression for AMP's MSE (as n → ∞) as a function of λ_1, ..., λ_K
- Also, an exact expression for the statistically optimal MSE


Page 83: Statistical Estimation in the Presence of Group Actions

Results for U(1) synchronization

Y^(k) = (λ_k/n) x^k (x^k)^* + (1/√n) W^(k)   for k = 1, . . . , K

I Single frequency: given Y^(k), can non-trivially estimate x^k iff λ_k > 1

I Information-theoretically, with λ_1 = · · · = λ_K = λ, need λ ∼ √(log K / K) (for large K)

I But AMP (and conjecturally, any poly-time algorithm) requires λ_k > 1 for some k

I Computationally hard to synthesize sub-critical (λ ≤ 1) frequencies

I But once above the λ = 1 threshold, adding frequencies helps reduce MSE of AMP

18 / 39

Page 89: Statistical Estimation in the Presence of Group Actions

Results for U(1) synchronization

[Figure — solid: AMP (n = 100), K = number of frequencies; dotted: theoretical prediction (n → ∞); same λ on each frequency.]

Image credit: Perry, W., Bandeira, Moitra, Message-passing algorithms for synchronization problems over compact groups, to appear in CPAM

19 / 39

Page 90: Statistical Estimation in the Presence of Group Actions

General groups

All of the above extends to any compact group

I E.g. any finite group; SO(3)

How to even define the model?

I Need to add "noise" to a group element g_i g_j^{-1}

Answer: Use representation theory to represent a group element as a matrix (and then add Gaussian noise)

I A representation ρ of G is a way to assign a matrix ρ(g) to each g ∈ G

I Formally, a homomorphism ρ : G → GL(C^d) = d × d invertible matrices

Frequencies are replaced by irreducible representations of G

I Fourier theory for functions G → C

For U(1), 1D irreducible representation for each k: ρ_k(g) = g^k

20 / 39
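For a concrete instance of "representation plus Gaussian noise", the sketch below (my own toy example, with made-up sizes and SNRs) builds the observation matrices for the finite cyclic group G = Z/L, whose irreducible representations are the one-dimensional characters ρ_k(g) = exp(2πi k g / L), in direct analogy with the U(1) frequencies ρ_k(g) = g^k.

```python
import numpy as np

# Synchronization observations over G = Z/L via its 1-dimensional irreducible
# representations rho_k(g) = exp(2*pi*i*k*g/L). Entry (i, j) of the signal part is
# rho_k(g_i) * conj(rho_k(g_j)) = rho_k(g_i - g_j), i.e. a noisy view of the relative
# group element g_i g_j^{-1} through the k-th representation.

def cyclic_sync_observations(n=100, L=6, lams=(2.0, 1.5), seed=0):
    rng = np.random.default_rng(seed)
    g = rng.integers(0, L, size=n)                        # hidden group elements in Z/L
    Ys = []
    for k, lam in enumerate(lams, start=1):
        xk = np.exp(2j * np.pi * k * g / L)               # rho_k(g_i) for each i
        W = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
        W = (W + W.conj().T) / np.sqrt(2)                 # Hermitian Gaussian noise
        Ys.append((lam / n) * np.outer(xk, xk.conj()) + W / np.sqrt(n))
    return Ys, g

Ys, g = cyclic_sync_observations()
print(len(Ys), Ys[0].shape)                               # K observation matrices, each n x n
```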

Page 96: Statistical Estimation in the Presence of Group Actions

Part II: Orbit Recovery

21 / 39

Page 97: Statistical Estimation in the Presence of Group Actions

Back to cryo-EM

Image credit: [Singer, Shkolnisky '11]

Synchronization is not the ideal model for cryo-EM

I The synchronization approach disregards the underlying signal (the molecule)

I Our Gaussian synchronization model assumes independent noise on each pair i, j of images, whereas actually there is independent noise on each image

I For high noise, it is impossible to reliably recover the rotations

I So we should not try to estimate the rotations!

22 / 39

Page 102: Statistical Estimation in the Presence of Group Actions

Orbit recovery problem

Let G be a compact group acting linearly and continuously on a finite-dimensional real vector space V = R^p.

I Compact: e.g. any finite group, SO(2), SO(3)

I Linear: ρ : G → GL(V) = invertible p × p matrices (homomorphism)

I Action: g · x = ρ(g)x for g ∈ G, x ∈ V

I Continuous: ρ is continuous

23 / 39

Page 107: Statistical Estimation in the Presence of Group Actions

Orbit recovery problem

Let G be a compact group acting linearly and continuously on a finite-dimensional real vector space V = R^p.

Unknown signal x ∈ V (e.g. the molecule)

For i = 1, . . . , n observe y_i = g_i · x + ε_i where. . .

I g_i ∼ Haar(G) ("uniform distribution" on G)

I ε_i ∼ N(0, σ²I_p) (noise)

Goal: Recover some x̃ in the orbit {g · x : g ∈ G} of x

24 / 39
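A minimal sketch of this observation model, instantiated (my choice, purely for concreteness) with G = Z/p acting on R^p by cyclic shifts, so that g · x is simply np.roll(x, g):

```python
import numpy as np

# Observation model y_i = g_i . x + eps_i with G = Z/p acting on R^p by cyclic shifts.
# The signal, noise level, and sizes below are arbitrary illustration choices.

def sample_observations(x, n, sigma, seed=0):
    rng = np.random.default_rng(seed)
    p = len(x)
    shifts = rng.integers(0, p, size=n)                   # g_i ~ Haar(Z/p) = uniform shift
    noise = sigma * rng.standard_normal((n, p))           # eps_i ~ N(0, sigma^2 I_p)
    return np.stack([np.roll(x, s) for s in shifts]) + noise

x = np.sin(2 * np.pi * np.arange(10) / 10)                # an "unknown" signal with p = 10
ys = sample_observations(x, n=1000, sigma=1.0)
print(ys.shape)                                           # (1000, 10): shifted noisy copies of x
```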

Page 111: Statistical Estimation in the Presence of Group Actions

Special case: multi-reference alignment (MRA)

G = Z/p acts on R^p via cyclic shifts

For i = 1, . . . , n observe y_i = g_i · x + ε_i with ε_i ∼ N(0, σ²I)

Image credit: Jonathan Weed

Method of invariants [1,2]: measure features of the signal x that are shift-invariant

Degree-1: Σ_i x_i (mean)

Degree-2: Σ_i x_i², x_1x_2 + x_2x_3 + · · · + x_px_1, . . . (autocorrelation)

Degree-3: x_1x_2x_4 + x_2x_3x_5 + . . . (triple correlation)

Invariant features are easy to estimate from the samples

[1] Bandeira, Rigollet, Weed, Optimal rates of estimation for multi-reference alignment, 2017

[2] Perry, Weed, Bandeira, Rigollet, Singer, The sample complexity of multi-reference alignment, 2017

25 / 39
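The last point is worth making concrete. Below is a sketch (my own, in the same cyclic-shift setup as the previous sketch) of plug-in estimates of a few shift-invariant features from the samples; the only subtlety is a known σ² bias correction whenever indices coincide, as in the Σ_i x_i² term.

```python
import numpy as np

# Plug-in estimation of shift-invariant features from n samples y_i = shift(x) + noise.
# Sample averages of the corresponding features of y are consistent estimators; features
# with repeated indices (e.g. sum_i x_i^2) need a known sigma^2 bias subtracted.

rng = np.random.default_rng(0)
p, n, sigma = 10, 200_000, 1.0
x = np.sin(2 * np.pi * np.arange(p) / p)
ys = np.stack([np.roll(x, s) for s in rng.integers(0, p, n)]) + sigma * rng.standard_normal((n, p))

mean_hat = ys.sum(axis=1).mean()                                        # degree 1: sum_i x_i
power_hat = (ys ** 2).sum(axis=1).mean() - p * sigma ** 2               # degree 2: sum_i x_i^2 (bias-corrected)
auto_hat = (ys * np.roll(ys, -1, axis=1)).sum(axis=1).mean()            # degree 2: sum_i x_i x_{i+1} (unbiased)
# degree 3: sum_i x_i x_{i+1} x_{i+2} (unbiased, since the three indices are distinct)
triple_hat = (ys * np.roll(ys, -1, axis=1) * np.roll(ys, -2, axis=1)).sum(axis=1).mean()

# compare with the exact values computed directly from x
exact = [x.sum(), (x ** 2).sum(), (x * np.roll(x, -1)).sum(), (x * np.roll(x, -1) * np.roll(x, -2)).sum()]
print([round(v, 3) for v in (mean_hat, power_hat, auto_hat, triple_hat)])
print([round(v, 3) for v in exact])
```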

Page 124: Statistical Estimation in the Presence of Group Actions

Sample complexity

Theorem [1]:
(Upper bound) With noise level σ, can estimate degree-d invariants using n = O(σ^{2d}) samples.
(Lower bound) If x^(1), x^(2) agree on all invariants of degree ≤ d − 1, then Ω(σ^{2d}) samples are required to distinguish them.

I Method of invariants is optimal

Question: What degree d* of invariants do we need to learn before we can recover (the orbit of) x?

I Optimal sample complexity is n = Θ(σ^{2d*})

Answer (for MRA) [1]:

I For "generic" x, degree 3 is sufficient, so sample complexity n = Θ(σ^6)

I But for a measure-zero set of "bad" signals, need much higher degree (as high as p)

[1] Bandeira, Rigollet, Weed, Optimal rates of estimation for multi-reference alignment, 2017

26 / 39

Page 129: Statistical Estimation in the Presence of Group Actions

Another viewpoint: mixtures of Gaussians

MRA sample: y = g · x + ε with g ∼ G, ε ∼ N(0, σ²I)

The distribution of y is a (uniform) mixture of |G| Gaussians centered at {g · x : g ∈ G}

I For infinite groups, a mixture of infinitely-many Gaussians

Method of moments: Estimate moments E[y], E[yy^T], . . . , E[y^{⊗d}]

From E[y^{⊗k}] one can recover E_g[(g · x)^{⊗k}]

Fact: Moments are equivalent to invariants

I E_g[(g · x)^{⊗k}] contains the same information as the degree-k invariant polynomials

27 / 39
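A quick numerical check (my own illustration) of the second-moment case: the empirical E[y y^T] matches the group-averaged signal moment E_g[(g · x)(g · x)^T] up to the known noise term σ²I, which is exactly why moments of y give access to invariants of x.

```python
import numpy as np

# Second moment of y = g*x + eps for MRA: E[y y^T] = E_g[(g*x)(g*x)^T] + sigma^2 I.
# Sizes and noise level are arbitrary illustration choices.

rng = np.random.default_rng(1)
p, n, sigma = 8, 200_000, 0.5
x = rng.standard_normal(p)

ys = np.stack([np.roll(x, s) for s in rng.integers(0, p, n)]) + sigma * rng.standard_normal((n, p))

M2_emp = ys.T @ ys / n                                                        # empirical E[y y^T]
M2_sig = sum(np.outer(np.roll(x, s), np.roll(x, s)) for s in range(p)) / p    # E_g[(g*x)(g*x)^T]
print(np.abs(M2_emp - (M2_sig + sigma ** 2 * np.eye(p))).max())               # small, -> 0 as n grows
```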

Page 135: Statistical Estimation in the Presence of Group Actions

Our contributions

Joint work with Ben Blum-Smith, Afonso Bandeira, Amelia Perry, Jonathan Weed

I We generalize from MRA to any compact group

I Again, the method of invariants/moments is optimal

I We give an (inefficient) algorithm that achieves optimal sample complexity: solve a polynomial system

I To determine what degree of invariants is required, we use invariant theory and algebraic geometry

I How to tell if polynomial equations have a unique solution

Bandeira, Blum-Smith, Perry, Weed, W., Estimation under group actions: recovering orbits from invariants, 2017

28 / 39

Page 140: Statistical Estimation in the Presence of Group Actions

Invariant theory

Variables x_1, . . . , x_p (corresponding to the coordinates of x)

The invariant ring R[x]^G is the subring of R[x] := R[x_1, . . . , x_p] consisting of polynomials f such that f(g · x) = f(x) ∀g ∈ G.

I Aside: A main result of invariant theory is that R[x]^G is finitely generated

R[x]^G_{≤d} – invariants of degree ≤ d

(Simple) algorithm:

I Pick d* (to be chosen later)

I Using Θ(σ^{2d*}) samples, estimate invariants up to degree d*: learn the value f(x) for all f ∈ R[x]^G_{≤d*}

I Solve for an x̃ that is consistent with those values: f(x̃) = f(x) ∀f ∈ R[x]^G_{≤d*} (polynomial system of equations)

29 / 39
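As a toy stand-in (my own, not the paper's algorithm) for the "solve a polynomial system" step, the sketch below assumes the values of a handful of cyclic-shift invariants are known exactly and searches numerically for a consistent signal with scipy.optimize.least_squares; an exact algebraic solver would replace this local optimization, which can get stuck and may need a few random restarts.

```python
import numpy as np
from scipy.optimize import least_squares

# Recover a signal consistent with given values of shift-invariant polynomials,
# for G = Z/p (cyclic shifts). The invariant values are taken noiselessly from a
# hidden x_true; success is measured up to the orbit (all cyclic shifts) of x_true.

p = 7
rng = np.random.default_rng(0)
x_true = rng.standard_normal(p)

def invariants(x):
    feats = [x.sum()]                                                            # degree 1
    feats += [np.dot(x, np.roll(x, -s)) for s in range(p)]                       # degree 2: autocorrelations
    feats += [np.sum(x * np.roll(x, -1) * np.roll(x, -t)) for t in range(2, p)]  # some degree-3 invariants
    return np.array(feats)

target = invariants(x_true)
sol = least_squares(lambda x: invariants(x) - target, x0=rng.standard_normal(p))
x_hat = sol.x

err = min(np.linalg.norm(np.roll(x_hat, s) - x_true) for s in range(p))  # distance to the orbit
print(f"distance to the orbit of x_true: {err:.1e}")                     # small if the solver converged
```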

Page 145: Statistical Estimation in the Presence of Group Actions

All invariants determine orbit

Theorem [1]: If G is compact, for every x ∈ V, the full invariant ring R[x]^G determines x up to orbit.

I In the sense that if x, x′ do not lie in the same orbit, there exists f ∈ R[x]^G that separates them: f(x) ≠ f(x′)

Corollary: Suppose that for some d, R[x]^G_{≤d} generates R[x]^G (as an R-algebra). Then R[x]^G_{≤d} determines x up to orbit, and so sample complexity is O(σ^{2d}).

Problem: This is for worst-case x ∈ V. For MRA (cyclic shifts) this requires d = p, whereas generic x only requires d = 3 [2].

We actually care about whether R[x]^G_{≤d} generically determines R[x]^G

[1] Kac, Invariant theory lecture notes, 1994

[2] Bandeira, Rigollet, Weed, Optimal rates of estimation for multi-reference alignment, 2017

30 / 39

Page 149: Statistical Estimation in the Presence of Group Actions

Do polynomials generically determine other polynomials?

Say we have A ⊆ B ⊆ R[x]

I (Technically need to assume B is finitely generated)

Question: Do the values {a(x) : a ∈ A} generically determine the values {b(x) : b ∈ B}?

Definition: Polynomials f_1, . . . , f_m are algebraically independent if there is no nonzero P ∈ R[y_1, . . . , y_m] with P(f_1, . . . , f_m) ≡ 0.

Definition: For U ⊆ R[x], the transcendence degree trdeg(U) is the maximum number of algebraically independent polynomials in U.

Answer: Suppose trdeg(A) = trdeg(B). If x is "generic" then the values {a(x) : a ∈ A} determine a finite number of possibilities for the entire collection {b(x) : b ∈ B}.

I "Generic": x lies in a particular full-measure set

31 / 39
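A tiny worked example of these definitions (mine, not from the slides): f_1 = x_1 + x_2, f_2 = x_1 x_2 and f_3 = x_1² + x_2² are algebraically dependent, because the nonzero polynomial P(y_1, y_2, y_3) = y_1² − 2y_2 − y_3 vanishes identically when the f_i are plugged in; with only two variables, trdeg({f_1, f_2, f_3}) = 2.

```python
import sympy as sp

# Verify the algebraic relation P(f1, f2, f3) = 0 for P(y1, y2, y3) = y1^2 - 2*y2 - y3.
x1, x2, y1, y2, y3 = sp.symbols("x1 x2 y1 y2 y3")
f1, f2, f3 = x1 + x2, x1 * x2, x1**2 + x2**2

P = y1**2 - 2 * y2 - y3
print(sp.expand(P.subs({y1: f1, y2: f2, y3: f3})))   # prints 0: f1, f2, f3 are dependent
```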

Page 154: Statistical Estimation in the Presence of Group Actions

How to test algebraic independence?

This is actually easy!

Theorem (Jacobian criterion): Polynomials f_1, . . . , f_m ∈ R[x_1, . . . , x_p] are algebraically independent if and only if the m × p Jacobian matrix J_ij = ∂f_i/∂x_j has full row rank. (Still true if you evaluate J at a generic point x.)

I Why: Tests whether the map (x_1, . . . , x_p) ↦ (f_1(x), . . . , f_m(x)) is locally surjective

32 / 39
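The criterion is a one-liner to apply in practice. A sketch, reusing the three polynomials from the dependence example above (a fixed rational evaluation point stands in for a "generic" point, which a random choice gives with probability 1):

```python
import sympy as sp

# Jacobian criterion: the rank of J_ij = d f_i / d x_j at a generic point equals the
# maximum number of algebraically independent polynomials among f_1, ..., f_m.

x1, x2 = sp.symbols("x1 x2")
fs = [x1 + x2, x1 * x2, x1**2 + x2**2]

J = sp.Matrix([[sp.diff(f, v) for v in (x1, x2)] for f in fs])    # 3 x 2 Jacobian
point = {x1: sp.Rational(3, 7), x2: sp.Rational(5, 11)}           # a fixed "generic" rational point
print(J.subs(point).rank())                                       # 2: not full row rank, so dependent
```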

Page 158: Statistical Estimation in the Presence of Group …aw128/thesis-slides.pdfI Group theory, representation theory, invariant theory IToday: problems involving group actions I A meeting

Generic list recovery

Our main result is an efficient procedure that takes the problem setup as input (group G and action on V) and outputs the degree d∗ of invariants required for generic list recovery.

I List recovery: output a finite list x^(1), x^(2), . . ., one of which (approximately) lies in the orbit of the true x

I List recovery may be good enough in practice?

Procedure (a numerical sketch for the MRA case follows below):

I Need to test whether R[x]^G_{≤d} determines R[x]^G (generically)

I So need to check if trdeg(R[x]^G_{≤d}) = trdeg(R[x]^G)

I trdeg(R[x]^G) is easy: dim(x) − dim(orbit)

I trdeg(R[x]^G_{≤d}) via Jacobian criterion

33 / 39
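
Here is a rough numerical rendering of this procedure for the MRA case (cyclic shifts of a length-p signal). The enumeration of the degree-≤d invariants as shift-averaged monomials, the toy value p = 5, and the helper names are my own choices for illustration; d∗ is read off as the first d at which the numeric transcendence degree reaches trdeg(R[x]^G) = dim(x) − dim(orbit) = p (dim(orbit) = 0 for a finite group).

import itertools
import numpy as np
from sympy import symbols, Matrix, lambdify

def mra_invariants_up_to(d, p):
    # Shift-averaged monomials of degree <= d: for each offset tuple
    # (0, j_2, ..., j_k), the invariant  sum_i x_i x_{i+j_2} ... x_{i+j_k}.
    xs = symbols(f'x0:{p}')
    invs = []
    for k in range(1, d + 1):
        for offs in itertools.combinations_with_replacement(range(p), k - 1):
            f = 0
            for i in range(p):
                term = xs[i]
                for j in offs:
                    term = term * xs[(i + j) % p]
                f = f + term
            invs.append(f)
    return list(xs), invs

def numeric_trdeg(fs, xs, seed=0):
    # Transcendence degree = rank of the Jacobian at a generic (random) point.
    J = lambdify(xs, Matrix(fs).jacobian(xs), 'numpy')
    pt = np.random.default_rng(seed).standard_normal(len(xs))
    return np.linalg.matrix_rank(np.array(J(*pt), dtype=float))

p = 5   # toy signal length; trdeg(R[x]^G) = p here
for d in (1, 2, 3):
    xs, invs = mra_invariants_up_to(d, p)
    print(d, numeric_trdeg(invs, xs))   # reaches 5 only at d = 3, so d* = 3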

Comments:

I E.g. for MRA (cyclic shifts), need to test each p separately on a computer

I Not an efficient algorithm to solve any particular instance

I There is also an algorithm to bound the size of the list (or test for unique recovery), but it is not efficient (Gröbner bases; a toy example follows below)

34 / 39
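
As a toy illustration of the Gröbner-basis route mentioned in the last comment, take the smallest MRA instance p = 2, where the cyclic shift simply swaps the two coordinates; solving the invariant equations for some hypothetical target values (the values below are made up) returns exactly the orbit, i.e. a list of size 2.

# Minimal sketch, assuming sympy; toy case only, not the thesis code.
from sympy import symbols, groebner, solve

x1, x2 = symbols('x1 x2')

m  = x1 + x2            # degree-1 invariant of the swap action
a0 = x1**2 + x2**2      # degree-2 invariant ("power spectrum" type)

# Hypothetical invariant values, e.g. those produced by the signal x = (1, 3):
eqs = [m - 4, a0 - 10]

print(groebner(eqs, x1, x2, order='lex'))   # triangular form of the system
print(solve(eqs, [x1, x2]))                 # the orbit {(1, 3), (3, 1)}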

Generalized orbit recovery problem

Extensions:

I Projection (e.g. cryo-EM):
  I Observe yi = Π(gi · x) + εi
  I Π : V → W linear
  I εi ∼ N(0, σ^2 I)

I Heterogeneity (a simulation sketch of this observation model follows below):
  I K signals x^(1), . . . , x^(K)
  I Mixing weights (w1, . . . , wK) ∈ ∆_K
  I Observe yi = Π(gi · x^(ki)) + εi
  I ki ∈ {1, . . . , K} drawn according to w

Same methods apply!

I Order-d moments now only give access to a particular subspace of R[x]^G

I For heterogeneity, work over a bigger group G^K acting on (x^(1), . . . , x^(K)) ∈ V^⊕K

35 / 39
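
A minimal simulation sketch of this generalized observation model, specialized to heterogeneous MRA with a coordinate projection; all sizes, the weights, and the choice of Π below are illustrative assumptions rather than values from the slides.

import numpy as np

rng = np.random.default_rng(0)
p, q, K, n, sigma = 20, 10, 2, 1000, 1.0

X = rng.standard_normal((K, p))       # unknown signals x^(1), ..., x^(K)
w = np.array([0.7, 0.3])              # mixing weights in the simplex Delta_K
Pi = np.eye(q, p)                     # linear projection Pi : V -> W (keep first q coords)

ks = rng.choice(K, size=n, p=w)       # k_i drawn according to w
gs = rng.integers(p, size=n)          # g_i = uniform random cyclic shift
Y = np.stack([Pi @ np.roll(X[k], g) + sigma * rng.standard_normal(q)
              for k, g in zip(ks, gs)])
print(Y.shape)                        # (n, q): observations y_i = Pi(g_i . x^(k_i)) + eps_i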

Results: cryo-EM

Our methods show that for cryo-EM, generic list recovery is possible at degree 3

So the information-theoretic sample complexity is Θ(σ^6)

Ongoing work: polynomial-time algorithm for cryo-EM

36 / 39

Efficient recovery: tensor decomposition

Restrict to finite group

Recall: with O(σ^6) samples, can estimate the third moment:

T3(x) = ∑_{g∈G} (g · x)^⊗3

This is an instance of tensor decomposition: given ∑_{i=1}^m a_i^⊗3 for some a_1, . . . , a_m ∈ R^p, recover the a_i

For MRA: since m ≤ p (“undercomplete”), can apply Jennrich's algorithm to decompose the tensor efficiently [1] (a generic sketch of the algorithm follows below)

[1] Perry, Weed, Bandeira, Rigollet, Singer, The sample complexity of multi-reference alignment, 2017

37 / 39
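
Below is a minimal sketch of Jennrich's simultaneous-diagonalization algorithm for an undercomplete symmetric 3-tensor. This is a generic textbook version rather than the exact routine used in [1]; the random-slice choice, the scale-fitting step, and all names are my own illustrative assumptions.

import numpy as np

def jennrich(T, m, seed=0):
    # Recover a_1,...,a_m (up to permutation) from T = sum_i a_i (x) a_i (x) a_i,
    # assuming the a_i are linearly independent (m <= p).
    p = T.shape[0]
    rng = np.random.default_rng(seed)
    u, v = rng.standard_normal(p), rng.standard_normal(p)
    Tu = np.einsum('ijk,k->ij', T, u)            # slice T along u
    Tv = np.einsum('ijk,k->ij', T, v)            # slice T along v
    # Eigenvectors of Tu Tv^+ are the directions of the a_i (whp over u, v).
    eigvals, eigvecs = np.linalg.eig(Tu @ np.linalg.pinv(Tv))
    idx = np.argsort(-np.abs(eigvals))[:m]
    B = np.real(eigvecs[:, idx])                 # unit-norm component directions
    # Fit the scales: T = sum_i c_i * b_i^{(x)3}, then a_i = cbrt(c_i) * b_i.
    G = np.stack([np.einsum('i,j,k->ijk', b, b, b).ravel() for b in B.T], axis=1)
    c, *_ = np.linalg.lstsq(G, T.ravel(), rcond=None)
    return B * np.cbrt(c)

# Quick check on a random undercomplete instance (p = 8, m = 4):
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 4))
T = np.einsum('ir,jr,kr->ijk', A, A, A)
A_hat = jennrich(T, m=4)
print(np.round(sorted(np.linalg.norm(A, axis=0)), 3))
print(np.round(sorted(np.linalg.norm(A_hat, axis=0)), 3))   # should match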

Example: heterogeneous MRA

MRA with multiple signals x^(1), . . . , x^(K)

Td(x) = ∑_{k=1}^K ∑_{g∈G} (g · x^(k))^⊗d

Jennrich's algorithm works if given the 5th moment: n = O(σ^10) [1]

Information-theoretically, 3rd moment suffices if K ≤ p/6

If the signals x^(k) are random (i.i.d. Gaussian), it is conjectured that efficient recovery is possible from the 3rd moment iff K ≤ √p [2]

New result (with A. Moitra): if K ≤ √p/polylog(p) then for random signals, efficient recovery is possible from the 3rd moment
I Based on random overcomplete 3-tensor decomposition [3]

[1] Perry, Weed, Bandeira, Rigollet, Singer ’17

[2] Boumal, Bendory, Lederman, Singer ’17

[3] Ma, Shi, Steurer ’16

38 / 39
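
For concreteness, a small sketch of the target tensor T3 in the heterogeneous case (sizes and signals below are illustrative): with K > 1 the tensor is a sum of K·p > p rank-one terms, which is what makes the decomposition overcomplete.

import numpy as np

rng = np.random.default_rng(0)
p, K = 9, 3                                  # illustrative sizes, K ~ sqrt(p)
X = rng.standard_normal((K, p))              # random signals x^(1), ..., x^(K)

# All shifted copies g . x^(k): K*p rank-one components in total.
shifts = np.stack([np.roll(X[k], g) for k in range(K) for g in range(p)])
T3 = np.einsum('ri,rj,rk->ijk', shifts, shifts, shifts)
print(T3.shape, shifts.shape[0])             # (9, 9, 9) built from 27 > 9 components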

Acknowledgements

I Ankur Moitra

I Michel Goemans

I Philippe Rigollet

I Afonso Bandeira

I Collaborators

I Family

I Thank you!

39 / 39
