
Page 1: Sparse Recovery and Fourier Sampling

Sparse Recovery and Fourier Sampling

Eric Price

MIT

Eric Price (MIT) Sparse Recovery and Fourier Sampling 1 / 37

Page 2: The Fourier Transform

The Fourier Transform

Conversion between the time and frequency domains.

[Figure: time domain (displacement of air for a Concert A tone) ←Fourier Transform→ frequency domain (a single peak).]

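The time/frequency picture above is easy to reproduce numerically. A minimal sketch using numpy (the 440 Hz Concert A is from the slide; the sample rate and one-second duration are illustrative assumptions):

```python
import numpy as np

# Illustrative parameters (not from the slides): one second of audio at 8 kHz.
rate = 8000
t = np.arange(rate) / rate

# "Displacement of air" for Concert A: a pure 440 Hz tone.
x = np.sin(2 * np.pi * 440 * t)

# Frequency domain: the DFT concentrates all the energy at +/- 440 Hz.
X = np.fft.fft(x)
freqs = np.fft.fftfreq(len(x), d=1 / rate)
peak = freqs[np.argmax(np.abs(X))]
assert abs(peak) == 440.0
```

With a one-second window the FFT bins land on integer frequencies, so the peak bin is exactly 440 Hz.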

Page 3: The Fourier Transform is Ubiquitous

The Fourier Transform is Ubiquitous

Audio, Video, Medical Imaging, Radar, GPS, Oil Exploration


Page 4: Computing the Discrete Fourier Transform

Computing the Discrete Fourier Transform

How do we compute x̂ = Fx?

Naive multiplication: O(n^2).

Fast Fourier Transform: O(n log n) time. [Cooley-Tukey, 1965]

[T]he method greatly reduces the tediousness of mechanical calculations.

– Carl Friedrich Gauss, 1805

By hand: 22 n log n seconds. [Danielson-Lanczos, 1942]

Can we do better?

When can we compute the Fourier Transform in sublinear time?

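The two running times above can be sanity-checked against each other. A hedged sketch: the naive method multiplies by the full n × n Fourier matrix, while numpy's `fft` stands in for the Cooley-Tukey FFT (numpy's unnormalized sign convention):

```python
import numpy as np

def naive_dft(x):
    # O(n^2): build the full Fourier matrix and multiply.
    n = len(x)
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    F = np.exp(-2j * np.pi * i * j / n)
    return F @ x

x = np.random.randn(256)
# Same answer, very different cost: O(n^2) vs O(n log n).
assert np.allclose(naive_dft(x), np.fft.fft(x))
```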


Page 12: Idea: Leverage Sparsity

Idea: Leverage Sparsity

Often the Fourier transform is dominated by a small number of peaks:

[Figure: a time signal and its frequency spectrum, shown both exactly sparse and approximately sparse.]

Sparsity is common:

Audio, Video, Medical Imaging, Radar, GPS, Oil Exploration

Goal of this work: a sparse Fourier transform, i.e. a faster Fourier transform on sparse data.



Page 15: Talk Outline

Talk Outline

1. Sparse Fourier Transform
   - Overview
   - Technical Details

2. Beyond: Sparse Recovery / Compressive Sensing
   - Overview
   - Adaptivity
   - Conclusion



Page 18: My Contributions

My Contributions

Goal: Compute the Fourier transform x̂ = Fx when x̂ is k-sparse.

Theory:
- The fastest algorithm for Fourier transforms of sparse data.
- The only algorithms faster than FFT for all k = o(n).

Practice:
- Implementation is faster than FFTW for a wide range of inputs.
- Orders of magnitude faster than previous sparse Fourier transforms.
- Useful in multiple applications.



Page 20: Applications of Ideas

Applications of ideas
http://groups.csail.mit.edu/netmit/sFFT/workshop.html

GPS [HAKI]: 2× faster
Spectrum sensing [HSAHK]: 6× lower sampling rate
Dense FFT over clusters [TPKP]: 2× faster
...




Page 25: Theoretical Results

Theoretical Results

For a signal of size n with k large frequencies:

First on the Boolean cube [GL89, KM92, L93]

Adapted to complexes [Mansour ’92, GGIMS02, AGS03, GMS05, Iwen ’10, Akavia ’10]
- All take at least k log^4 n time.
- Only better than FFT if k ≲ n / log^3 n.

Our results [HIKP12a, HIKP12b]
- Exactly k-sparse: O(k log n)
  - Optimal if FFT is optimal.
- Approximately k-sparse: O(k log(n/k) log n), with

  ‖result − x̂‖_2 ≤ (1 + ε) · min over k-sparse x̂^(k) of ‖x̂^(k) − x̂‖_2

- Better than FFT for any k = o(n).

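The benchmark in the approximate guarantee is concrete: the best k-sparse approximation of x̂ keeps its k largest-magnitude entries, so the minimum over k-sparse x̂^(k) of ‖x̂^(k) − x̂‖_2 is just the l2 mass of the tail. A small sketch with illustrative values:

```python
import numpy as np

def best_ksparse_error(X, k):
    # min over k-sparse X(k) of ||X(k) - X||_2: keeping the k largest
    # entries leaves exactly the l2 mass of the remaining "tail".
    tail = np.sort(np.abs(X))[:-k] if k > 0 else np.abs(X)
    return np.linalg.norm(tail)

X = np.zeros(8)
X[[1, 4]] = [10.0, -7.0]
assert best_ksparse_error(X, 2) == 0.0   # exactly 2-sparse: zero benchmark

X[6] = 0.5                               # now only approximately 2-sparse
assert best_ksparse_error(X, 2) == 0.5   # tail = the one small entry
```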


Page 32: Discrete Fourier Transform (DFT) Definition

Discrete Fourier Transform (DFT) Definition

Given x ∈ C^n, compute the Fourier transform x̂:

x̂_i = (1/n) Σ_j ω^{-ij} x_j, for ω = e^{τi/n}

x̂ = Fx for F_ij = ω^{-ij} / n

Inverse transform is almost identical:

x_i = Σ_j ω^{ij} x̂_j

- ω → ω^{-1}, rescale

Lots of nice properties:

- Convolution ←→ Multiplication

(where τ is the circle constant 6.283...)

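The slide's convention (the 1/n on the forward map, ω built from the circle constant τ = 2π) differs from numpy's, which puts the 1/n on the inverse; a sketch reconciling the two and checking the convolution ←→ multiplication property:

```python
import numpy as np

tau = 2 * np.pi                       # the circle constant, 6.283...
n = 16
omega = np.exp(tau * 1j / n)
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")

F = omega ** (-i * j) / n             # F_ij = omega^{-ij} / n, as on the slide
x = np.random.randn(n)
x_hat = F @ x                         # slide convention: 1/n on the forward map
assert np.allclose(x_hat, np.fft.fft(x) / n)   # numpy puts the 1/n on ifft instead

# Inverse is "almost identical": omega -> omega^{-1}, drop the scale.
F_inv = omega ** (i * j)
assert np.allclose(F_inv @ x_hat, x)

# Convolution <-> multiplication (stated in numpy's unnormalized convention):
a, b = np.random.randn(n), np.random.randn(n)
circ = np.array([sum(a[m] * b[(k - m) % n] for m in range(n)) for k in range(n)])
assert np.allclose(np.fft.fft(circ), np.fft.fft(a) * np.fft.fft(b))
```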


Page 39: Algorithm

Algorithm

Simpler case: x̂ is exactly k-sparse.

Theorem. We can compute x̂ in O(k log n) expected time.

Still kind of hard.

Simplest case: x̂ is exactly 1-sparse.

Lemma. We can compute a 1-sparse x̂ in O(1) time.



Page 44: Algorithm for k = 1

Algorithm for k = 1

Lemma. We can compute a 1-sparse x̂ in O(1) time.

[Figure: x̂ is a single spike of height a at frequency t.]

x̂_i = a if i = t, 0 otherwise.

Then x = (a, aω^t, aω^{2t}, aω^{3t}, ..., aω^{(n-1)t}).

x_0 = a, x_1 = aω^t

x_1 / x_0 = ω^t ⟹ t.

(Related to OFDM, Prony's method, matrix pencil.)

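The two-sample trick above can be sketched directly. Under the slide's convention the time signal of a 1-sparse spectrum is x = (a, aω^t, aω^{2t}, ...), so two samples suffice; noise handling in the full algorithm is more careful than this, and the spike location and value here are illustrative:

```python
import numpy as np

n = 1024
t_true, a_true = 137, 2.5 - 1.0j      # illustrative spike location and value
omega = np.exp(2j * np.pi / n)

x0 = a_true                           # x_0 = a
x1 = a_true * omega ** t_true         # x_1 = a * omega^t

# x_1 / x_0 = omega^t, so the angle of the ratio pins down t; x_0 gives a.
t = round(np.angle(x1 / x0) / (2 * np.pi) * n) % n
a = x0
assert (t, a) == (t_true, a_true)
```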


Page 51: Algorithm for General k

Algorithm for general k

Reduce general k to k = 1.

"Filters": partition frequencies into O(k) buckets.
- Sample from the time domain of each bucket with O(log n) overhead.
- Each bucket is recovered by the k = 1 algorithm.

Most frequencies land alone in their bucket, thanks to a random permutation.

[Diagram: x → Permute → Filters (O(k) buckets) → 1-sparse recovery on each bucket → x̂′]

Recovers most of x̂:

Lemma (Partial sparse recovery). In O(k log n) expected time, we can compute an estimate x̂′ such that x̂ − x̂′ is (k/2)-sparse.

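A toy version of the bucketing step, hedged: the real algorithm uses carefully designed filters plus a random permutation, while this sketch uses plain time-domain subsampling, which aliases the spectrum into B buckets mod B, then runs the k = 1 trick inside each bucket. It succeeds only when the large frequencies happen to land in distinct buckets, which is exactly the job the random permutation does in the real algorithm. Conventions are numpy's (x = ifft(X)); all values below are illustrative:

```python
import numpy as np

def toy_bucketed_recovery(x, B):
    # Subsampling x by n/B aliases the spectrum: bucket f of the length-B FFT
    # holds (B/n) * sum of X_t over t = f (mod B).
    n = len(x)
    L = n // B
    Y0 = np.fft.fft(x[::L])
    # A shifted subsampling rotates each X_t in its bucket by e^{2 pi i t / n}.
    Y1 = np.fft.fft(np.roll(x, -1)[::L])
    X = np.zeros(n, dtype=complex)
    for f in range(B):
        if abs(Y0[f]) < 1e-9:                     # empty bucket
            continue
        # k = 1 recovery inside the bucket: the ratio reveals the frequency.
        t = round(np.angle(Y1[f] / Y0[f]) / (2 * np.pi) * n) % n
        X[t] = Y0[f] * n / B
    return X

# A 3-sparse spectrum whose frequencies fall in distinct buckets mod B = 16:
n, B = 1024, 16
X_true = np.zeros(n, dtype=complex)
X_true[[3, 100, 517]] = [5.0, 2.0 - 1.0j, -4.0]
x = np.fft.ifft(X_true)
assert np.allclose(toy_bucketed_recovery(x, B), X_true)
```

Each call costs two length-B FFTs over O(B) time samples; iterating on the residual is what yields the partial sparse recovery lemma.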


Page 58: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for general k

Reduce general k to k = 1.“Filters”: partition frequencies intoO(k) buckets.

I Sample from time domain of eachbucket with O(log n) overhead.

I Recovered by k = 1 algorithm

Most frequencies alone in bucket.

Random permutation

Permute Filters O(k)

1-sparse recovery

1-sparse recovery

1-sparse recovery

1-sparse recovery

x x ′

Recovers most of x :

Lemma (Partial sparse recovery)In O(k log n) expected time, we can compute an estimate x ′ such thatx − x ′ is k/2-sparse.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 15 / 37

Page 59: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for general k

Reduce general k to k = 1.“Filters”: partition frequencies intoO(k) buckets.

I Sample from time domain of eachbucket with O(log n) overhead.

I Recovered by k = 1 algorithm

Most frequencies alone in bucket.

Random permutation

Permute Filters O(k)

1-sparse recovery

1-sparse recovery

1-sparse recovery

1-sparse recovery

x x ′

Recovers most of x :

Lemma (Partial sparse recovery)In O(k log n) expected time, we can compute an estimate x ′ such thatx − x ′ is k/2-sparse.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 15 / 37

Page 60: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for general k

Reduce general k to k = 1.“Filters”: partition frequencies intoO(k) buckets.

I Sample from time domain of eachbucket with O(log n) overhead.

I Recovered by k = 1 algorithm

Most frequencies alone in bucket.

Random permutation

Permute Filters O(k)

1-sparse recovery

1-sparse recovery

1-sparse recovery

1-sparse recovery

x x ′

Recovers most of x :

Lemma (Partial sparse recovery)In O(k log n) expected time, we can compute an estimate x ′ such thatx − x ′ is k/2-sparse.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 15 / 37

Page 61: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for general k

Reduce general k to k = 1.“Filters”: partition frequencies intoO(k) buckets.

I Sample from time domain of eachbucket with O(log n) overhead.

I Recovered by k = 1 algorithm

Most frequencies alone in bucket.Random permutation

Permute Filters O(k)

1-sparse recovery

1-sparse recovery

1-sparse recovery

1-sparse recovery

x x ′

Recovers most of x :

Lemma (Partial sparse recovery)In O(k log n) expected time, we can compute an estimate x ′ such thatx − x ′ is k/2-sparse.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 15 / 37

Page 62: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for general k

Reduce general k to k = 1.“Filters”: partition frequencies intoO(k) buckets.

I Sample from time domain of eachbucket with O(log n) overhead.

I Recovered by k = 1 algorithm

Most frequencies alone in bucket.Random permutation

Permute Filters O(k)

1-sparse recovery

1-sparse recovery

1-sparse recovery

1-sparse recovery

x x ′

Recovers most of x :

Lemma (Partial sparse recovery)In O(k log n) expected time, we can compute an estimate x ′ such thatx − x ′ is k/2-sparse.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 15 / 37

Page 63: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for general k

Reduce general k to k = 1.“Filters”: partition frequencies intoO(k) buckets.

I Sample from time domain of eachbucket with O(log n) overhead.

I Recovered by k = 1 algorithm

Most frequencies alone in bucket.Random permutation

Permute Filters O(k)

1-sparse recovery

1-sparse recovery

1-sparse recovery

1-sparse recovery

x x ′

Recovers most of x :

Lemma (Partial sparse recovery)In O(k log n) expected time, we can compute an estimate x ′ such thatx − x ′ is k/2-sparse.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 15 / 37

Page 64: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for general k

Reduce general k to k = 1.“Filters”: partition frequencies intoO(k) buckets.

I Sample from time domain of eachbucket with O(log n) overhead.

I Recovered by k = 1 algorithm

Most frequencies alone in bucket.Random permutation

Permute Filters O(k)

1-sparse recovery

1-sparse recovery

1-sparse recovery

1-sparse recovery

x x ′

Recovers most of x :

Lemma (Partial sparse recovery)In O(k log n) expected time, we can compute an estimate x ′ such thatx − x ′ is k/2-sparse.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 15 / 37

Page 65: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Overall outline

[Diagram: x → Permute → Filters (O(k) buckets) → 1-sparse recovery on each bucket → Partial k-sparse recovery → x̂′]

Lemma (Partial sparse recovery)
In O(k log n) expected time, we can compute an estimate x̂′ such that
x̂ − x̂′ is k/2-sparse.

Repeat on the residual, with k → k/2 → k/4 → · · ·
The total cost is a geometric series, so it stays O(k log n).

Theorem
We can compute x̂ in O(k log n) expected time.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 16 / 37


How can you isolate frequencies?

Time → Frequency:
  - n-dimensional DFT, O(n log n): x → x̂.
  - n-dimensional DFT of the first k terms, O(n log n): x · rect → x̂ ∗ sinc.
  - B-dimensional DFT of the first k terms, O(B log B), where B = O(k) is the number of buckets: alias(x · rect) → subsample(x̂ ∗ sinc).

Eric Price (MIT) Sparse Recovery and Fourier Sampling 17 / 37
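The last step — that aliasing in time corresponds to subsampling in frequency — can be checked directly in numpy; the parameters here are illustrative:

```python
import numpy as np

# Summing the n/B length-B blocks of x in time ("aliasing") and taking a
# B-point DFT gives every (n/B)-th coefficient of the full n-point DFT.
n, B = 32, 8                                 # n must be a multiple of B
rng = np.random.default_rng(0)
x = rng.standard_normal(n)

y = x.reshape(n // B, B).sum(axis=0)         # alias x down to length B: O(n) additions
Y = np.fft.fft(y)                            # B-point DFT: O(B log B)
X = np.fft.fft(x)                            # full n-point DFT, for comparison only
assert np.allclose(Y, X[:: n // B])          # Y[k] == X[k * (n/B)]
```

This is why the third bullet costs only O(B log B) once the B time samples are in hand.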


The issue

We want to isolate frequencies, but the sinc filter "leaks":
contamination from other buckets.

We introduce a better filter: a Gaussian (or a prolate spheroidal
sequence) convolved with a rectangle.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 18 / 37
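A quick numerical comparison of the two filter shapes, with assumed sizes (n = 1024, B = 16 buckets) and a Gaussian width chosen for illustration: the boxcar's sinc response leaks polynomially far from its bucket, while a rectangle convolved with a Gaussian decays to numerical noise.

```python
import numpy as np

n, B = 1024, 16
width = n // (2 * B)                         # bucket = frequencies within `width` of 0
f = np.arange(n)
dist = np.minimum(f, n - f)                  # circular distance from frequency 0

# sinc-style response: |DFT of a length-(n/B) boxcar in time|, normalized.
box = np.zeros(n)
box[: n // B] = 1.0
sinc = np.abs(np.fft.fft(box))
sinc /= sinc.max()

# Better filter: ideal bucket (rect) circularly convolved with a Gaussian.
rect = (dist < width).astype(float)
gauss = np.exp(-0.5 * (dist / (width / 4)) ** 2)
gauss /= gauss.sum()
flat = np.abs(np.fft.ifft(np.fft.fft(rect) * np.fft.fft(gauss)))

far = dist > 4 * width                       # well away from the bucket
assert flat[far].max() < 1e-6 < sinc[far].max()
```

The sinc sidelobes far from the bucket are still around a few percent, while the convolved filter's tails fall off like a Gaussian — that is the leakage the slide is fixing.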


Algorithm for exactly sparse signals

[Figure: the original signal x̂; the computed F · x, aliased to k terms; the resulting samples of F̂ ∗ x̂ — our knowledge about x̂]

Lemma
If t is isolated in its bucket and in the "super-pass" region, the value b
we compute for its bucket satisfies b = x̂_t.

Computing the b for all O(k) buckets takes O(k log n) time.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 19 / 37


Algorithm for exactly sparse signals

Lemma
For most t, the value b we compute for its bucket satisfies b = x̂_t.

Computing the b for all O(k) buckets takes O(k log n) time.

Time-shift x by one and repeat: b′ = x̂_t ω^t.
Divide to get b′/b = ω^t =⇒ can compute t.
  - Just like our 1-sparse recovery algorithm, x_1/x_0 = ω^t.
Gives partial sparse recovery: x̂′ such that x̂ − x̂′ is k/2-sparse.

[Diagram: x → Permute → Filters (O(k) buckets) → 1-sparse recovery on each bucket → x̂′]

Repeat k → k/2 → k/4 → · · ·
O(k log n) time sparse Fourier transform.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 20 / 37
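The bucket-plus-time-shift recovery above can be sketched for the exactly sparse case. This sketch skips the permutation and the filter (the frequencies are chosen to land in distinct buckets), so it shows only the aliasing and the b′/b = ω^t step:

```python
import numpy as np

n, B = 256, 16
s = n // B
rng = np.random.default_rng(1)

freqs = [3, 21, 40, 70]                    # distinct mod B, so every bucket is isolated
amps = rng.standard_normal(4) + 1j * rng.standard_normal(4)
Xhat = np.zeros(n, dtype=complex)
Xhat[freqs] = amps
x = np.fft.ifft(Xhat)                      # time-domain signal with 4-sparse spectrum

def bucket_values(shift):
    """B bucket values from B time samples x[shift], x[shift+s], ..."""
    z = x[(shift + s * np.arange(B)) % n]
    return np.fft.fft(z) * (n / B)         # an isolated bucket equals Xhat[f] * omega^(f*shift)

b0, b1 = bucket_values(0), bucket_values(1)   # unshifted and shifted-by-one
recovered = {}
for bucket in range(B):
    if abs(b0[bucket]) < 1e-9:
        continue                           # empty bucket
    # b1/b0 = omega^t for the lone frequency t in this bucket
    t = int(round(n * np.angle(b1[bucket] / b0[bucket]) / (2 * np.pi))) % n
    recovered[t] = b0[bucket]
```

Here `recovered` maps each hidden frequency to its coefficient, using only 2B = O(k) time samples per round.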


Algorithm for approximately sparse signals

What changes with noise? The architecture is identical:

[Diagram: x → Permute → Filters (O(k) buckets) → 1-sparse recovery on each bucket → Partial sparse recovery → x̂′]

It just requires robust 1-sparse recovery.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 21 / 37


Algorithm for approximately sparse signals: k = 1

Lemma
Suppose x̂ is approximately 1-sparse: |x̂_t| / ‖x̂‖_2 > 90%.
Then we can recover it with O(log n) samples and O(log² n) time.

The measurements are x_c / x_0 = ω^{ct} + noise.

With exact sparsity: log n bits in a single measurement.
With noise: only a constant number of useful bits.
Choose Θ(log n) time shifts c to recover t.
An error-correcting code with efficient recovery =⇒ Lemma.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 22 / 37
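A sketch of the "constant useful bits per measurement" idea: each noisy measurement of ω^{ct} is decoded to a single sign, i.e. one bit of t, so log₂ n shifts locate t. The particular shifts c = n/2^{j+1} and the sign decoder below are illustrative choices, not the paper's error-correcting code:

```python
import numpy as np

def locate(sample, n):
    """Recover t from log2(n) measurements of omega^{c t}; n a power of two."""
    t = 0
    for j in range(int(np.log2(n))):
        m = sample(n >> (j + 1))                       # ~ omega^{(n/2^{j+1}) t}
        m /= np.exp(2j * np.pi * t / (1 << (j + 1)))   # cancel the bits found so far
        t += (0 if m.real > 0 else 1) << j             # decode one sign = one bit
    return t

n, t_true = 1024, 357
omega = np.exp(2j * np.pi / n)
rng = np.random.default_rng(2)
# Noisy measurement oracle: omega^{c t} plus complex Gaussian noise.
noisy = lambda c: omega ** (c * t_true) + 0.1 * (rng.standard_normal()
                                                 + 1j * rng.standard_normal())
t_hat = locate(noisy, n)
```

Each sign decision survives noise of magnitude well below 1, which is the "constant number of useful bits" per sample.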

Page 102: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for approximately sparse signals: k = 1LemmaSuppose x is approximately 1-sparse:

|xt |/‖x‖2 > 90%.

Then we can recover it with O(log n) samples and O(log2 n) time.

x1/x0 = ωt

With exact sparsity: log n bits in a single measurement.

With noise: only constant number of useful bits.Choose Θ(log n) time shifts c to recover i .Error correcting code with efficient recovery =⇒ Lemma.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 22 / 37

Page 103: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for approximately sparse signals: k = 1LemmaSuppose x is approximately 1-sparse:

|xt |/‖x‖2 > 90%.

Then we can recover it with O(log n) samples and O(log2 n) time.

x1/x0 = ωt + noise

With exact sparsity: log n bits in a single measurement.With noise: only constant number of useful bits.

Choose Θ(log n) time shifts c to recover i .Error correcting code with efficient recovery =⇒ Lemma.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 22 / 37

Page 104: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

Algorithm for approximately sparse signals: k = 1LemmaSuppose x is approximately 1-sparse:

|xt |/‖x‖2 > 90%.

Then we can recover it with O(log n) samples and O(log2 n) time.

x1/x0 = ωt + noise

With exact sparsity: log n bits in a single measurement.With noise: only constant number of useful bits.Choose Θ(log n) time shifts c to recover i .

Error correcting code with efficient recovery =⇒ Lemma.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 22 / 37

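The "log n bits in a single measurement" claim for exact sparsity can be sketched directly: when x̂ is exactly 1-sparse with its mass at frequency t, the phase of x_1/x_0 = ω^t reveals t in one measurement pair. A minimal sketch (the function name is ours, not from the thesis):

```python
import cmath
import math

def locate_exact_1sparse(n, t, a=1.0 + 0.5j):
    """With an exactly 1-sparse spectrum (x-hat concentrated at frequency t
    with value a), time samples are x_c = a * omega^(c*t), omega = e^(2*pi*i/n).
    The phase of x_1 / x_0 is 2*pi*t/n, so one measurement pair encodes
    all log n bits of t."""
    omega = cmath.exp(2j * math.pi / n)
    x0 = a                 # x_0 = a * omega^0
    x1 = a * omega ** t    # x_1 = a * omega^t
    angle = cmath.phase(x1 / x0) % (2 * math.pi)
    return round(n * angle / (2 * math.pi)) % n
```

With noise, the phase is only accurate to a constant number of bits, which is exactly why the slide resorts to Θ(log n) shifts decoded as an error-correcting code.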

Page 108:

Algorithm for approximately sparse signals: general k

Lemma: If x̂ is approximately 1-sparse, we can recover it with O(log n) samples and O(log² n) time.

[diagram: x → permute → filter into O(k) buckets → 1-sparse recovery on each bucket → x′]

Reduce k-sparse to 1-sparse on buckets of size n/k, with log n overhead per sample.

Theorem: If x̂ is approximately k-sparse, we can recover it in O(k log(n/k) log n) time.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 23 / 37
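The reduction from k-sparse to 1-sparse can be illustrated crudely by frequency aliasing: subsampling in time folds the n frequencies into B buckets, after which any bucket containing a single heavy coefficient is handled by the k = 1 routine. A toy sketch, with a hypothetical name and a hard bucket boundary in place of the smooth filters the real algorithm uses:

```python
def alias_into_buckets(xhat, B):
    """Fold an n-point spectrum into B buckets: bucket b receives the sum of
    all frequencies f with f % B == b (this is what subsampling by a factor
    of n/B does in the time domain). If xhat is k-sparse and B = O(k), a
    random permutation makes most buckets hold at most one heavy coefficient,
    reducing k-sparse recovery to 1-sparse recovery per bucket."""
    buckets = [0j] * B
    for f, v in enumerate(xhat):
        buckets[f % B] += v
    return buckets
```

For a 1-sparse spectrum, the single coefficient lands alone in its bucket and its bucket value equals the coefficient.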


Page 111:

Empirical performance

Compare to:
▸ FFTW, the "Fastest Fourier Transform in the West"
▸ AAFFT, the [GMS05] sparse Fourier transform.

[Two log-log plots of run time (sec): "Run Time vs Signal Sparsity (N = 2^22)" over sparsity K = 2^6 … 2^18, and "Run Time vs Signal Size (k = 50)" over signal size n = 2^10 … 2^24; series: sFFT 3.0 (Exact), FFTW, AAFFT 0.9.]

Faster than FFTW for a wide range of values.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 24 / 37


Page 115:

Recap of Sparse Fourier Transform

Theory:
▸ The fastest algorithm for Fourier transforms of sparse data.
▸ The only algorithms faster than FFT for all k = o(n).

Practice:
▸ Implementation is faster than FFTW for a wide range of inputs.
▸ Orders of magnitude faster than previous sparse Fourier transforms.
▸ Useful in multiple applications.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 25 / 37


Page 117:

Talk Outline

1 Sparse Fourier Transform
  ▸ Overview
  ▸ Technical Details

2 Beyond: Sparse Recovery / Compressive Sensing
  ▸ Overview
  ▸ Adaptivity
  ▸ Conclusion

Eric Price (MIT) Sparse Recovery and Fourier Sampling 26 / 37

Page 118:

Sparse Recovery / Compressive Sensing

Robustly recover sparse x from linear measurements y = Ax.

[diagram: y = A · x, with A a wide matrix and x sparse]

Sparse Fourier transform: looks at a set S of coordinates (x_S = F⁻¹_S x̂).

Applications: Sparse Fourier, MRI, Single-Pixel Camera, Streaming Algorithms (A(x + Δ) = Ax + AΔ), Genetic Testing.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 27 / 37


Page 125:

My Contributions

▸ Sparse Fourier: minimize time complexity [HIKP12b, HIKP12a]
▸ MRI: minimize Fourier sample complexity [GHIKPS13, IKP14]
▸ Camera: use Earth-Mover Distance metric [IP11, GIP10, GIPR11]
▸ Streaming: improved analysis of Count-Sketch [MP14, PW11, P11]
▸ Genetic testing: first asymptotic gain using adaptivity [IPW11, PW13]

Eric Price (MIT) Sparse Recovery and Fourier Sampling 28 / 37


Page 131:

Adaptive Sparse Recovery Model

▸ Unknown approximately k-sparse vector x ∈ ℝⁿ.
▸ Choose v ∈ ℝⁿ, observe y = ⟨v, x⟩.
▸ Choose another v and repeat as needed.
▸ Output x′ satisfying

  ‖x′ − x‖₂ < (1 + ε) · min_{k-sparse x₍k₎} ‖x − x₍k₎‖₂.

▸ Nonadaptively: Θ(k log(n/k)) measurements are necessary and sufficient. [Candès-Romberg-Tao '06, DIPW '10]
▸ Natural question: does adaptivity help?
  ▸ Studied in [MSW08, JXC08, CHNR08, AWZ08, HCN09, ACD11, ...]
▸ First asymptotic improvement: O(k log log(n/k)) measurements. [IPW '11]

Eric Price (MIT) Sparse Recovery and Fourier Sampling 29 / 37
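The measurement model above — choose v, observe ⟨v, x⟩, then choose the next v based on what was seen — can be illustrated with a noiseless binary search for a 1-sparse x. Note this toy uses O(log n) measurements and only demonstrates the interaction model, not the O(k log log(n/k)) scheme; the name is ours:

```python
def adaptive_binary_search(x):
    """Locate the support of a noiseless 1-sparse x with adaptive linear
    measurements y = <v, x>, where each v is the indicator vector of a
    half-interval chosen based on previous answers."""
    lo, hi = 0, len(x)
    measurements = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        y = sum(x[j] for j in range(lo, mid))  # <v, x> with v = 1 on [lo, mid)
        measurements += 1
        if y != 0:
            hi = mid   # the spike lies in the left half
        else:
            lo = mid   # otherwise it lies in the right half
    return lo, measurements
```

Each v here depends on all earlier observations, which is exactly what the nonadaptive Θ(k log(n/k)) lower bound forbids.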


Page 139:

Applications of Adaptivity

[slide consists of figures]

Eric Price (MIT) Sparse Recovery and Fourier Sampling 30 / 37


Page 146:

Outline of Algorithm

Theorem: Adaptive k-sparse recovery is possible with O(k log log(n/k)) measurements.

[diagram: x → permute → partition into O(k) blocks → 1-sparse recovery on each block → x′]

Suffices to solve for k = 1:

Lemma: Adaptive 1-sparse recovery is possible with O(log log n) measurements.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 31 / 37


Page 150:

1-sparse recovery: non-adaptive lower bound

Lemma: Adaptive 1-sparse recovery is possible with O(log log n) measurements.

Non-adaptive lower bound: why is this hard?
▸ Hard case: x is a random e_i plus Gaussian noise w with ‖w‖₂ ≈ 1.
▸ Robust recovery must locate i.
▸ Observations ⟨v, x⟩ = v_i + ⟨v, w⟩ = v_i + (‖v‖₂/√n) · z, for z ∼ N(0, 1).

Eric Price (MIT) Sparse Recovery and Fourier Sampling 32 / 37
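The hard instance can be generated directly. The function name and the per-coordinate noise scale 1/√n (chosen so that E‖w‖₂² = 1) are our own illustration choices:

```python
import random

def hard_instance(n, seed=None):
    """x = e_i + w for a uniformly random index i, with i.i.d. Gaussian
    noise of variance 1/n per coordinate, so E[||w||_2^2] = 1 as in the
    hard case for non-adaptive 1-sparse recovery."""
    rng = random.Random(seed)
    i = rng.randrange(n)
    x = [rng.gauss(0.0, 1.0 / n ** 0.5) for _ in range(n)]
    x[i] += 1.0  # plant the spike
    return i, x
```

Any robust recovery scheme must locate the planted index i from inner products with such an x.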


Page 155:

1-sparse recovery: non-adaptive lower bound

Observe ⟨v, x⟩ = v_i + (‖v‖₂/√n) · z, where z ∼ N(0, 1).

Shannon 1948: information capacity

I(i; ⟨v, x⟩) ≤ (1/2) · log(1 + SNR),

where SNR denotes the "signal-to-noise ratio,"

SNR = E[signal²] / E[noise²] = E[v_i²] / (‖v‖₂²/n) = 1.

Finding i needs Ω(log n) non-adaptive measurements.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 33 / 37
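The SNR computation can be checked exactly: for a uniformly random index i, E[v_i²] is the average of the squared entries of v, which is ‖v‖₂²/n, so the ratio is 1 for every nonzero v. A small check (the helper name is ours):

```python
def snr_over_random_index(v):
    """For i uniform over [n], E[v_i^2] is the mean of the squared entries,
    i.e. ||v||_2^2 / n, so SNR = E[v_i^2] / (||v||_2^2 / n) = 1 for any
    nonzero v -- no non-adaptive choice of v can do better."""
    n = len(v)
    power = sum(t * t for t in v)        # ||v||_2^2
    e_signal = power / n                 # exact E[v_i^2] for uniform i
    return e_signal / (power / n)
```

Since each measurement then carries (1/2) log(1 + 1) = O(1) bits about i, identifying one of n locations forces Ω(log n) measurements.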


Page 163: Sparse Recovery and Fourier Samplingecprice/slides/thesis.pdf2 Beyond: Sparse Recovery / Compressive Sensing Overview Adaptivity Conclusion Eric Price (MIT) Sparse Recovery and Fourier

1-sparse recovery: changes in adaptive setting

Information capacity

    I(i; 〈v, x〉) ≤ (1/2) log(1 + SNR),

where SNR denotes the "signal-to-noise ratio,"

    SNR = E[v_i²] / (‖v‖_2² / n).

If i is independent of v, this is O(1).
As we learn about i, we can increase the SNR.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 34 / 37
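The adaptive gain can be made concrete with a toy calculation (the uniform-on-S choice of v is an illustrative assumption, not the slides' construction): once i is known to lie in a candidate set S, putting all of v's unit energy on S gives E[v_i²] = 1/|S| and hence SNR = n/|S|, so the capacity bound grows as S shrinks.

```python
import numpy as np

n = 2**20

# If we know i lies in a candidate set S, spend the measurement energy
# there: v_j = 1/sqrt(|S|) for j in S, v_j = 0 otherwise (unit norm).
# Then E[v_i^2] = 1/|S| and ||v||_2^2 / n = 1/n, so
#   SNR = E[v_i^2] / (||v||_2^2 / n) = n / |S|.
for size in [n, n // 4, n // 16, 1024]:
    snr = n / size
    bits = 0.5 * np.log2(1 + snr)  # capacity bound per measurement
    print(f"|S| = {size:>7}  SNR = {snr:>7.0f}  bits <= {bits:.2f}")
```

With the full set S = [n] this recovers the O(1) bits per measurement of the non-adaptive bound; as |S| drops, each measurement can carry more information.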


1-sparse recovery: idea

x = e_i + w; each measurement observes 〈v, x〉 = v_i + 〈v, w〉.

Concentrate v on the current candidate set for i. As the measurements
learn 0, 1, 2, 4, 8, ... bits of i, the candidate set shrinks and the
available SNR grows as 2, 2², 2⁴, 2⁸, 2¹⁶, ..., so each measurement
carries I(i; 〈v, x〉) ≤ log SNR = 1, 2, 4, 8, 16, ... bits: the number
of bits learned doubles with every measurement.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 35 / 37
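The doubling progression gives the measurement count directly; a toy bookkeeping sketch (just counting, not a recovery algorithm) tallies the rounds until all log₂ n bits of i are pinned down:

```python
import math

def adaptive_rounds(n):
    """Count measurements if each round doubles the bits known about i.

    Bits known after each round: 1, 2, 4, 8, ...; stop once all
    log2(n) bits of the index i in [n] are determined. This is the
    bookkeeping behind the O(log log n) bound, not an algorithm.
    """
    target = math.log2(n)
    bits, rounds = 1, 1
    while bits < target:
        bits *= 2
        rounds += 1
    return rounds

for n in [2**8, 2**16, 2**32]:
    print(n, adaptive_rounds(n))
```

Since the known bits double each round, reaching log₂ n bits takes about 1 + log₂ log₂ n measurements, i.e. O(log log n).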


1-sparse recovery

Lemma (IPW11, PW13)
Adaptive 1-sparse recovery takes Θ(log log n) measurements.

[Plots: number of measurements m as a function of n (SNR = 10 dB, k = 1)
and as a function of SNR in dB (n = 8192, k = 1), comparing Gaussian
measurements with L1 minimization against adaptive measurements.]

Gives Θ(k log log(n/k)) k-sparse recovery via general framework.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 36 / 37


Summary

Sparse Fourier transform
  ▶ Fastest algorithm for Fourier transforms on sparse data
  ▶ Already has applications with substantial improvements

Broader sparse recovery theory
  ▶ Sparse Fourier: minimize time complexity [HIKP12]
  ▶ MRI: minimize Fourier sample complexity [GHIKPS13, IKP14]
  ▶ Camera: use Earth-Mover Distance metric [IP11, GIP10, GIPR11]
  ▶ Streaming: improved analysis of Count-Sketch [MP14, PW11, P11]
  ▶ Genetic testing: first asymptotic gain using adaptivity [IPW11, PW13]

Lower bounds
  ▶ Based on Gaussian channel capacity: tight bounds, extensible to adaptive settings.
  ▶ Based on communication complexity: extends to ℓ1 setting.

Thank You

Eric Price (MIT) Sparse Recovery and Fourier Sampling 37 / 37


The Future

Make sparse Fourier applicable to more problems
  ▶ Better sample complexity
  ▶ Incorporate stronger notions of structure

Tight constants in compressive sensing
  ▶ Analogous to channel capacity in coding theory.
  ▶ Lower bound techniques, from information theory, should be strong enough.

Eric Price (MIT) Sparse Recovery and Fourier Sampling 38 / 37
