
Page 1: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

A Single-letter Characterization of Optimal Noisy Compressed Sensing

Dongning Guo

Dror Baron

Shlomo Shamai

Page 2: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Setting
• Replace samples by more general measurements based on a few linear projections (inner products)

[Figure: measurements of a sparse signal; labels: measurements, sparse signal, # non-zeros]

Page 3: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Signal Model
• Signal entry Xn = Bn Un
• i.i.d. Bn ~ Bernoulli(·) (sparse)
• i.i.d. Un ~ PU

[Diagram: Un ~ PU and Bn ~ Bernoulli feed a multiplier, producing Xn ~ PX]
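A minimal Python sketch of drawing a signal from this model; the sparsity parameter 0.1 matches the later simulation slides, and the standard Gaussian PU is an assumption, since the slide leaves PU generic.

import numpy as np

def sample_signal(n, gamma=0.1, rng=None):
    """Draw X_n = B_n * U_n with B_n ~ Bernoulli(gamma) and U_n ~ P_U.

    gamma = 0.1 matches the later slides; the standard Gaussian P_U is an
    assumption, since the slide does not specify P_U."""
    rng = np.random.default_rng(rng)
    b = rng.random(n) < gamma           # i.i.d. Bernoulli support pattern
    u = rng.standard_normal(n)          # i.i.d. draws from the assumed P_U
    return b * u                        # sparse: about gamma * n non-zeros

x = sample_signal(n=1000, gamma=0.1, rng=0)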

Page 4: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Measurement Noise
• Measurement process is typically analog
• Analog systems add noise, non-linearities, etc.

• Assume Gaussian noise for ease of analysis

• Can be generalized to non-Gaussian noise

Page 5: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Noise Model
• Noiseless measurements denoted y0
• Noise
• Noisy measurements
• Unit-norm columns, SNR =
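A minimal Python sketch of this measurement model: noiseless y0 from a matrix with unit-norm columns plus additive Gaussian noise. The dense Gaussian matrix and the noise variance 1/SNR are assumptions, since the slide's exact SNR definition is not legible in this transcript.

import numpy as np

def measure(x, m, snr=10.0, rng=None):
    """Return noisy measurements y = Phi @ x + z and the matrix Phi.

    Phi has i.i.d. Gaussian entries with columns scaled to unit norm; the
    noise variance 1/snr is an assumed convention, not the slide's formula."""
    rng = np.random.default_rng(rng)
    n = x.size
    phi = rng.standard_normal((m, n))
    phi /= np.linalg.norm(phi, axis=0)            # unit-norm columns
    y0 = phi @ x                                   # noiseless measurements
    z = rng.standard_normal(m) / np.sqrt(snr)      # Gaussian measurement noise
    return y0 + z, phi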

Page 6: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

• Model process as measurement channel

• Measurements provide information!

[Diagram: CS measurement corresponds to source encoder + channel encoder, the measurement process acts as the channel, and CS decoding corresponds to channel decoder + source decoder]

Allerton 2006 [Sarvotham, Baron, & Baraniuk]

Page 7: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Single-Letter Bounds
• Theorem [Sarvotham, Baron, & Baraniuk 2006]: for a sparse signal with rate-distortion function R(D), there is a lower bound on the measurement rate in terms of the SNR and the distortion D (a capacity-style sketch follows below)
• Numerous single-letter bounds
  – [Aeron, Zhao, & Saligrama]
  – [Akcakaya & Tarokh]
  – [Rangan, Fletcher, & Goyal]
  – [Gastpar & Reeves]
  – [Wang, Wainwright, & Ramchandran]
  – [Tune, Bhaskaran, & Hanly]
  – …
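The bound itself did not survive this transcript. As a hedged sketch, the standard capacity-style argument behind bounds of this type says that each noisy measurement conveys at most (1/2) log2(1 + SNR) bits, while describing the length-N signal to distortion D requires N R(D) bits, giving

\[
  M \cdot \tfrac{1}{2}\log_2(1+\mathrm{SNR}) \;\ge\; N\,R(D)
  \qquad\Longrightarrow\qquad
  \frac{M}{N} \;\ge\; \frac{R(D)}{\tfrac{1}{2}\log_2(1+\mathrm{SNR})},
\]

which may differ in constants and conditions from the theorem actually stated on the slide.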

Page 8: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Goal: Precise Single-letter Characterization of Optimal CS

Page 9: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

What Single-letter Characterization?
• Ultimately, what can one say about Xn given Y?
• The channel posterior is a sufficient statistic, but very complicated
• Want a simple characterization of its quality
• Large-system limit: signal length and number of measurements grow large with the measurement rate held fixed

Page 10: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Main Result: Single-letter Characterization
• Result 1: Conditioned on Xn = xn, the observations (Y, Φ) (writing Φ for the measurement matrix) are statistically equivalent to a single scalar Gaussian observation of xn at a degraded SNR that is easy to compute (sketched below)
• Estimation quality from (Y, Φ) is just as good as from this noisier scalar observation

[Figure: the channel posterior and its degradation to a scalar channel]
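The equivalent scalar observation is not legible in this transcript; a hedged sketch of its usual form in this line of work, writing η ∈ (0,1) for the degradation factor of the next slide and assuming a unit-variance Gaussian noise normalization:

\[
  Y_n^{\mathrm{scalar}} \;=\; \sqrt{\eta\,\mathrm{SNR}}\;\, x_n \;+\; W_n ,
  \qquad W_n \sim \mathcal{N}(0,1),
\]

so that, asymptotically, estimating Xn from (Y, Φ) is exactly as good as, and no better than, estimating it from this single degraded scalar measurement.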

Page 11: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Details
• The degradation factor lies in (0,1) and is the fixed point of a scalar equation (a sketch of such an iteration follows below)
• Take-home point: degraded scalar channel
• Non-rigorous owing to the replica method with a symmetry assumption
  – used in CDMA detection [Tanaka 2002, Guo & Verdú 2005]
• Related analysis [Rangan, Fletcher, & Goyal 2009]
  – MMSE estimate (not posterior) using [Guo & Verdú 2005]
  – extended to several CS algorithms, particularly LASSO
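The fixed-point equation is not shown in this transcript. As a hedged sketch of the kind of computation involved, the Python iteration below assumes the Guo–Verdú form 1/η = 1 + (N/M)·SNR·mmse(η·SNR) from the cited CDMA analysis, not necessarily the slide's exact equation, with the scalar MMSE of the Bernoulli-Gaussian prior evaluated by Monte Carlo.

import numpy as np

def scalar_mmse(snr_eff, gamma=0.1, n_mc=200_000, rng=0):
    """Monte Carlo MMSE of X = B*U (B ~ Bern(gamma), U ~ N(0,1))
    observed through Y = sqrt(snr_eff)*X + N(0,1)."""
    rng = np.random.default_rng(rng)
    x = (rng.random(n_mc) < gamma) * rng.standard_normal(n_mc)
    y = np.sqrt(snr_eff) * x + rng.standard_normal(n_mc)
    s = snr_eff
    # posterior probability that the entry is active, given y
    like1 = gamma * np.exp(-0.5 * y**2 / (1 + s)) / np.sqrt(1 + s)
    like0 = (1 - gamma) * np.exp(-0.5 * y**2)
    p1 = like1 / (like1 + like0)
    x_hat = p1 * np.sqrt(s) * y / (1 + s)          # conditional mean E[X | Y]
    return np.mean((x - x_hat) ** 2)

def degradation_factor(snr=10.0, rate=0.5, gamma=0.1, iters=30):
    """Iterate the assumed fixed point 1/eta = 1 + (1/rate)*snr*mmse(eta*snr)."""
    eta = 1.0
    for _ in range(iters):
        eta = 1.0 / (1.0 + (1.0 / rate) * snr * scalar_mmse(eta * snr, gamma))
    return eta

print(degradation_factor(snr=10.0, rate=0.5, gamma=0.1))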

Page 12: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Decoupling

Page 13: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Decoupling Result
• Result 2: In the large-system limit, any arbitrary (constant) L input elements decouple (see the sketch below)
• Take-home point: “interference” from each individual signal entry vanishes
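The decoupling statement's equation is not legible here; a hedged sketch of what such a claim typically looks like, reusing the scalar-channel notation assumed above (the slide's exact statement may differ):

\[
  p\bigl(x_1,\dots,x_L \,\big|\, Y,\Phi\bigr)
  \;\longrightarrow\;
  \prod_{n=1}^{L} p\bigl(x_n \,\big|\, Y_n^{\mathrm{scalar}}\bigr)
  \qquad \text{as } N \to \infty,
\]

i.e., the joint posterior of any fixed set of L entries factorizes into a product of single-entry posteriors, each determined by its own degraded scalar observation.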

Page 14: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Sparse Measurement Matrices

Page 15: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Sparse Measurement Matrices [Baron, Sarvotham, & Baraniuk]

• LDPC-style measurement matrix (sparse)
• Mostly zeros in Φ; nonzeros ~ P
• Each row contains ≈ Nq randomly placed nonzeros
• Fast matrix-vector multiplication, hence fast encoding / decoding

[Figure: sparse measurement matrix]
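A minimal Python sketch of building such a matrix and using its fast matrix-vector product; q = 0.02 is taken from a later slide, and the ±1 nonzero values are a placeholder since the nonzero distribution P is not specified here.

import numpy as np
from scipy.sparse import csr_matrix

def sparse_measurement_matrix(m, n, q=0.02, rng=None):
    """Sparse (LDPC-like) matrix: each row gets about n*q randomly placed
    nonzeros; the +/-1 values stand in for the slide's distribution P."""
    rng = np.random.default_rng(rng)
    k = max(1, round(n * q))                       # ~Nq nonzeros per row
    rows, cols, vals = [], [], []
    for i in range(m):
        support = rng.choice(n, size=k, replace=False)
        rows.extend([i] * k)
        cols.extend(support.tolist())
        vals.extend(rng.choice([-1.0, 1.0], size=k).tolist())
    return csr_matrix((vals, (rows, cols)), shape=(m, n))

phi = sparse_measurement_matrix(m=500, n=1000, q=0.02, rng=0)
rng = np.random.default_rng(1)
x = (rng.random(1000) < 0.1) * rng.standard_normal(1000)   # sparse test signal
y = phi @ x                                                # fast sparse matvec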

Page 16: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

CS Decoding Using BP [Baron, Sarvotham, & Baraniuk]

• Measurement matrix represented by a bipartite graph (see the sketch below)
• Estimate input iteratively
• Implemented via nonparametric BP [Bickson, Sommer, …]

[Figure: bipartite graph between measurements y and signal entries x]
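A minimal Python sketch of the graph representation referred to above: the nonzero pattern of the sparse matrix defines a bipartite graph between measurement nodes and signal nodes, along whose edges BP would pass messages. The nonparametric message updates themselves are beyond a short sketch; phi is assumed to be a scipy CSR matrix like the one built above.

from scipy.sparse import csr_matrix

def bipartite_graph(phi: csr_matrix):
    """Adjacency lists induced by the nonzeros of the measurement matrix:
    each measurement node lists the signal entries it touches and each
    signal node lists the measurements that touch it."""
    phi = phi.tocsr()
    meas_to_sig = [phi.indices[phi.indptr[i]:phi.indptr[i + 1]].tolist()
                   for i in range(phi.shape[0])]
    phi_csc = phi.tocsc()
    sig_to_meas = [phi_csc.indices[phi_csc.indptr[j]:phi_csc.indptr[j + 1]].tolist()
                   for j in range(phi.shape[1])]
    return meas_to_sig, sig_to_meas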

Page 17: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Identical Single-letter Characterization w/BP

• Result 3: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to the same degraded scalar observation as before
• Sparse matrices are just as good
• BP is asymptotically optimal!

Page 18: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Decoupling Between Two Input Entries (N=500, M=250, sparsity 0.1, SNR 10)

[Plot: empirical joint density illustrating the decoupling of two input entries]

Page 19: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

CS-BP vs Other CS Methods (N=1000, sparsity 0.1, q=0.02)

[Plot: MMSE versus number of measurements M for CS-BP and other CS methods]

Page 20: A Single-letter Characterization of  Optimal  Noisy  Compressed Sensing

Conclusion
• Single-letter characterization of CS

• Decoupling

• Sparse matrices just as good

• Asymptotically optimal CS-BP algorithm