
A Single-letter Characterization of Optimal Noisy Compressed Sensing

Dongning Guo, Dror Baron, Shlomo Shamai

Setting

• Replace samples by more general measurements based on a few linear projections (inner products)

[Figure: linear measurements of a sparse signal with a small number of non-zeros]

Signal Model

• Signal entry Xn = Bn Un
• i.i.d. Bn ∼ Bernoulli(ε) (sparse)
• i.i.d. Un ∼ PU

[Figure: Un ∼ PU and Bn ∼ Bernoulli(ε) feed a multiplier, producing Xn ∼ PX]
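As a concrete illustration of this product model, a short sketch in Python (the choice of PU as standard Gaussian and the sizes are illustrative assumptions, not from the talk):

```python
import numpy as np

def sample_signal(N, eps, rng):
    """Draw X_n = B_n * U_n: B_n ~ Bernoulli(eps) gives the sparsity
    pattern, U_n ~ P_U the amplitudes (standard Gaussian here, purely
    as an illustrative choice of P_U)."""
    B = rng.random(N) < eps            # active entries
    U = rng.standard_normal(N)         # amplitudes
    return B * U

rng = np.random.default_rng(0)
x = sample_signal(N=1000, eps=0.1, rng=rng)
# on average eps*N = 100 entries are non-zero
```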

Measurement Noise

• Measurement process is typically analog
• Analog systems add noise, non-linearities, etc.
• Assume Gaussian noise for ease of analysis
• Can be generalized to non-Gaussian noise
• Noiseless measurements denoted y0
• Noise z; noisy measurements y = y0 + z
• Unit-norm columns; SNR denotes the measurement signal-to-noise ratio

Noise Model

[Figure: noiseless measurements plus noise at a given SNR]
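Putting the signal and noise models together, a minimal sketch of the measurement process (the matrix ensemble, sizes, and noise level σ are illustrative assumptions; the slides only require unit-norm columns and additive Gaussian noise):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, eps, sigma = 1000, 400, 0.1, 0.1    # illustrative values

# Bernoulli(eps)-sparse signal as in the signal-model slide
x = (rng.random(N) < eps) * rng.standard_normal(N)

# Gaussian matrix normalized to unit-norm columns (the Gaussian
# ensemble is an illustrative choice; only the normalization is
# required by the slides)
Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)

y0 = Phi @ x                              # noiseless measurements
y = y0 + sigma * rng.standard_normal(M)   # AWGN: noisy measurements
```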

• Model process as measurement channel

• Measurements provide information!

[Figure: CS measurement and CS decoding viewed as a channel, analogous to the chain source encoder → channel encoder → channel decoder → source decoder]

Allerton 2006 [Sarvotham, Baron, & Baraniuk]

• Theorem [Sarvotham, Baron, & Baraniuk 2006]: For a sparse signal with rate-distortion function R(D), there is a lower bound on the measurement rate in terms of the SNR and the target distortion D

• Numerous single-letter bounds:
  – [Aeron, Zhao, & Saligrama]
  – [Akcakaya & Tarokh]
  – [Rangan, Fletcher, & Goyal]
  – [Gastpar & Reeves]
  – [Wang, Wainwright, & Ramchandran]
  – [Tune, Bhaskaran, & Hanly]
  – …

Single-Letter Bounds

Goal: Precise Single-letter Characterization of Optimal CS

What Single-letter Characterization?

• Ultimately, what can one say about Xn given (Y, Φ)?

(sufficient statistic: the channel posterior)

• The posterior is very complicated
• Want a simple characterization of its quality
• Large-system limit: N, M → ∞ with the measurement rate M/N fixed

Main Result: Single-letter Characterization

• Result 1: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to a scalar Gaussian observation of xn at the degraded SNR η·SNR

easy to compute…

• Estimation quality from (Y, Φ) is just as good as from this noisier scalar observation (degradation of the channel posterior)

• η ∈ (0,1) is the fixed point of a scalar equation

• Take-home point: degraded scalar channel

• Non-rigorous, owing to the replica method with a symmetry assumption
  – used in CDMA detection [Tanaka 2002, Guo & Verdu 2005]

• Related analysis [Rangan, Fletcher, & Goyal 2009]
  – MMSE estimate (not posterior) using [Guo & Verdu 2005]
  – extended to several CS algorithms, particularly LASSO
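The transcript loses the fixed-point equation itself. As a sketch of how such a scalar fixed point is typically solved, the code below assumes a Guo–Verdú-style form η⁻¹ = 1 + β·SNR·mmse(η·SNR), where β = N/M and mmse(·) is the scalar MMSE of the Bernoulli–Gaussian prior; this particular form, the Monte-Carlo MMSE, and all numbers are assumptions, not verbatim from the talk:

```python
import numpy as np

def mmse_bg(s, eps, rng, n=200_000):
    """Monte-Carlo scalar MMSE for X = B*U, B ~ Bernoulli(eps),
    U ~ N(0,1), observed as V = sqrt(s)*X + W with W ~ N(0,1)."""
    X = (rng.random(n) < eps) * rng.standard_normal(n)
    V = np.sqrt(s) * X + rng.standard_normal(n)
    like1 = np.exp(-V**2 / (2 * (1 + s))) / np.sqrt(1 + s)  # density if active
    like0 = np.exp(-V**2 / 2)                               # density if zero
    pi = eps * like1 / (eps * like1 + (1 - eps) * like0)    # P(active | V)
    xhat = pi * np.sqrt(s) / (1 + s) * V                    # E[X | V]
    return float(np.mean((X - xhat) ** 2))

def solve_eta(snr, beta, eps, iters=30, seed=0):
    """Iterate eta <- 1 / (1 + beta*snr*mmse(eta*snr)); beta plays the
    role of inputs per measurement, N/M (assumed equation form)."""
    rng = np.random.default_rng(seed)
    eta = 1.0
    for _ in range(iters):
        eta = 1.0 / (1.0 + beta * snr * mmse_bg(eta * snr, eps, rng))
    return eta

eta = solve_eta(snr=10.0, beta=2.0, eps=0.1)
# eta in (0,1): the equivalent scalar channel has degraded SNR eta*SNR
```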

Details

Decoupling

• Result 2: In the large-system limit, any arbitrary (constant) number L of input elements decouple: their joint posterior factors into a product of scalar posteriors

• Take-home point: the “interference” from each individual signal entry vanishes

Decoupling Result


Sparse Measurement Matrices [Baron, Sarvotham, & Baraniuk]

• LDPC measurement matrix (sparse)
• Mostly zeros in Φ; non-zeros ∼ PΦ
• Each row contains ≈ Nq randomly placed non-zeros
• Fast matrix-vector multiplication → fast encoding / decoding

[Figure: sparse measurement matrix]
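A sketch of such a sparse ensemble and its fast matrix-vector product (the ±1 non-zero values and the sizes are illustrative assumptions; the slides only say the non-zeros follow some distribution PΦ):

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
N, M, q = 1000, 400, 0.02
k = int(N * q)                       # ~Nq non-zeros per row

# k randomly placed non-zeros per row; +/-1 values chosen purely
# to illustrate the LDPC-style ensemble
cols = np.concatenate([rng.choice(N, size=k, replace=False) for _ in range(M)])
rows = np.repeat(np.arange(M), k)
vals = rng.choice([-1.0, 1.0], size=M * k)
Phi = sparse.csr_matrix((vals, (rows, cols)), shape=(M, N))

x = (rng.random(N) < 0.1) * rng.standard_normal(N)
y0 = Phi @ x                         # sparse matvec: ~M*N*q operations
```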

CS Decoding Using BP [Baron, Sarvotham, & Baraniuk]

• Measurement matrix represented by a graph
• Estimate the input iteratively
• Implemented via nonparametric BP [Bickson, Sommer, …]

[Figure: bipartite graph connecting measurements y to signal entries x]

Identical Single-letter Characterization w/BP

• Result 3: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to the same degraded scalar observation of xn (identical degradation)

• Sparse matrices are just as good
• BP is asymptotically optimal!

Decoupling Between Two Input Entries (N=500, M=250, ε=0.1, SNR=10)

[Figure: joint density of two decoupled signal entries]

CS-BP vs Other CS Methods (N=1000, ε=0.1, q=0.02)

[Figure: MMSE versus number of measurements M for CS-BP and other methods]

Conclusion

• Single-letter characterization of CS

• Decoupling

• Sparse matrices just as good

• Asymptotically optimal CS-BP algorithm