Sparse processing / Compressed sensing
noiselab.ucsd.edu/ECE285/lecture7.pdf



Page 1:

Sparse processing / Compressed sensing

Model : y = Ax + n , x is sparse

  y : N × 1 measurements
  A : N × M dictionary matrix
  x : M × 1 sparse signal
  n : N × 1 noise

• Problem : Solve for x

• Basis pursuit, LASSO (convex objective function)

• Matching pursuit (greedy method)

• Sparse Bayesian Learning (non-convex objective function)

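The model above can be simulated in a few lines. A minimal sketch (sizes and names are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 20, 100, 3                            # measurements, columns, sparsity

A = rng.standard_normal((N, M)) / np.sqrt(N)    # N x M dictionary
x = np.zeros(M)                                 # M x 1 sparse signal
support = rng.choice(M, size=K, replace=False)  # K active entries
x[support] = rng.standard_normal(K)

sigma = 0.01
y = A @ x + sigma * rng.standard_normal(N)      # y = Ax + n, N x 1
```

Any of the solvers listed above (basis pursuit/LASSO, matching pursuit, SBL) would then be run on the pair (A, y) to recover x.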

Page 2:

Bayesian interpretation of LASSO

MAP estimate via the unconstrained LASSO formulation:

x̂LASSO(μ) = arg min_{x ∈ ℂᴹ} ||y − Ax||₂² + μ||x||₁

Bayes rule:

p(x|y) = p(y|x) p(x) / p(y)

MAP estimate:

x̂MAP = arg max_x ln p(x|y)
     = arg max_x [ln p(y|x) + ln p(x)]
     = arg min_x [−ln p(y|x) − ln p(x)]

A. Xenaki (DTU/SIO), Paper F


Page 3:

MAP estimate via the unconstrained LASSO formulation

Bayes rule:

p(x|y) = p(y|x) p(x) / p(y)

MAP estimate:

x̂MAP = arg min_x [−ln p(y|x) − ln p(x)]

Gaussian likelihood:

p(y|x) ∝ exp(−||y − Ax||₂² / σ²)

Laplace-like prior:

p(x) ∝ ∏_{i=1}^{M} exp(−√((Re xᵢ)² + (Im xᵢ)²) / ν) = exp(−||x||₁ / ν)

MAP estimate (LASSO):

x̂MAP = arg min_x [||y − Ax||₂² + μ||x||₁] = x̂LASSO(μ),   μ = σ²/ν
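The slides list LASSO as a convex objective but do not fix an algorithm. One standard solver is proximal gradient descent (ISTA), sketched below with complex soft-thresholding as the ℓ1 proximal step; the function names, step size, and iteration count are my own choices, not from the lecture:

```python
import numpy as np

def soft_threshold(z, t):
    """Complex soft-thresholding: shrink |z| by t toward zero, keep the phase."""
    mag = np.abs(z)
    return np.where(mag > t, (1.0 - t / np.maximum(mag, 1e-30)) * z, 0.0)

def ista(A, y, mu, n_iter=3000):
    """Proximal gradient (ISTA) for min_x ||y - Ax||_2^2 + mu * ||x||_1."""
    L = 2.0 * np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(n_iter):
        grad = 2.0 * A.conj().T @ (A @ x - y)    # gradient of the quadratic term
        x = soft_threshold(x - grad / L, mu / L) # gradient step, then l1 prox
    return x

# Illustrative usage: recover a 2-sparse x from 30 noiseless measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 80)) / np.sqrt(30)
x_true = np.zeros(80)
x_true[[5, 40]] = [2.0, -1.5]
x_hat = ista(A, A @ x_true, mu=0.1)
```

Note the small bias on the active entries: the ℓ1 penalty shrinks the recovered amplitudes slightly toward zero, by an amount that grows with μ.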



Page 4:

Prior and Posterior densities (Ex. Murphy)


Page 5:

Sparse Bayesian Learning (SBL)

Model : y = Ax + n

Prior : x ∼ N (x; 0,Γ)

Γ = diag(γ1, . . . , γM )

Likelihood : p(y|x) = N (y;Ax, σ2IN )


Evidence : p(y) = ∫ p(y|x) p(x) dx = N(y; 0, Σy),   Σy = σ²IN + AΓA^H

SBL solution : Γ̂ = arg max_Γ p(y) = arg min_Γ {log |Σy| + y^H Σy^−1 y}

M. E. Tipping, "Sparse Bayesian learning and the relevance vector machine," Journal of Machine Learning Research, June 2001.


Page 6:

SBL overview

• SBL solution : Γ̂ = arg min_Γ {log |Σy| + y^H Σy^−1 y}

• SBL objective function is non-convex

• Optimization solution is non-unique

• Fixed-point update using derivatives; works in practice

• Γ = diag(γ1, . . . , γM )

Update rule : γm^new = γm^old (|am^H Σy^−1 y|² / (am^H Σy^−1 am))^r,   Σy = σ²IN + AΓA^H

• Multi-snapshot extension : same Γ across snapshots
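The fixed-point iteration above can be sketched as follows, assuming r = 1, known noise power σ², and a plain iteration count instead of a convergence test (all names are illustrative):

```python
import numpy as np

def sbl(A, y, sigma2, n_iter=200, r=1.0):
    """Fixed-point SBL iteration for the hyperparameters gamma (single snapshot)."""
    N, M = A.shape
    gamma = np.ones(M)                                       # initial gamma_m
    for _ in range(n_iter):
        Sigma_y = sigma2 * np.eye(N) + (A * gamma) @ A.conj().T  # sigma^2 I + A Gamma A^H
        Sy_inv_y = np.linalg.solve(Sigma_y, y)               # Sigma_y^{-1} y
        Sy_inv_A = np.linalg.solve(Sigma_y, A)               # Sigma_y^{-1} A
        num = np.abs(A.conj().T @ Sy_inv_y) ** 2             # |a_m^H Sigma_y^{-1} y|^2
        den = np.real(np.sum(A.conj() * Sy_inv_A, axis=0))   # a_m^H Sigma_y^{-1} a_m
        gamma = gamma * (num / den) ** r                     # update rule from the slide
    return gamma

# Illustrative usage: two active entries out of M = 60.
rng = np.random.default_rng(2)
A = rng.standard_normal((20, 60)) / np.sqrt(20)
x = np.zeros(60)
x[[7, 33]] = [3.0, 2.0]
y = A @ x + 0.05 * rng.standard_normal(20)
gamma = sbl(A, y, sigma2=0.05**2)
```

At convergence most γm are driven to zero and the surviving γm sit near the active powers |xm|²; the posterior mean x̂post = ΓA^H Σy^−1 y then gives the amplitude estimates.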


Page 7:

SBL overview

• Posterior : x̂post = ΓA^H Σy^−1 y

• At convergence, γm → 0 for most m

• Γ controls sparsity, E(|xm|²) = γm

• Different ways to show that SBL gives sparse output

• Automatic determination of sparsity

• Also provides a noise estimate σ²


Page 8:

Applications to acoustics - Beamforming

• Beamforming

• Directions of arrival (DOAs), SwellEx-96 data

[Figure: DOA spectra, P [dB re max] vs. θ [°], for nine frequencies: (a) f = 112 Hz, (b) 130 Hz, (c) 148 Hz, (d) 166 Hz, (e) 201 Hz, (f) 235 Hz, (g) 283 Hz, (h) 338 Hz, (i) 388 Hz]

• 80 snapshots, 64 elements
• CBF : low resolution
• MVDR : fails due to coherent arrivals
• CS : high resolution

Gerstoft et al. 2015


Page 9:

SBL - Beamforming example

• N = 20 sensors, uniform linear array
• Discretize angle space : {−90 : 1 : 90}, M = 181
• Dictionary A : columns consist of steering vectors
• K = 3 sources, DOAs [−20, −15, 75]°, powers [12, 22, 20] dB
• M ≫ N > K

[Figure: CBF and SBL power spectra, P (dB) vs. angle θ (°) from −90° to 90°, for the three-source example]
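The CBF half of this example can be sketched directly from the setup above. Half-wavelength element spacing and single-snapshot data are my assumptions; the SBL curve on the slide would come from running an SBL iteration on the same (A, y):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20                                                   # sensors
angles = np.arange(-90, 91)                              # candidate DOAs, M = 181
n = np.arange(N)[:, None]
A = np.exp(1j * np.pi * n * np.sin(np.deg2rad(angles)))  # steering dictionary

doas = np.array([-20.0, -15.0, 75.0])                    # K = 3 true DOAs
amps = 10.0 ** (np.array([12.0, 22.0, 20.0]) / 20.0)     # dB -> amplitude
S = np.exp(1j * np.pi * n * np.sin(np.deg2rad(doas)))
y = S @ (amps * np.exp(2j * np.pi * rng.random(3)))      # one snapshot
y += 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

P_cbf = np.abs(A.conj().T @ y) ** 2 / N**2               # CBF power spectrum
peak = angles[np.argmax(P_cbf)]                          # strongest CBF peak
```

With only N = 20 elements, the CBF mainlobe is several degrees wide, so the −20° and −15° sources merge into one broad peak while a sparse method can still separate them.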


Page 10:

SBL - Acoustic hydrophone data processing (from Kai)

[Figure panels: CBF, SBL, Eigenrays, Ship of Opportunity]


Page 11:

Problem with Degrees of Freedom

• As the number of snapshots (= observations) increases, so does the number of unknown complex source amplitudes.

• PROBLEM: LASSO for multiple snapshots estimates the realizations of the random complex source amplitudes.

• However, we would be satisfied if we just estimated their power γm = E{|xml|²}.

• Note that γm does not depend on the snapshot index l. Thus SBL is much faster than LASSO as the number of snapshots grows.
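The point above can be sketched by estimating only the M powers γm from L snapshots instead of the M·L amplitudes. This assumes the common multi-snapshot form of the update, which averages |am^H Σy^−1 yl|² over snapshots; the exact form is not stated on this slide:

```python
import numpy as np

def sbl_multi(A, Y, sigma2, n_iter=200):
    """SBL with L snapshots sharing one Gamma: estimate M powers, not M*L amplitudes."""
    N, M = A.shape
    gamma = np.ones(M)
    for _ in range(n_iter):
        Sigma_y = sigma2 * np.eye(N) + (A * gamma) @ A.conj().T
        Sy_inv_Y = np.linalg.solve(Sigma_y, Y)                     # Sigma_y^{-1} Y
        Sy_inv_A = np.linalg.solve(Sigma_y, A)                     # Sigma_y^{-1} A
        num = np.mean(np.abs(A.conj().T @ Sy_inv_Y) ** 2, axis=1)  # snapshot average
        den = np.real(np.sum(A.conj() * Sy_inv_A, axis=0))
        gamma = gamma * num / den
    return gamma

# Illustrative usage: L = 50 snapshots, random complex amplitudes, fixed powers.
rng = np.random.default_rng(3)
N, M, L = 20, 60, 50
A = rng.standard_normal((N, M)) / np.sqrt(N)
X = np.zeros((M, L), dtype=complex)
X[[7, 33], :] = rng.standard_normal((2, L)) + 1j * rng.standard_normal((2, L))
noise = 0.05 * (rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L)))
gamma = sbl_multi(A, A @ X + noise, sigma2=2 * 0.05**2)
```

The per-iteration cost is dominated by one N × N solve regardless of L, which is why the CPU time on the next slide stays nearly flat as snapshots increase.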


Page 12:

Example CPU Time

[Figure: (b) CPU time (s), 0.1 to 200, vs. number of snapshots (1–1000), and (c) DOA RMSE [°], 0 to 1, vs. number of snapshots, for SBL (RVM-ML), SBL1 (RVM-ML1), and LASSO]

• LASSO uses CVX; CPU time ∝ L², with L the number of snapshots

• SBL CPU time is nearly independent of the number of snapshots
