
ELEG 867 - Compressive Sensing and Sparse Signal Representations

Gonzalo R. Arce
Department of Electrical and Computer Engineering

University of Delaware

Fall 2011


Outline

Compressive sensing signal reconstruction

Geometry of CS

Reconstruction Algorithms

Matching Pursuit
Orthogonal Matching Pursuit
Iterative Hard Thresholding
Weighted Median Regression


Compressive Sensing Signal Reconstruction

Goal: Recover the signal x from the measurements y

Problem: The random projection Φ is not full rank (ill-posed inverse problem)

Solution: Exploit the sparse/compressible geometry of the acquired signal x

(Figure: the measurement model y = Φx.)
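The measurement model can be made concrete with a short Python sketch (an illustration added here, not part of the original slides); the dimensions and the Gaussian random Φ are assumptions chosen only for the example.

import numpy as np

rng = np.random.default_rng(0)
N, M, S = 300, 150, 10                           # signal length, measurements, sparsity (illustrative values)
x = np.zeros(N)
support = rng.choice(N, size=S, replace=False)
x[support] = rng.standard_normal(S)              # S-sparse signal
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random projection matrix
y = Phi @ x                                      # compressive measurements, y = Phi x

The recovery methods discussed in the rest of the chapter all try to reconstruct x from this y and Φ.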


Suppose there is an S-sparse solution to y = Φx

Combinatorial optimization problem

(P0)   min_x ‖x‖_0   s.t.   Φx = y

Convex optimization problem

(P1)   min_x ‖x‖_1   s.t.   Φx = y

If 2δ_{3S} + δ_{2S} ≤ 1, the solutions of (P0) and (P1) are the same.†
† E. Candès, "Compressive Sampling," Proc. Intern. Congress of Math., 2006.


The Geometry of CS

Sparse signals have few non-zero coefficients

(Figure: 1-sparse and 2-sparse signals in R³, plotted on the coordinate axes x₁, x₂, x₃.)


ℓ0 Recovery

Reconstruction should be

Consistent with the model: x as sparse as possible (min ‖x‖_0)

Consistent with the measurements: y = Φx

(Figure: in R³, the sparsest point x on the constraint set {x : y = Φx} lies on a coordinate axis.)


ℓ0 Recovery

min_x ‖x‖_0   s.t.   Φx = y

‖x‖_0 is the number of nonzero elements

Sparsest signal consistent with the measurements y

Requires only M ≪ N measurements

Combinatorial NP-hard problem: for x ∈ R^N with sparsity S, the complexity is O(N^S)

Too slow for implementation
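To make the combinatorial cost explicit, the sketch below solves (P0) by brute force, testing every candidate support; the helper name and tolerance are illustrative, and the loop over O(N^S) supports is exactly what makes this impractical beyond toy sizes.

import numpy as np
from itertools import combinations

def l0_exhaustive(Phi, y, S, tol=1e-10):
    # Brute-force (P0): try every support of size <= S and keep the first exact fit.
    M, N = Phi.shape
    for k in range(1, S + 1):
        for support in combinations(range(N), k):   # O(N^k) candidate supports
            cols = list(support)
            coef, *_ = np.linalg.lstsq(Phi[:, cols], y, rcond=None)
            if np.linalg.norm(y - Phi[:, cols] @ coef) < tol:
                x = np.zeros(N)
                x[cols] = coef
                return x
    return None   # no exact S-sparse solution found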


ℓ1 Recovery

A more realistic model

min_x ‖x‖_1   s.t.   Φx = y

The ℓ1 norm also induces sparsity.

The constraint is given by y = Φx.

(Figure: in R³, the ℓ1 ball touches the constraint set {x : y = Φx} at a sparse point x.)
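One common way to solve (P1) is to cast it as a linear program by splitting x into nonnegative parts; the sketch below uses SciPy's linprog, which is an assumption about available tooling rather than the method used in the course.

import numpy as np
from scipy.optimize import linprog

def l1_recovery(Phi, y):
    # Basis pursuit: min ||x||_1 s.t. Phi x = y, written as an LP with x = u - v, u, v >= 0.
    M, N = Phi.shape
    c = np.ones(2 * N)                      # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([Phi, -Phi])           # Phi u - Phi v = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:N], res.x[N:]
    return u - v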


Phase Transition Diagram

In ℓ1 minimization, there is a well-defined region in the (M/N, S/M) plane that ensures successful recovery.

(Figure: phase transition diagram over (δ, ρ) = (M/N, S/M).)

δ = M/N is a normalized measure of the problem indeterminacy.

ρ = S/M is a normalized measure of the sparsity.

Red region: recovery is unsuccessful; exact reconstruction typically fails.

Blue region: recovery is successful; exact reconstruction typically occurs.


ℓ2 Recovery

min_x ‖x‖_2   s.t.   Φx = y

Minimum ℓ2-norm (least squares) solution: x = Φ^T (Φ Φ^T)^{−1} y

Solved using quadratic programming:
Least squares solution
Interior-point methods

Solution is almost never sparse

(Figure: in R³, the ℓ2 ball meets the constraint set {x : y = Φx} at x′, which is not sparse.)
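For comparison, the minimum ℓ2-norm solution has a closed form and needs no iteration; the short sketch below (illustrative only) computes it with the pseudoinverse and shows why it is unsatisfactory here: essentially all N coefficients come back nonzero.

import numpy as np

def l2_recovery(Phi, y):
    # Minimum l2-norm solution of the underdetermined system Phi x = y.
    # Equals Phi.T @ inv(Phi @ Phi.T) @ y when Phi has full row rank.
    return np.linalg.pinv(Phi) @ y

# x_l2 = l2_recovery(Phi, y)
# np.count_nonzero(np.abs(x_l2) > 1e-6)    # typically close to N, i.e. not sparse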


ℓ2 Recovery

Problem: a small ℓ2 norm does not imply sparsity


x′ has a small ℓ2 norm, but it is not sparse


CS Example in the Time Domain

(Figure: a time-domain sparse signal is acquired by a random projection y = Φx and recovered by nonlinear reconstruction, min ‖x‖_1 s.t. y = Φx; the error signal is also shown.)


CS Example in the Wavelet Domain

Reconstruction of an image (N = 1 megapixel) that is sparse (S = 25,000) in the wavelet domain from M = 96,000 incoherent measurements.†

(Figure: original image, 25K wavelets, and recovered image.)
† E. J. Candès and J. Romberg, "Sparsity and Incoherence in Compressive Sampling," Inverse Problems, vol. 23, pp. 969-985, 2007.


Reconstruction Algorithms

Different formulations and implementations have been proposed to find the sparsest x subject to y = Φx.

It is difficult to compare results obtained by different methods. They are broadly classified into:
Regularization formulations (replace the combinatorial problem with a convex optimization)
Greedy algorithms (iterative refinement of a sparse solution)
Bayesian framework (assume a prior distribution on the sparse coefficients)


Greedy Algorithms

Iterative algorithms that select an optimal subset of the desired signal x at each iteration. Some examples:

Matching Pursuit (MP): min_x ‖x‖_0 s.t. Φx = y

Orthogonal Matching Pursuit (OMP): min_x ‖x‖_0 s.t. Φx = y

Compressive Sampling Matching Pursuit (CoSaMP): min_x ‖y − Φx‖_2^2 s.t. ‖x‖_0 ≤ S

Iterative Hard Thresholding (IHT): min_x ‖y − Φx‖_2^2 s.t. ‖x‖_0 ≤ S

Weighted Median Regression: min_x ‖y − Φx‖_1 + λ‖x‖_0

These methods build an approximation of the sparse signal at each iteration.


Matching Pursuit (MP)

Sparse approximation algorithm used to find the solution to

min_x ‖x‖_0   s.t.   Φx = y

Key ideas:

The measurements y = Φx are composed of a sum of S columns of Φ.

The algorithm needs to identify which S columns contribute to y.

(Figure: y as a combination of S columns of Φ weighted by the nonzero entries of x.)


Matching Pursuit (MP) Algorithm

Input and Initialize Parameters:
Observation vector y
Measurement matrix Φ
Initial solution x^(0) = 0
Initial residual r^(0) = y

Iteration k:
Compute the correlations: c_j^(k) = ⟨φ_j, r^(k−1)⟩
Identify the index j such that: j = arg max_j |c_j^(k)|
Update x_j^(k) = x_j^(k−1) + c_j^(k)
Update r^(k) = r^(k−1) − c_j^(k) φ_j

D. Donoho et al., SparseLab, Version 2.0, 2007.
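A compact Python version of the steps above is sketched next; it assumes the columns of Φ are (approximately) unit norm and uses a fixed iteration count as the stopping rule, both simplifications for illustration.

import numpy as np

def matching_pursuit(Phi, y, n_iter=50):
    # MP: greedily explain the residual one column at a time.
    M, N = Phi.shape
    x = np.zeros(N)
    r = y.copy()
    for _ in range(n_iter):
        c = Phi.T @ r                  # correlations <phi_j, r>
        j = int(np.argmax(np.abs(c)))  # best-matching column
        x[j] += c[j]                   # update the j-th coefficient
        r -= c[j] * Phi[:, j]          # remove its contribution from the residual
    return x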


MP Reconstruction Example

(Figure: reconstructed signal and residual at iterations 0 and 5.)


MP Reconstruction Example

(Figure: reconstructed signal at iteration 32, the original signal, and the residual at iteration 32, on the order of 10⁻⁵.)


Orthogonal Matching Pursuit (OMP)

Sparse approximation algorithm used to find the solution to

min_x ‖x‖_0   s.t.   Φx = y

Key ideas:

The measurements y = Φx are composed of a sum of S columns of Φ.

The algorithm needs to identify which S columns contribute to y.

(Figure: y as a combination of S columns of Φ weighted by the nonzero entries of x.)


Orthogonal Matching Pursuit (OMP) Algorithm

Input and Initialize Parameters:
Observation vector y
Measurement matrix Φ
Initial solution x^(0) = 0
Initial solution support S^(0) = support{x^(0)} = ∅
Initial residual r^(0) = y

Iteration k:
Compute the correlations: c_j^(k) = ⟨φ_j, r^(k−1)⟩
Identify the index j such that: j = arg max_j |c_j^(k)|
Update the support: S^(k) = S^(k−1) ∪ {j}
Update the matrix: Φ_S(k) = [0, ..., φ_i, ..., 0, ..., φ_j, ..., 0]
Update the solution: x^(k) = (Φ_S(k)^T Φ_S(k))^{−1} Φ_S(k)^T y
Update the residual: r^(k) = y − Φ_S(k) x^(k)

D. Donoho et al., SparseLab, Version 2.0, 2007.
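The same steps translate into a short Python sketch; stopping after S iterations and re-fitting with an explicit least-squares solve are simplifications for illustration.

import numpy as np

def orthogonal_matching_pursuit(Phi, y, S):
    # OMP: grow the support by one index per iteration, then re-fit by least squares
    # so that the residual stays orthogonal to all selected columns.
    M, N = Phi.shape
    support = []
    r = y.copy()
    coef = np.zeros(0)
    for _ in range(S):
        c = Phi.T @ r                            # correlations with the residual
        j = int(np.argmax(np.abs(c)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef           # residual orthogonal to Phi[:, support]
    x = np.zeros(N)
    x[support] = coef
    return x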


Remarks on the OMP Algorithm

S^(k) is the support of x at iteration k, and Φ_S(k) is the matrix that contains the columns of Φ that belong to this support.

The updated solution gives the x^(k) that solves the minimization problem min_x ‖Φ_S(k) x − y‖_2^2. The solution satisfies

Φ_S(k)^T (Φ_S(k) x − y) = 0

−Φ_S(k)^T r^(k) = 0

This shows that the columns of Φ_S(k) are orthogonal to r^(k).


OMP Reconstruction Example

(Figure: reconstructed signal and residual at iterations 0 and 5.)


OMP Reconstruction Example

(Figure: reconstructed signal and residual at iterations 9 and 10; the residual at iteration 10 is on the order of 10⁻¹⁵.)


Iterative Hard Thresholding

Sparse approximation algorithm that solves the problem

min_x ‖y − Φx‖_2^2 + λ‖x‖_0

Iterative algorithm that computes a descent direction given by the gradient of the ℓ2 norm.

The solution at each iteration is given by

x^(k+1) = H_λ(x^(k) + Φ^T(y − Φx^(k)))

H_λ is a nonlinear operator that retains only the coefficients with the largest magnitude.

† T. Blumensath and M. Davies, "Iterative Hard Thresholding for Compressed Sensing," Applied and Computational Harmonic Analysis, vol. 27, no. 3, pp. 265-274, 2009.


Iterative Hard Thresholding

Input and Initialize Parameters:
Observation vector y
Measurement matrix Φ
Initial solution x^(0) = 0

Iteration k:
Compute an update: a^(k+1) = x^(k) + Φ^T(y − Φx^(k))
Select the largest elements: x^(k+1) = H_λ(a^(k+1)); all other elements are set to zero.

D. Donoho et al., SparseLab, Version 2.0, 2007.
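A minimal Python sketch of IHT is shown below; it keeps the S largest coefficients at each iteration (the common S-sparse variant of the thresholding operator) and uses a unit step size, which implicitly assumes the spectral norm of Φ is not much larger than one. Both choices are assumptions for illustration.

import numpy as np

def iterative_hard_thresholding(Phi, y, S, n_iter=100):
    # IHT: gradient step on ||y - Phi x||_2^2 followed by hard thresholding
    # that keeps only the S largest-magnitude coefficients.
    M, N = Phi.shape
    x = np.zeros(N)
    for _ in range(n_iter):
        a = x + Phi.T @ (y - Phi @ x)       # gradient-descent update
        keep = np.argsort(np.abs(a))[-S:]   # indices of the S largest entries
        x = np.zeros(N)
        x[keep] = a[keep]                   # hard thresholding: other entries set to zero
    return x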


IHT Reconstruction Example

(Figure: reconstructed signal and residual at iterations 1 and 5.)


IHT Reconstruction Example

(Figure: original signal, reconstructed signal at iteration 10, and the residual at iteration 10, on the order of 10⁻⁴.)


Weighted Median Regression Reconstruction Algorithm

When the random measurements are corrupted by noise, the most widely used CS reconstruction algorithms solve the problem

min_x ‖y − Φx‖_2^2 + τ‖x‖_1,

where τ is the regularization parameter that balances the conflicting tasks of minimizing the ℓ2 norm while yielding a sparse solution.

As τ increases, the solution becomes sparser.

As τ decreases, the solution approaches the least squares solution.


In the solution to the problem

min_x ‖y − Φx‖_2^2 + τ‖x‖_1,

The ℓ2 term is the data-fitting term, induced by the Gaussian assumption on the noise contamination.

The ℓ1 term is the sparsity-promoting term.

However, the ℓ2 term leads to the least squares solution, which is known to be very sensitive to high noise levels and to outliers present in the contamination.


Example of signal reconstruction when outliers are present in the random projections


To mitigate the effect of impulsive noise in the compressive measurements, a more robust norm for the data-fitting term is

min_x ‖y − Φx‖_1 + λ‖x‖_0    (1)

(data fitting: ‖y − Φx‖_1;  sparsity: λ‖x‖_0)

LAD (least absolute deviation) offers robustness to a broad class of noise, in particular to heavy-tailed noise.

It is optimal under the maximum likelihood principle when the underlying contamination follows a Laplacian distribution.

† J. L. Paredes and G. Arce, "Compressive Sensing Signal Reconstruction by Weighted Median Regression Estimate," ICASSP 2010.


Drawbacks:

Solving the ℓ0-LAD optimization is an NP-hard problem.

A direct solution is infeasible even for modest-sized signals.

To solve this multivariable minimization problem:

Solve a sequence of scalar minimization subproblems.

Each subproblem improves the estimate of the solution by minimizing along a selected coordinate with all other coordinates fixed.

A closed-form solution exists for the scalar minimization problem.


Signal reconstruction

x̂ = arg min_x Σ_{i=1}^{M} |(y − Φx)_i| + λ‖x‖_0

The solution using the Coordinate Descent Method is given by

x̂_n = arg min_{x_n} Σ_{i=1}^{M} | y_i − Σ_{j=1, j≠n}^{N} φ_{i,j} x_j − φ_{i,n} x_n | + λ|x_n|_0 + Σ_{j=1, j≠n}^{N} λ|x_j|_0

x̂_n = arg min_{x_n} Σ_{i=1}^{M} |φ_{i,n}| | (y_i − Σ_{j=1, j≠n}^{N} φ_{i,j} x_j) / φ_{i,n} − x_n | + λ|x_n|_0

where the first term, denoted Q(x_n), is a weighted least absolute deviation and the second term is the regularization.


The solution to the minimization problem is given by

x̂_n = arg min_{x_n} Q(x_n) + λ|x_n|_0

x̂_n = MEDIAN( |φ_{i,n}| ⋄ (y_i − Σ_{j=1, j≠n}^{N} φ_{i,j} x_j) / φ_{i,n} | i = 1, ..., M );   n = 1, 2, ..., N

At the k-th iteration:

x_n^(k) = x̂_n if Q(x̂_n) + λ ≤ Q(0), and x_n^(k) = 0 otherwise.
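The scalar step above reduces to a weighted median followed by a hard threshold. The sketch below (the helper names are illustrative, and it assumes the entries of the n-th column of Φ are nonzero) shows one coordinate update.

import numpy as np

def weighted_median(weights, values):
    # Weighted median: the beta that minimizes sum_i weights_i * |values_i - beta|.
    order = np.argsort(values)
    w, v = np.asarray(weights)[order], np.asarray(values)[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

def coordinate_step(Phi, y, x, n, lam):
    # One update of coordinate n for the l0-LAD cost.
    r_n = y - Phi @ x + Phi[:, n] * x[n]        # residual with coordinate n left out
    w = Phi[:, n]                               # assumed to have nonzero entries
    beta = weighted_median(np.abs(w), r_n / w)  # robust estimate of x_n
    Q = lambda t: np.sum(np.abs(w) * np.abs(r_n / w - t))
    return beta if Q(beta) + lam <= Q(0.0) else 0.0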


The iterative algorithm detects the non-zero entries of the sparse vector by:

Robust signal estimation: computing a rough estimate of the signal that minimizes the LAD cost via the weighted median operator.

Basis selection: applying a hard-threshold operator to the estimated values.

Removing the entry's influence from the observation vector.

Repeating these steps until a performance criterion is reached.

(Figure: original signal and the estimate at successive iterations.)


Input and Initialize Parameters:
Observation vector y
Measurement matrix Φ
Number of iterations K_0
Iteration counter: k = 1
Hard-thresholding value: λ = λ_i
Initial estimate: x^(0) = 0

Iteration:
Step A: For the n-th entry of x, n = 1, 2, ..., N, compute
x̂_n = MEDIAN( φ_{i,n} ⋄ (y_i − Σ_{j=1, j≠n}^{N} φ_{i,j} x_j) / φ_{i,n} | i = 1, ..., M )
x_n^(k) = x̂_n if Q(x̂_n) + λ ≤ Q(0), and x_n^(k) = 0 otherwise.

Step B: Update the hard-thresholding parameter and the estimate of x:
x^(k+1) = x^(k)
λ = λ_i β^k

Step C: Check the stopping criterion:
If k ≤ K_0, set k = k + 1 and go to Step A; otherwise, end.

Output: Recovered sparse signal x


Example: Signal Reconstruction from Projections in Non-Gaussian Noise

(Figure: original signal and noisy projections, with reconstructions by CoSaMP, L1-LS, Matching Pursuit, and Weighted Median regression.)


Performance Comparisons

Exact reconstruction for noiseless signals (normalized error is smallerthan 0.1% of the signal energy).


Performance Comparisons

SNR vs. Sparsity for different types of noise


Convex Relaxation Approach:

The ℓ0-LAD problem can be relaxed by solving the following problem:

min_x ‖y − Φx‖_1 + λ‖x‖_1    (2)

(data fitting: ‖y − Φx‖_1;  sparsity: λ‖x‖_1)

This problem is convex and can be solved by linear programming.

The ℓ1 norm may not induce the necessary sparsity in the solution.

It requires more measurements to achieve a target reconstruction error.


The solution for the problem

x̂ = arg min_x Σ_{i=1}^{M} |y_i − φ_i x| + λ‖x‖_1

is given by

x̂ = MEDIAN( λ ⋄ 0, |φ_1| ⋄ y_1/φ_1, ..., |φ_M| ⋄ y_M/φ_M ).

The regularization induced by the ℓ1 term leads to a weighting operation with a zero-valued sample in the WM operation.

Large values of the regularization parameter imply a large weight for the zero-valued sample (this favors sparsity).

The solution is biased.
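The zero-valued sample with weight λ can be seen directly in a scalar Python sketch (illustrative only): as λ grows, the weighted median is pulled toward zero, which is the bias noted above.

import numpy as np

def wm_l1_scalar(phi, y, lam):
    # Scalar l1-regularized LAD: min_x sum_i |y_i - phi_i * x| + lam * |x|.
    # Solved by a weighted median that includes a zero-valued sample with weight lam.
    weights = np.concatenate(([lam], np.abs(phi)))
    values = np.concatenate(([0.0], y / phi))
    order = np.argsort(values)
    w, v = weights[order], values[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]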