
Lecture 2: Tikhonov-Regularization

Bastian von Harrach
harrach@math.uni-stuttgart.de

Chair of Optimization and Inverse Problems, University of Stuttgart, Germany

Advanced Instructional School on
Theoretical and Numerical Aspects of Inverse Problems

TIFR Centre For Applicable Mathematics
Bangalore, India, June 16–28, 2014.


Setting

Let

- X and Y be Hilbert spaces,
- A ∈ L(X, Y), i.e., A : X → Y is linear and continuous,
- A be injective, i.e., there exists a left inverse A^{-1} : R(A) ⊆ Y → X,
- A^{-1} linear but possibly discontinuous (unbounded).


Setting

Consider

- the true solution x ∈ X,
- exact measurements y = Ax ∈ Y,
- noisy measurements y^δ ∈ Y with ‖y^δ − y‖ ≤ δ.

Given exact measurements y, we could calculate x = A^{-1}y. But how can we approximate x from noisy measurements y^δ?

Problems:

- A^{-1}y^δ may not be well-defined (for y^δ ∉ R(A)),
- A^{-1} discontinuous: A^{-1}y^δ ↛ A^{-1}y = x for δ → 0,
- A^{-1} unbounded: possibly ‖A^{-1}y^δ‖_X → ∞.


Goal

Can we approximate x = A^{-1}y from noisy measurements y^δ ≈ y?

Goal: Find a reconstruction function R(y^δ, δ) so that

R(y^δ, δ) → A^{-1}y = x for δ → 0.

Note: R(y^δ, δ) := A^{-1}y^δ does not work!

In this lecture: R(y^δ, δ) := (A∗A + δI)^{-1}A∗y^δ works, i.e.,

(A∗A + δI)^{-1}A∗y^δ → A^{-1}y = x.
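For illustration, a minimal numpy sketch of this comparison, assuming a small ill-conditioned random matrix A (all sizes, seeds, and variable names are illustrative): the naive reconstruction A^{-1}y^δ is swamped by noise, while (A∗A + δI)^{-1}A∗y^δ stays close to x.

```python
import numpy as np

np.random.seed(0)
n = 20
U, _ = np.linalg.qr(np.random.randn(n, n))
V, _ = np.linalg.qr(np.random.randn(n, n))
A = U @ np.diag(np.logspace(0, -12, n)) @ V.T   # rapidly decaying singular values

x_true = V[:, 0] + 0.5 * V[:, 1]                # true solution (well-resolved components)
y = A @ x_true                                  # exact data
delta = 1e-6
y_delta = y + delta * np.random.randn(n) / np.sqrt(n)   # noise of size roughly delta

# Naive reconstruction A^{-1} y^delta: ill-conditioning amplifies the noise
x_naive = np.linalg.solve(A, y_delta)

# Regularized reconstruction (A*A + delta*I)^{-1} A* y^delta
x_reg = np.linalg.solve(A.T @ A + delta * np.eye(n), A.T @ y_delta)

print("error of A^{-1} y^delta:            ", np.linalg.norm(x_naive - x_true))
print("error of (A*A + dI)^{-1} A* y^delta:", np.linalg.norm(x_reg - x_true))
```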


Motivation

Can we approximate x = A^{-1}y from noisy measurements y^δ ≈ y?

- Standard approach (x^δ = A^{-1}y^δ):
  Minimize the data fit (residuum) ‖Ax − y^δ‖_Y !

- Tikhonov regularization:
  Minimize ‖Ax − y^δ‖²_Y + α‖x‖²_X !
  with regularization parameter α > 0.

α small: the solution will fit the measurements well,
α large: the solution will be regular (small norm).
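To make the tradeoff concrete, a minimal numpy sketch, assuming a random ill-conditioned A and a fixed noisy right-hand side: as α grows, the residual ‖Ax^δ_α − y^δ‖ increases while the norm ‖x^δ_α‖ decreases.

```python
import numpy as np

np.random.seed(1)
m, n = 40, 15
A = np.random.randn(m, n) @ np.diag(np.logspace(0, -4, n))   # ill-conditioned operator
x_true = np.random.randn(n)
y_delta = A @ x_true + 1e-3 * np.random.randn(m)             # noisy data

for alpha in [1e-6, 1e-4, 1e-2, 1e0]:
    # Tikhonov solution x^delta_alpha = (A*A + alpha I)^{-1} A* y^delta
    x_alpha = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y_delta)
    residual = np.linalg.norm(A @ x_alpha - y_delta)
    print(f"alpha = {alpha:.0e}   residual = {residual:.3e}   "
          f"solution norm = {np.linalg.norm(x_alpha):.3e}")
```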


Motivation

Tikhonov regularization:

Minimize ‖Ax − y^δ‖²_Y + α‖x‖²_X !

Equivalent formulation:

Minimize ‖ [A; √α I] x − [y^δ; 0] ‖²_{Y×X} = ‖ [Ax − y^δ; √α x] ‖²_{Y×X} !

(Here [·; ·] denotes the stacked column in the product space Y × X.)

Formal use of the normal equations yields

[A∗  √α I] [A; √α I] x = [A∗  √α I] [y^δ; 0],

and thus

(A∗A + αI) x = A∗y^δ.
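A quick numerical cross-check of this equivalence, assuming a small random A and y^δ (all names illustrative): the stacked least-squares problem and the normal equation (A∗A + αI)x = A∗y^δ produce the same minimizer.

```python
import numpy as np

np.random.seed(2)
m, n, alpha = 30, 10, 0.1
A = np.random.randn(m, n)
y_delta = np.random.randn(m)

# Stacked least-squares formulation: minimize ‖[A; sqrt(alpha) I] x - [y^delta; 0]‖²
A_stack = np.vstack([A, np.sqrt(alpha) * np.eye(n)])
b_stack = np.concatenate([y_delta, np.zeros(n)])
x_ls, *_ = np.linalg.lstsq(A_stack, b_stack, rcond=None)

# Normal equation: (A*A + alpha I) x = A* y^delta
x_ne = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y_delta)

print(np.allclose(x_ls, x_ne))   # True: both give the same Tikhonov minimizer
```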


Invertibility of A∗A + αI

Theorem 2.1. Let A ∈ L(X, Y). For each α > 0, the operators

A∗A + αI ∈ L(X)  and  AA∗ + αI ∈ L(Y)

are continuously invertible and they fulfill

‖(A∗A + αI)^{-1}‖_{L(X)} ≤ 1/α,   ‖(AA∗ + αI)^{-1}‖_{L(Y)} ≤ 1/α.
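A sketch of the usual coercivity argument behind these bounds (this reconstruction may differ from the proof given in the lecture):

```latex
% Coercivity sketch for A*A + alpha*I (the same works for A A* + alpha*I).
\begin{align*}
\langle (A^*A+\alpha I)x, x\rangle_X
  &= \|Ax\|_Y^2 + \alpha\|x\|_X^2 \;\ge\; \alpha\|x\|_X^2 ,\\
\alpha\|x\|_X^2 \;\le\; \|(A^*A+\alpha I)x\|_X\,\|x\|_X
  &\;\Longrightarrow\; \alpha\|x\|_X \;\le\; \|(A^*A+\alpha I)x\|_X .
\end{align*}
```

So A∗A + αI is bounded below, hence injective with closed range; being self-adjoint, it also has dense range, so it is bijective, and the last inequality yields ‖(A∗A + αI)^{-1}‖_{L(X)} ≤ 1/α. The bound for AA∗ + αI follows in the same way with the roles of A and A∗ exchanged.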


Minimizer of Tikhonov Functional

Theorem 2.2. Let A ∈ L(X, Y). Then x^δ_α := (A∗A + αI)^{-1}A∗y^δ is the unique minimizer of the Tikhonov functional

J_α(x) := ‖Ax − y^δ‖²_Y + α‖x‖²_X.
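A minimal numerical sanity check, assuming a small random example and scipy's generic BFGS minimizer: the closed-form x^δ_α agrees with a direct minimization of J_α.

```python
import numpy as np
from scipy.optimize import minimize

np.random.seed(3)
m, n, alpha = 25, 8, 0.05
A = np.random.randn(m, n)
y_delta = np.random.randn(m)

def J(x):
    # Tikhonov functional J_alpha(x) = ‖Ax - y^delta‖² + alpha ‖x‖²
    return np.linalg.norm(A @ x - y_delta) ** 2 + alpha * np.linalg.norm(x) ** 2

x_closed = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y_delta)  # Theorem 2.2
x_numeric = minimize(J, np.zeros(n), method="BFGS").x                   # generic minimizer

print(np.linalg.norm(x_closed - x_numeric))   # tiny, up to solver tolerance
```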


An auxiliary result

Theorem 2.3. Let A ∈ L(X, Y) be injective.

(a) R(A∗A) is dense in X.

(b) If a sequence (x_k)_{k∈N} ⊂ X fulfills

    A∗Ax_k → A∗Ax  and  ‖x_k‖_X ≤ ‖x‖_X,

then x_k → x.


Convergence of Tikhonov regularization

Theorem 2.4. Let

- A ∈ L(X, Y) be injective (with a possibly unbounded inverse),
- Ax = y,
- (y^δ)_{δ>0} ⊆ Y be noisy measurements with ‖y^δ − y‖_Y ≤ δ.

If we choose the regularization parameter so that

α(δ) → 0  and  δ²/α(δ) → 0,

then

(A∗A + α(δ)I)^{-1}A∗y^δ → x for δ → 0.
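For illustration, a minimal sketch with the admissible choice α(δ) = δ (so that α(δ) → 0 and δ²/α(δ) = δ → 0), assuming a small ill-conditioned random A: the reconstruction error shrinks as δ decreases.

```python
import numpy as np

np.random.seed(4)
n = 15
U, _ = np.linalg.qr(np.random.randn(n, n))
V, _ = np.linalg.qr(np.random.randn(n, n))
A = U @ np.diag(np.logspace(0, -4, n)) @ V.T   # ill-conditioned forward operator
x_true = np.random.randn(n)
y = A @ x_true

for delta in [1e-2, 1e-4, 1e-6, 1e-8, 1e-10]:
    noise = np.random.randn(n)
    y_delta = y + delta * noise / np.linalg.norm(noise)   # ‖y^delta - y‖ = delta
    alpha = delta                                          # admissible rule alpha(delta) = delta
    x_alpha = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y_delta)
    print(f"delta = {delta:.0e}   error = {np.linalg.norm(x_alpha - x_true):.3e}")
```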


Image deblurring

[Figure: image deblurring example — panels show the true image x, the blurred image y = Ax, the noisy data y^δ, the naive reconstruction A^{-1}y^δ, and the regularized reconstruction (A∗A + δI)^{-1}A∗y^δ.]
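As a stand-in for the image example, a minimal 1-D deblurring sketch, assuming a Gaussian blur matrix (kernel width and sizes are illustrative): naive inversion amplifies the noise, Tikhonov regularization with α = δ does not.

```python
import numpy as np

np.random.seed(5)
n = 200
t = np.arange(n)
A = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 2.0) ** 2)   # Gaussian blur
A /= A.sum(axis=1, keepdims=True)                            # row-normalized

x_true = np.zeros(n)
x_true[60:90] = 1.0
x_true[120:140] = 0.5            # piecewise-constant test signal

delta = 1e-3
y_delta = A @ x_true + delta * np.random.randn(n)

x_naive = np.linalg.solve(A, y_delta)                                   # noise blows up
x_tik = np.linalg.solve(A.T @ A + delta * np.eye(n), A.T @ y_delta)     # Tikhonov, alpha = delta

print("naive error:   ", np.linalg.norm(x_naive - x_true))
print("Tikhonov error:", np.linalg.norm(x_tik - x_true))
```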


Computerized tomography

[Figure: computerized tomography example — panels show the true image x, the exact data y = Ax, the noisy data y^δ, the naive reconstruction A^{-1}y^δ, and the regularized reconstruction (A∗A + δI)^{-1}A∗y^δ.]


Conclusions and remarks

Conclusions

- For ill-posed inverse problems, the best data-fit solutions generally do not converge to the true solution.
- The regularized solutions do converge to the true solution.

More sophisticated parameter choice rule

- Discrepancy principle: choose α such that ‖Ax^δ_α − y^δ‖ ≈ δ (a sketch follows below).

Strategies for non-linear inverse problems F(x) = y:

- First linearize, then regularize.
- First regularize, then linearize.

A-priori information

- Regularization can be used to incorporate a-priori knowledge (promote sparsity or sharp edges, include stochastic priors, etc.)
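A minimal sketch of the discrepancy principle in this setting, assuming a geometric search over α and a safety factor τ slightly above 1 (the helper names and parameters are illustrative, not a prescribed algorithm): decrease α until the residual ‖Ax^δ_α − y^δ‖ drops to roughly τ·δ.

```python
import numpy as np

def tikhonov(A, y_delta, alpha):
    """x^delta_alpha = (A*A + alpha I)^{-1} A* y^delta."""
    return np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ y_delta)

def discrepancy_alpha(A, y_delta, delta, tau=1.1, alpha0=1.0, q=0.5, max_iter=60):
    """Shrink alpha geometrically until ‖A x^delta_alpha - y^delta‖ <= tau * delta."""
    alpha = alpha0
    for _ in range(max_iter):
        residual = np.linalg.norm(A @ tikhonov(A, y_delta, alpha) - y_delta)
        if residual <= tau * delta:
            break
        alpha *= q
    return alpha

# Small illustrative example
np.random.seed(6)
n = 20
U, _ = np.linalg.qr(np.random.randn(n, n))
V, _ = np.linalg.qr(np.random.randn(n, n))
A = U @ np.diag(np.logspace(0, -5, n)) @ V.T
x_true = np.random.randn(n)
delta = 1e-3
noise = np.random.randn(n)
y_delta = A @ x_true + delta * noise / np.linalg.norm(noise)

alpha = discrepancy_alpha(A, y_delta, delta)
x_rec = tikhonov(A, y_delta, alpha)
print(f"chosen alpha = {alpha:.2e}, reconstruction error = {np.linalg.norm(x_rec - x_true):.3e}")
```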
