An Introduction and Example of Interval Techniques in Global Optimization

R. Baker Kearfott, Department of Mathematics, University of Louisiana at Lafayette

GICOLAG, December 2006



(slide 1)


An Introduction and Example of Interval Techniques in Global Optimization

R. Baker Kearfott

Department of Mathematics, University of Louisiana at Lafayette

GICOLAG, December, 2006


(slide 2)


Outline

1 Introduction
   - Theoretical and Practical Considerations
   - Alternate Approaches to Global Optimization
   - General Branch and Bound Algorithms

2 Interval Techniques
   - Bounding Ranges
   - Proving Existence and Uniqueness
   - Practicalities

3 Use of Structure: An Example
   - Obtaining the Objective Lower Bound
   - Gradient-Based Box Rejection
   - Structure in the Kuhn–Tucker Matrix


(slide 3)


The General Global Optimization Problem

Mathematical Description

minimize ϕ(x)

subject to ci(x) = 0, i = 1, . . . , m1,

gi(x) ≤ 0, i = 1, . . . , m2,

where ϕ : Rn → R and ci , gi : Rn → R.

• We refer to the region defined by the constraints as D.

• Often bounds on the search region are given by a box x = ([x1, x̄1], . . . , [xn, x̄n]).


(slide 4)


Local Versus Global Optimization

• Global optimization is not to be confused with local optimization, often based on a single execution of a steepest-descent-like algorithm.

• Local optimization algorithm developers speak of "globalization," but mean only the design of algorithm variants that increase the domain of convergence to a local optimum. (See J. E. Dennis and R. B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Least Squares, Prentice–Hall, 1983.)

• Local optimization algorithms contain many heuristics and do not always work; however, many useful implementations exist.

• Repeated application of a local algorithm is typically used in practice within global optimization algorithms.


(slide 5)


Outline

1 Introduction
   - Theoretical and Practical Considerations
   - Alternate Approaches to Global Optimization
   - General Branch and Bound Algorithms

2 Interval Techniques
   - Bounding Ranges
   - Proving Existence and Uniqueness
   - Practicalities

3 Use of Structure: An Example
   - Obtaining the Objective Lower Bound
   - Gradient-Based Box Rejection
   - Structure in the Kuhn–Tucker Matrix


(slide 6)


Do We Want Local or Global Optima?

Examples

• ϕ is the cost of running a (nominally) $50,000,000 per month plant: The plant manager would like the smallest possible operating cost, but would be happy with a 5% lower cost than before.

• ϕ represents the potential energy of a particular conformation of a molecule: The globally lowest value for ϕ gives the most information, but local minima give some information, and finding the global minimum may not be practical.

• A mathematician has reduced a proof to showing that ϕ ≥ 1 everywhere: The global minimum must not only be found, but also must be rigorously proven to be so.



(slide 9)


Local Versus Global Optimization

Global Algorithms

• Global optimization is NP-hard, meaning that no general algorithm is known that solves all large problems efficiently.

• Thus, barring monumental discoveries, any general algorithm will fail for some high-dimensional problems.

• Some low-dimensional problems are also difficult, but many can be solved.

• Successes have occurred for large problems of practical interest by taking advantage of structure.

• Useful results have also been obtained by replacing guarantees by heuristics.

• Advances in computer speed and algorithm construction allow ever more practical problems to be solved.


(slide 10)


Outline

1 Introduction
   - Theoretical and Practical Considerations
   - Alternate Approaches to Global Optimization
   - General Branch and Bound Algorithms

2 Interval Techniques
   - Bounding Ranges
   - Proving Existence and Uniqueness
   - Practicalities

3 Use of Structure: An Example
   - Obtaining the Objective Lower Bound
   - Gradient-Based Box Rejection
   - Structure in the Kuhn–Tucker Matrix


(slide 11)


Classes of Global Optimization Algorithms

• There are two types of algorithms: stochastic and deterministic.

• Deterministic algorithms can be either rigorous or heuristic.


(slide 12)


Stochastic Algorithms

Monte-Carlo search: Random points are generated in the search space. The point with the lowest objective value is taken to be the global optimum.

Simulated annealing: Similar to a local optimization method, although larger objective values are accepted with a probability that decreases as the algorithm progresses.

Genetic algorithms: Attributes, such as values of a particular coordinate, correspond to particular “genes.” “Chromosomes” of these genes are recombined randomly, and only the best results are kept. Random “mutations” are introduced.
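The Monte-Carlo search described above can be sketched in a few lines; the objective function and search box below are illustrative, not from the talk.

```python
import random

def monte_carlo_search(phi, bounds, n_samples=10000, seed=0):
    """Generate random points in the box `bounds` and return the
    point with the lowest objective value found: a heuristic
    estimate of the global minimum, with no guarantee."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        val = phi(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Illustrative multimodal objective on the box [-2, 2] x [-2, 2];
# its global minima are at (1, 0.5) and (-1, 0.5), with value 0.
phi = lambda x: (x[0] ** 2 - 1) ** 2 + (x[1] - 0.5) ** 2
best_x, best_val = monte_carlo_search(phi, [(-2, 2), (-2, 2)])
```

With enough samples the best value found approaches the global minimum, but no finite sample proves it has been reached.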


(slide 13)


Deterministic Algorithms

• These either
   • involve using problem-specific structure to be assured that computed points are global optimizers, or else
   • involve some kind of systematic global search over the domain, generally branch and bound processes.

• The various branch and bound algorithms rely on estimates of ranges of the objective and constraints over subdomains.

• The estimates on the range can be either mathematically exact or heuristic.

• The mathematically exact estimates can be either rigorous (taking account of roundoff error) or non-rigorous (using floating point arithmetic and other approximation algorithms without taking account of numerical errors).



(slide 18)


Outline

1 Introduction
   - Theoretical and Practical Considerations
   - Alternate Approaches to Global Optimization
   - General Branch and Bound Algorithms

2 Interval Techniques
   - Bounding Ranges
   - Proving Existence and Uniqueness
   - Practicalities

3 Use of Structure: An Example
   - Obtaining the Objective Lower Bound
   - Gradient-Based Box Rejection
   - Structure in the Kuhn–Tucker Matrix


(slide 19)


The Abstract Algorithm

Initialization: Establish an upper bound ϕ̄ on the global optimum over the feasible set.

Branching: Subdivide the initial region D into two or more subregions D̃.

Bounding: Bound the range of ϕ below over each D̃, to obtain

ϕ̲(D̃) ≤ inf { ϕ(x) | x ∈ D̃, c(x) = 0, g(x) ≤ 0 }.

Fathoming: IF ϕ̲(D̃) > ϕ̄ THEN discard D̃;
ELSE IF the diameter of D̃ is smaller than a specified tolerance THEN put D̃ onto a list of boxes containing possible global optimizers;
ELSE put D̃ onto a list for further branching and bounding.
END IF
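The abstract algorithm can be sketched concretely. This toy version works on one-dimensional boxes, with a user-supplied lower-bounding routine standing in for the interval or relaxation techniques discussed later; the objective and bound used here are illustrative.

```python
def branch_and_bound(phi, range_lb, box, tol=1e-3):
    """Abstract branch and bound on a 1-D box (lo, hi).
    `range_lb(lo, hi)` must return a valid lower bound for phi
    on [lo, hi].  Returns (candidate boxes, upper bound phi_bar)."""
    phi_bar = float("inf")             # upper bound on the global minimum
    work, candidates = [box], []
    while work:
        lo, hi = work.pop()
        mid = 0.5 * (lo + hi)
        phi_bar = min(phi_bar, phi(mid))   # feasible point => upper bound
        if range_lb(lo, hi) > phi_bar:     # fathoming: discard the box
            continue
        if hi - lo < tol:                  # small box: possible optimizer
            candidates.append((lo, hi))
        else:                              # branch into two subregions
            work.extend([(lo, mid), (mid, hi)])
    return candidates, phi_bar

# Illustrative: phi(x) = (x - 1)**2 on [-4, 4].  A valid lower bound
# on [lo, hi] is 0 if 1 lies inside, else the smaller endpoint value.
phi = lambda x: (x - 1.0) ** 2
def range_lb(lo, hi):
    return 0.0 if lo <= 1.0 <= hi else min(phi(lo), phi(hi))

boxes, phi_bar = branch_and_bound(phi, range_lb, (-4.0, 4.0))
```

Every surviving candidate box has diameter below the tolerance, and the box containing the true minimizer x = 1 is never fathomed, because its lower bound never exceeds ϕ̄.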


(slide 20)


Obtaining ϕ̲

• Some algorithms rely on Lipschitz constants to obtain ϕ̲. These constants can be determined rigorously or heuristically.

• ϕ̲ may also be obtained with outwardly rounded interval arithmetic or non-rigorous interval arithmetic, explained below.

• Recent algorithms (especially during the past decade) involve relaxations of the global optimization problem to obtain ϕ̲. For example, the relaxation can be a linear program, solvable by commercial technology.
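For instance, a Lipschitz constant L for ϕ over a box gives the lower bound ϕ(x̌) − L·r, where x̌ is the box midpoint and r its radius. A one-dimensional sketch, with an illustrative function and constant:

```python
def lipschitz_lower_bound(phi, lo, hi, L):
    """Lower bound for phi on [lo, hi] via a Lipschitz constant L:
    |phi(x) - phi(mid)| <= L * radius for all x in the interval,
    so phi(x) >= phi(mid) - L * radius."""
    mid = 0.5 * (lo + hi)
    radius = 0.5 * (hi - lo)
    return phi(mid) - L * radius

# Illustrative: phi(x) = x**2 has Lipschitz constant 2 on [0, 1];
# the bound is phi(0.5) - 2 * 0.5 = -0.75, below the true minimum 0.
lb = lipschitz_lower_bound(lambda x: x * x, 0.0, 1.0, 2.0)
```

The bound is crude for wide boxes but tightens linearly as boxes shrink during branching.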


(slide 21)


Outline

1 Introduction
   - Theoretical and Practical Considerations
   - Alternate Approaches to Global Optimization
   - General Branch and Bound Algorithms

2 Interval Techniques
   - Bounding Ranges
   - Proving Existence and Uniqueness
   - Practicalities

3 Use of Structure: An Example
   - Obtaining the Objective Lower Bound
   - Gradient-Based Box Rejection
   - Structure in the Kuhn–Tucker Matrix


(slide 22)


Bounding Ranges with Interval Arithmetic

• Each basic interval operation ⊙ ∈ {+, −, ×, ÷, etc.} is defined by

x ⊙ y = { x ⊙ y | x ∈ x and y ∈ y }.

• This definition can be made operational; for example, for x = [x̲, x̄] and y = [y̲, ȳ], x + y = [x̲ + y̲, x̄ + ȳ]; similarly, ranges of functions such as sin, exp can be computed.

• Evaluation of an expression with this interval arithmetic gives bounds on the range of the expression.

• With directed rounding (e.g. using IEEE standard arithmetic), the computer can give mathematically rigorous bounds on ranges.
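The operational definitions above can be sketched with a minimal interval type. Outward (directed) rounding is omitted here, so the bounds are correct only up to floating-point roundoff rather than fully rigorous.

```python
class Interval:
    """A minimal interval type; outward rounding is omitted, so
    these bounds are not fully rigorous on a computer."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product range is spanned by the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(-1.0, 2.0)
# Evaluating x*x + x bounds the range of t**2 + t on [-1, 2].
# The result [-3, 6] encloses the true range [-0.25, 6] but
# overestimates it, since x*x gives [-2, 4] rather than [0, 4].
r = x * x + x
```

The overestimation in the example is the well-known dependency problem: interval arithmetic treats the two occurrences of x in x·x as independent.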



(slide 26)


Outline

1 Introduction
   - Theoretical and Practical Considerations
   - Alternate Approaches to Global Optimization
   - General Branch and Bound Algorithms

2 Interval Techniques
   - Bounding Ranges
   - Proving Existence and Uniqueness
   - Practicalities

3 Use of Structure: An Example
   - Obtaining the Objective Lower Bound
   - Gradient-Based Box Rejection
   - Structure in the Kuhn–Tucker Matrix


(slide 27)


Proving Existence and Uniqueness

Interval Newton Methods

For the system f(x) = 0, f : Rn → Rn, an interval Newton method is an operator N : x → x̃:

x̃ = N(f; x, x̌) = x̌ + v, where Σ(A, −f(x̌)) ⊂ v,

where A is a Lipschitz matrix for f over x and Σ(A, −f(x̌)) is the set of all v ∈ Rn such that there exists an A ∈ A with Av = −f(x̌).

Theorem

Suppose x̃ is the image of x under an interval Newton method. If x̃ ⊆ x, it follows that there exists a unique solution of f(x) = 0 within x.

(See A. Neumaier, Interval Methods for Systems ofEquations, Cambridge University Press, 1990.)


(slide 28)


Interval Newton Methods: A Practical Summary

• N(f; x, x̌) can be computed similarly to a classical point multivariate Newton step, but with interval arithmetic.

• Any solutions of f(x) = 0 in x must be in N(f; x, x̌). (Hence, an interval Newton method can be used to reduce the volume of x, an acceleration procedure.)

• N(f; x, x̌) ⊂ x implies there is a unique solution to f(x) = 0 in N(f; x, x̌), and hence in x.

• Another consequence: N(f; x, x̌) ∩ x = ∅ implies x is fathomed, and x may be discarded.

• Interval Newton methods are locally quadratically convergent.
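In one dimension the operator reduces to N(f; x, x̌) = x̌ − f(x̌)/f′(x), where f′(x) is an interval enclosing the derivative over x. A sketch, with an illustrative example function; the extended-division case where 0 ∈ f′(x) is omitted here.

```python
def interval_newton_step(f, lo, hi, dlo, dhi):
    """One 1-D interval Newton step on x = [lo, hi] from the
    midpoint x_check.  [dlo, dhi] must enclose f' over x and,
    in this sketch, must not contain 0.  Returns N(f; x, x_check);
    if it lies inside [lo, hi], a unique root exists in [lo, hi]."""
    assert dlo > 0 or dhi < 0, "sketch assumes 0 is not in f'(x)"
    x_check = 0.5 * (lo + hi)
    fx = f(x_check)
    # f(x_check) / [dlo, dhi] lies between the two endpoint quotients.
    q = (fx / dlo, fx / dhi)
    return x_check - max(q), x_check - min(q)

# Illustrative: f(x) = x**2 - 2 on [1, 2]; f'(x) = 2x lies in [2, 4].
n_lo, n_hi = interval_newton_step(lambda x: x * x - 2.0, 1.0, 2.0, 2.0, 4.0)
# N = [1.375, 1.4375] is inside [1, 2], proving a unique root (sqrt(2)).
```

Iterating the step shrinks the enclosure quadratically toward the root it has proven to exist.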


(slide 29)


Linear Relaxations: A Third “Interval” Technique

• If we replace ϕ by ϕl with ϕl ≤ ϕ on the feasible set, replace each equality constraint by a pair of inequality constraints, and replace each inequality constraint gi ≤ 0 by gi,l ≤ 0 with gi,l ≤ gi for x ∈ x, then the resulting relaxation has optimum less than or equal to the optimum of the original problem.

• If the relaxation is linear, LP technology can be used to obtain ϕ̲ and, with a different objective, narrower bounds on the coordinates.

• Due to a technique proposed by Neumaier and Shcherbina (and independently by Jansson), the bounds computed by the approximate LP solver can be made rigorous by evaluating a simple expression involving the duality gap with interval arithmetic. Infeasibility can also be similarly detected.

(slide 32)

Constraint Propagation

In simple terms, constraint propagation is based on

• solving a relation for a selected variable in terms of the other variables;

• using the bounds on the other variables (e.g. by evaluating the expression with interval arithmetic) to determine new bounds for the selected variable;

• iterating the process (in some order and with some stopping criteria) to narrow bounds on as many variables as is practical.
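A minimal sketch of the three steps above, on an invented pair of relations x + y = 10 and x = 2y with starting box [0, 10] × [0, 10]: each relation is solved for one variable, the bounds of the others are substituted, and the result is intersected with the current bounds.

```python
def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: the box is infeasible")
    return (lo, hi)

x, y = (0.0, 10.0), (0.0, 10.0)
for _ in range(40):
    x = intersect(x, (10.0 - y[1], 10.0 - y[0]))  # solve x + y = 10 for x
    y = intersect(y, (10.0 - x[1], 10.0 - x[0]))  # solve x + y = 10 for y
    y = intersect(y, (x[0] / 2.0, x[1] / 2.0))    # solve x = 2*y for y
    x = intersect(x, (2.0 * y[0], 2.0 * y[1]))    # solve x = 2*y for x
print(x, y)  # bounds narrow toward the unique solution x = 20/3, y = 10/3
```

On this pair the sweep is a contraction (the box width roughly halves per pass); as the next slides note, propagation stalls when it is not.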

(slide 35)

Outline

1 Introduction
  Theoretical and Practical Considerations
  Alternate Approaches to Global Optimization
  General Branch and Bound Algorithms

2 Interval Techniques
  Bounding Ranges
  Proving Existence and Uniqueness
  Practicalities

3 Use of Structure: An Example
  Obtaining the Objective Lower Bound
  Gradient-Based Box Rejection
  Structure in the Kuhn–Tucker Matrix

(slide 36)

Use of Interval Techniques: Pitfalls

• Interval dependency typically leads to overestimation in the ranges of ϕ and the constraints, if interval evaluation is applied naively.

• For a given width of x, the width of N(f; x, x̌) is proportional to the condition number of the Jacobi matrix of f, thus limiting the usefulness of interval Newton methods.

• Constraint propagation is analogous to a fixed-point iteration that only converges when the iteration is a contraction.

• Linear relaxations can suffer from use of a poor LP solver; they also depend on which relaxation is used.
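The first pitfall fits in a few lines: evaluating x(1 − x) on x = [0, 1] with naive interval arithmetic treats the two occurrences of x as if they varied independently.

```python
def imul(a, b):
    # interval product: min and max over the four endpoint products
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(ps), max(ps))

x = (0.0, 1.0)
one_minus_x = (1.0 - x[1], 1.0 - x[0])  # encloses 1 - x: [0, 1]
naive = imul(x, one_minus_x)            # [0, 1] * [0, 1] = [0, 1]
print(naive)                            # the true range of x*(1 - x) is only [0, 1/4]
```

The naive enclosure [0, 1] is four times wider than the true range [0, 1/4]: the dependency between the two occurrences of x is lost.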

(slide 40)

Use of Interval Techniques: Avoiding Pitfalls

• Design algorithms with non-interval computations to get approximations, then use interval computation only a posteriori to obtain rigorous bounds.

• Arrange expressions carefully to avoid overestimation in interval computations.

• Pose nonlinear systems carefully, taking advantage of problem structure, to avoid ill-conditioning in interval Newton methods.

• Choose implementation details in constraint propagation carefully, according to experience.

• Design stopping criteria for each of these processes rationally.

• Use experience to choose the best relaxations.

• Use experience to develop heuristics on when and how often to apply individual algorithm components.

(slide 47)

An Algorithm for Verified Unconstrained Minimax

Ph.D. Work of Siripawn Winter

We examine the special global optimization problem where the objective is

ϕ(x) = max_{1≤i≤m} fi(x)   or   ϕ(x) = max_{1≤i≤m} |fi(x)|,

where fi : x → R, x ⊂ IRn, and the fi are C² functions.

• The objective is non-differentiable.

• If Lemaréchal's technique is used to convert the system to a constrained system, the resulting Kuhn–Tucker equations are not well-suited to interval Newton methods.

• We present some simple ways in which the structure can be used to advantage.

(slide 49)

Obtaining ϕ, the Objective Lower Bound

A rigorous lower bound on the objective is obtained as

ϕ(x) = max_i { mig_{x∈x}(fi(x)) },

as illustrated here for n = 1, m = 2. (If the problem is to minimize the maximum over i of |fi|, then we replace fi by |fi| in the above.)

[Figure: for n = 1, m = 2, the graphs of |f1| and |f2| over the box x; the lower bound of |f2| is mig(f2), and the lower bound of |f1| is mig(f1) = ϕ(x).]
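The bound can be sketched with a hypothetical pair f1(x) = x², f2(x) = 2 − x on x = [0, 1] (plain fi rather than |fi|, so the lower endpoint of each interval evaluation plays the role of the mignitude): each fi is enclosed over the box and the maximum of the lower endpoints is taken.

```python
def isqr(a):
    # square of an interval (handles 0 in the interior correctly)
    lo, hi = a
    if lo >= 0:
        return (lo * lo, hi * hi)
    if hi <= 0:
        return (hi * hi, lo * lo)
    return (0.0, max(lo * lo, hi * hi))

x = (0.0, 1.0)
f1 = isqr(x)                    # encloses x^2 over x:  [0, 1]
f2 = (2.0 - x[1], 2.0 - x[0])   # encloses 2 - x over x: [1, 2]
phi_lower = max(f1[0], f2[0])   # rigorous lower bound on max(f1, f2) over x
print(phi_lower)
```

Here the bound is 1.0, which happens to be sharp: max(x², 2 − x) attains its minimum value 1 at x = 1.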

(slide 51)

A Gradient-Based Elimination Process

The basic idea is:

• Begin with a point x̌ ∈ x.

• Construct a box xr with one corner at x̌, by proceeding out from x̌ in coordinate directions in which the function fi with

fi(x̌) = max_{1≤j≤m} fj(x̌)

(the dominant function) is increasing.

• Define xr with a point x̂ diametrically opposite x̌, such that we are assured that fi(x) ≥ fi(x̌) for every x ∈ xr.

• Use a width-scaled version of the complementation process in §4.3.1 of Rigorous Global Search: Continuous Problems (Kluwer, 1996) to remove xr from the search region.
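The corner-box construction can be sketched as follows (the helper name, the point, the gradient values, and the radii are all illustrative; choosing the radii is described on the next slide):

```python
def corner_box(xc, grad, r):
    """Box with one corner at xc, extended by radii r in the directions
    in which the dominant function increases (signs from its gradient)."""
    box = []
    for c, g, rk in zip(xc, grad, r):
        if g >= 0:
            box.append((c, c + rk))   # f increasing in this coordinate: go right
        else:
            box.append((c - rk, c))   # f decreasing: go left
    return box

# Dominant function with gradient (1, -2) at the corner (0.5, 0.5):
print(corner_box([0.5, 0.5], [1.0, -2.0], [0.25, 0.1]))
# [(0.5, 0.75), (0.4, 0.5)]
```

By construction, moving from the corner x̌ into the box only increases the dominant function (locally), which is what lets the box be discarded after verification.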

(slide 55)

A Gradient-Based Elimination Process

Some Implementation Details

• The direction in which to proceed from x̌ in coordinate k, 1 ≤ k ≤ n, is chosen according to the sign of ∂fi/∂xk.

• The distances rk that we go in direction k are determined simultaneously, by solving a linear minimax problem based on the degree-2 Taylor polynomial of fi centered at x̌. In particular, we maximize the minimum rk for which the quadratic model predicts that ∂fi/∂xk(x̂) = 0.

• Once xr is constructed, it is verified with interval arithmetic that ϕ(x) ≥ ϕ(x̌) for every x ∈ xr.
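The last bullet in one dimension, with hypothetical values: suppose the dominant function is f(x) = x² with corner x̌ = 0.5, and the box xr = [0.5, 0.75] was built in the increasing direction. An interval evaluation then certifies f(x) ≥ f(x̌) throughout xr.

```python
def isqr(a):
    # square of an interval (handles 0 in the interior correctly)
    lo, hi = a
    if lo >= 0:
        return (lo * lo, hi * hi)
    if hi <= 0:
        return (hi * hi, lo * lo)
    return (0.0, max(lo * lo, hi * hi))

xc = 0.5
xr = (0.5, 0.75)
enclosure = isqr(xr)            # [0.25, 0.5625] encloses f over xr
assert enclosure[0] >= xc * xc  # hence f(x) >= f(xc), so phi(x) >= phi(xc), on xr
print(enclosure)
```

If the interval check failed (the quadratic model is only a prediction), the box xr would have to be shrunk before being removed from the search region.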

(slide 58)

A Gradient-Based Elimination Process

Illustration

[Figure: in the (x1, x2)-plane, the box xr has one corner at (x̌1, x̌2) and extends by the distances r1 and r2 in the chosen coordinate directions within the box x1 × x2.]

(slide 59)

Additional Techniques and Details

• We also expand x_r in the opposite direction (on the other side of x̌) to the extent possible.

• We compute approximate optima at various points in the algorithm, to obtain a better upper bound ϕ and to eliminate boxes around such local optima.

• The algorithm was implemented in Matlab, using routines from the Optimization Toolbox for local approximate solutions to the minimax problem and for solutions of the linear minimax problems.
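The interplay between approximate optima and box rejection in the bullets above is the usual midpoint test: boxes whose objective lower bound exceeds the best known upper bound cannot contain the global optimum. A minimal sketch, with an illustrative objective and naive bounds rather than the talk's Matlab code:

```python
# Midpoint-test sketch: reject boxes whose lower bound on
# phi(x) = (x1 - 1)^2 + (x2 - 1)^2 exceeds the best upper
# bound phi_bar found so far.  All data are illustrative.

def bound_phi(box):
    """Crude lower/upper bound of phi over box = [(lo, hi), (lo, hi)]."""
    lo = hi = 0.0
    for (a, b) in box:
        squares = [(a - 1.0) ** 2, (b - 1.0) ** 2]
        # (x_k - 1)^2 attains 0 on this coordinate iff 1 lies in [a, b]
        lo += 0.0 if a <= 1.0 <= b else min(squares)
        hi += max(squares)
    return lo, hi

boxes = [[(0.9, 1.1), (0.9, 1.1)],   # contains the optimum
         [(1.5, 2.0), (1.5, 2.0)]]   # should be rejected

phi_bar = bound_phi(boxes[0])[1]     # upper bound from a promising box
kept = [b for b in boxes if bound_phi(b)[0] <= phi_bar]
print(len(kept))                     # 1: the second box is discarded
```

Each approximate local optimum found during the search lowers ϕ̄, which in turn lets the test discard more boxes.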

(slide 62)

A Gradient-Based Elimination Process

Illustration of Algorithm Performance

We consider problem "OET5":

min_x max_{1 ≤ i ≤ 21} |f_i(x)|,

f_i(x) = x_4 − (x_1 t_i^2 + x_2 t_i + x_3)^2 − √t_i,   t_i = 0.25 + 0.75(i − 1)/20.

For illustration purposes, we fixed x_3 and x_4 at the values at an approximate solution. In the next illustration,
• red boxes were rejected by comparing the objective lower bound ϕ to the upper bound ϕ,
• yellow boxes were rejected by the gradient-based elimination process,
• green boxes are constructed about approximate optima.
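The OET5 objective can be written down directly from the definition above. A short sketch; the spot check with x = 0 is just for illustration and is not a result from the talk:

```python
import numpy as np

# Minimax objective of problem OET5:
#   f_i(x) = x4 - (x1*t_i^2 + x2*t_i + x3)^2 - sqrt(t_i),
#   t_i = 0.25 + 0.75*(i - 1)/20,  i = 1, ..., 21.
t = 0.25 + 0.75 * (np.arange(1, 22) - 1) / 20.0

def oet5(x):
    x1, x2, x3, x4 = x
    f = x4 - (x1 * t**2 + x2 * t + x3) ** 2 - np.sqrt(t)
    return np.max(np.abs(f))

# At x = 0, f_i = -sqrt(t_i), so the max of |f_i| is sqrt(t_21) = 1.
print(oet5([0.0, 0.0, 0.0, 0.0]))  # 1.0
```

The nodes t_i run from 0.25 (i = 1) to 1.0 (i = 21), so the problem fits a quadratic-squared model to √t on [0.25, 1] in the Chebyshev sense.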

(slide 63)

The Branch and Bound Process for OET5

[Figure: six panels, "Removing boxes, Iteration 0" through "Iteration 5," each plotting x_2 versus x_1 over [−2, 2] × [−2, 2], with points labeled xs and boxes labeled xr marking approximate solutions and rejected boxes, respectively.]

(slide 64)

Outline

1 Introduction
   • Theoretical and Practical Considerations
   • Alternate Approaches to Global Optimization
   • General Branch and Bound Algorithms

2 Interval Techniques
   • Bounding Ranges
   • Proving Existence and Uniqueness
   • Practicalities

3 Use of Structure: An Example
   • Obtaining the Objective Lower Bound
   • Gradient-Based Box Rejection
   • Structure in the Kuhn–Tucker Matrix

(slide 65)

Kuhn–Tucker versus Fritz John

• The normalized Fritz John equations have a priori bounds on the multipliers, whereas the Kuhn–Tucker equations do not; the a priori bounds are an advantage in interval Newton methods in automatically verified computations.

• On the other hand, the Fritz John equations tend to present numerical difficulties in practice that the Kuhn–Tucker equations do not.

• We describe a relationship between these two systems for minimax problems.

(slide 68)

Lemaréchal Reformulation

minimize x_{n+1}

subject to  p_i = f_i(x_1, …, x_n) − x_{n+1} ≤ 0,  i = 1, …, m,
            p_{i+m} = −f_i(x_1, …, x_n) − x_{n+1} ≤ 0,  i = 1, …, m,

has normalized Fritz John system

F(x, u) = ( u_0 ∇ϕ(x) + Σ_{j=1}^{2m} u_j ∇p_j ,  u_1 p_1 , … , u_{2m} p_{2m} ,  u_0 + Σ_{j=1}^{2m} u_j − 1 )ᵀ = 0,

where 0 ≤ u_i ≤ 1, i = 0, 1, …, 2m.
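The reformulation works because, for any fixed x, the two constraint families say exactly that x_{n+1} ≥ |f_i(x)| for every i, so the smallest feasible x_{n+1} is max_i |f_i(x)|. A small numerical check of this fact, with illustrative f_i (not the talk's test problem):

```python
# For fixed x, feasibility of the Lemarechal constraints
#   f_i(x) - x_{n+1} <= 0  and  -f_i(x) - x_{n+1} <= 0
# is exactly x_{n+1} >= |f_i(x)| for every i, so the minimal
# feasible x_{n+1} equals max_i |f_i(x)|.  Illustrative f_i:

def fs(x):
    return [x[0] - 1.0, x[1] + 2.0, x[0] * x[1]]

def min_feasible_slack(x):
    # smallest x_{n+1} satisfying both constraint families
    return max(max(fi, -fi) for fi in fs(x))

x = (0.5, -1.0)
print(min_feasible_slack(x) == max(abs(fi) for fi in fs(x)))  # True
```

Minimizing x_{n+1} over x and x_{n+1} jointly therefore solves the original minimax problem, with a smooth objective and smooth constraints.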

(slide 69)

Fritz John and Kuhn–Tucker Equivalence

Symbolic manipulation of the Fritz John system gives

F(X) = ( 0.5 ∇ϕ(x) + Σ_{j=1}^{2m} u_j ∇p_j ,  u_1 p_1 , … , u_{2m} p_{2m} )ᵀ = 0.

Thus, the Fritz John system is equivalent to the Kuhn–Tucker system with Kuhn–Tucker multipliers (i.e., Lagrange multipliers) equal to double the Fritz John multipliers.
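The doubling can be checked directly: with u_0 = 0.5, dividing the Fritz John stationarity condition by u_0 gives the Kuhn–Tucker stationarity with λ_j = 2u_j. A sketch with randomly generated (illustrative) gradients:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m2 = 3, 4                            # dimension and number of constraints (2m)
grad_p = rng.standard_normal((m2, n))   # illustrative constraint gradients
u = rng.random(m2)                      # illustrative Fritz John multipliers
u0 = 0.5

# Choose grad_phi so the Fritz John stationarity holds exactly:
#   u0*grad_phi + sum_j u_j*grad_p_j = 0
grad_phi = -(u @ grad_p) / u0

# Kuhn-Tucker stationarity with lambda = 2*u:
#   grad_phi + sum_j lambda_j*grad_p_j = 0
lam = 2.0 * u
residual = grad_phi + lam @ grad_p
print(np.allclose(residual, 0.0))  # True
```

So any Fritz John point of the Lemaréchal reformulation yields a Kuhn–Tucker point with bounded multipliers, combining the numerical behavior of the Kuhn–Tucker system with the a priori multiplier bounds of the normalized Fritz John system.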

(slide 70)

Prospects for Interval Newton Methods in Minimax Problems

• To date, we have been unable to effectively exploit interval Newton methods as filters to reduce the size of boxes.

• However, we have used interval Newton methods effectively to prove existence and uniqueness of solutions in boxes constructed around approximate solutions.

(slide 71)
