TRANSCRIPT

Page 1

Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under Contract DE-AC04-94AL85000

Formulations for Surrogate-Based Optimization

Using Data Fit and Multifidelity Models

Daniel M. Dunlavy and Michael S. Eldred
Sandia National Laboratories, Albuquerque, NM

SIAM Conference on Computational Science and Engineering
February 19-23, 2007

SAND2007-0813C

Page 2: Outline

Introduction to Surrogate-Based Optimization
– Data fit case
– Model hierarchy case

Algorithmic Components
– Approximate subproblem
– Iterate acceptance logic
– Merit functions
– Constraint relaxation
– Corrections and convergence

Computational Experiments
– Barnes
– CUTE
– MEMS design

Page 3

Introduction to Surrogate-Based Optimization (SBO)

• Purpose: Reduce the number of expensive, high-fidelity simulations
– Use a succession of approximate (surrogate) models

• Drawback: Approximations generally have a limited range of validity
– Trust regions adaptively manage this range based on efficacy of the approximations throughout the optimization run

• Benefit: SBO algorithms can be provably convergent
– E.g., using trust-region globalization and local 1st-order consistency

• Surrogate models of interest
– Data fits (e.g., local, multipoint, global)
– Multifidelity (e.g., multigrid optimization)
– Reduced-order models (e.g., spectral decomposition)
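The trust-region management described above can be sketched as a short loop. This is a minimal illustration under stated assumptions, not DAKOTA's implementation: `sbo_minimize` and `build_surrogate` are hypothetical names, and the inner solve and resizing thresholds are textbook choices.

```python
import numpy as np
from scipy.optimize import minimize

def sbo_minimize(f_truth, build_surrogate, x0, delta0=1.0, max_iter=50, tol=1e-6):
    """Minimal trust-region SBO loop (illustrative sketch).

    f_truth         -- expensive high-fidelity objective
    build_surrogate -- returns a cheap model valid near the trust-region center
    """
    x = np.asarray(x0, dtype=float)
    delta, f_x = delta0, f_truth(x)
    for _ in range(max_iter):
        s_hat = build_surrogate(f_truth, x, delta)
        # Minimize the surrogate inside the current trust region (cheap inner solve).
        bounds = [(xi - delta, xi + delta) for xi in x]
        x_new = minimize(s_hat, x, bounds=bounds).x
        f_new = f_truth(x_new)                      # one truth evaluation per cycle
        pred = s_hat(x) - s_hat(x_new)              # predicted improvement
        rho = (f_x - f_new) / pred if pred > 0 else -1.0
        if rho > 0:                                 # accept step if truth improved
            x, f_x = x_new, f_new
        # Adapt trust-region size from the actual/predicted improvement ratio.
        delta *= 2.0 if rho > 0.75 else (0.5 if rho < 0.25 else 1.0)
        if delta < tol:
            break
    return x, f_x
```

With local 1st-order consistency enforced inside `build_surrogate`, loops of this shape are the ones that admit the convergence proofs mentioned above.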

Page 4

Trust Region SBO – Data Fit Case

Data fit surrogates:
• Global
– Polynomial response surfaces
– Artificial neural networks
– Splines
– Kriging models
– Gaussian processes

• Local
– 1st/2nd-order Taylor series

• Multipoint
– Two-Point Exponential Approximation (TPEA)
– Two-Point Adaptive Nonlinearity Approximation (TANA)

[Figure: sequence of trust regions]

[Figure: surrogate fits of f over (x1, x2) – Quadratic Polynomial, Neural Network, Kriging, Splines]

• Smoothing
• O(10^1) variables
• Local consistency
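As one concrete instance of the global data fits listed above, a quadratic polynomial response surface can be fit by least squares over samples drawn in the trust region; the regression smooths noise in the truth model. A sketch (the function name and sampling scheme are illustrative, not DAKOTA's API):

```python
import numpy as np

def fit_quadratic_surrogate(f, x_c, delta, n_samples=30, rng=None):
    """Least-squares quadratic response surface over a trust region (sketch).

    Fits f_hat(x) = c0 + g.(x - x_c) + quadratic terms from samples of the
    truth model f, illustrating the 'smoothing' property of global data fits.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    x_c = np.asarray(x_c, dtype=float)
    n = x_c.size
    X = x_c + delta * rng.uniform(-1.0, 1.0, size=(n_samples, n))
    y = np.array([f(x) for x in X])
    D = X - x_c
    # Design matrix columns: [1, d_i, d_i * d_j for i <= j]
    cols = [np.ones(n_samples)] + [D[:, i] for i in range(n)]
    cols += [D[:, i] * D[:, j] for i in range(n) for j in range(i, n)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)

    def f_hat(x):
        d = np.asarray(x, dtype=float) - x_c
        feats = [1.0] + [d[i] for i in range(n)]
        feats += [d[i] * d[j] for i in range(n) for j in range(i, n)]
        return float(np.dot(coef, feats))
    return f_hat
```

For an exactly quadratic truth model the fit reproduces it; for a noisy model, the least-squares fit averages the noise away over the trust region.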

Page 5

Trust Region SBO – Multifidelity Case

Multifidelity surrogates:
– Loosened solver tolerances
– Coarsened discretizations: e.g., mesh size
– Omitted physics: e.g., Euler CFD, panel methods

Multifidelity models in SBO:
– Truth evaluation only at center of trust region
• Smoothing character is lost → requires a well-behaved low-fidelity model

– Correction quality is crucial
• Additive: f_HF(x) = f_LF(x) + α(x)
• Multiplicative: f_HF(x) = f_LF(x) · β(x)
• Combined (convex combination of additive/multiplicative)

– May require design vector mapping
• Space mapping
• Corrected space mapping
• POD mapping
• Hybrid POD space mapping

[Figure: sequence of trust regions]
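The three correction forms above are short to state in code. A sketch enforcing zeroth-order consistency at the trust-region center x_c (first- and second-order versions would also match derivatives there; the function name is illustrative):

```python
def build_corrections(f_hf, f_lf, x_c):
    """Zeroth-order additive/multiplicative/combined corrections (sketch).

    All three corrected models reproduce the high-fidelity value exactly at
    the trust-region center x_c, as SBO consistency requires.
    """
    alpha = f_hf(x_c) - f_lf(x_c)          # additive offset alpha(x_c)
    beta = f_hf(x_c) / f_lf(x_c)           # multiplicative ratio beta(x_c)
    f_add = lambda x: f_lf(x) + alpha      # f_HF ~ f_LF + alpha
    f_mult = lambda x: f_lf(x) * beta      # f_HF ~ f_LF * beta
    # Combined: convex combination; still exact at x_c for any gamma in [0, 1].
    f_comb = lambda x, gamma=0.5: gamma * f_add(x) + (1.0 - gamma) * f_mult(x)
    return f_add, f_mult, f_comb
```

Note the multiplicative form assumes f_LF(x_c) is nonzero and well scaled, which is one reason correction quality hinges on a well-behaved low-fidelity model.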

Page 6: Outline

Intro to Surrogate-Based Optimization
– Data fit case
– Model hierarchy case

Algorithmic Components
– Approximate subproblem
– Iterate acceptance logic
– Merit functions
– Constraint relaxation
– Corrections and convergence

Computational Experiments
– Barnes
– CUTE
– MEMS design

Page 7

SBO Algorithm Components:
Approximate Subproblem Formulations

[Equations: four formulations labeled TRAL (trust-region augmented Lagrangian), IPTRSAO (interior-point trust-region sequential approximate optimization), SQP-like, and direct surrogate]

Page 8

SBO Algorithm Components:
Iterate Acceptance Logic

Trust region ratio
• Ratio of actual to predicted improvement

Filter method
• Uses concept of Pareto optimality applied to objective and constraint violation
• Still need logic to adapt TR size for accepted steps
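Both acceptance mechanisms are compact enough to sketch. Below, `tr_ratio_update` uses textbook resizing thresholds (eta1/eta2 are common defaults, not values from the slides), and `filter_accept` keeps a list of non-dominated (objective, constraint-violation) pairs:

```python
def tr_ratio_update(f_old, f_new, m_old, m_new, delta,
                    eta1=0.25, eta2=0.75, shrink=0.5, grow=2.0):
    """Trust-region ratio acceptance and resizing (textbook rules, sketch).

    f_*: truth objective values; m_*: surrogate (model) values.
    """
    pred = m_old - m_new
    rho = (f_old - f_new) / pred if pred != 0 else 0.0
    accept = rho > 0                       # truth actually improved
    if rho < eta1:
        delta *= shrink                    # poor prediction: shrink region
    elif rho > eta2:
        delta *= grow                      # good prediction: expand region
    return accept, delta

def filter_accept(filt, f_new, c_new):
    """Filter acceptance (sketch): reject any (f, violation) pair dominated
    by an existing filter entry, in the Pareto sense."""
    if any(f <= f_new and c <= c_new for f, c in filt):
        return False                       # dominated by an existing entry
    # Accept, pruning entries that the new point dominates.
    filt[:] = [(f, c) for f, c in filt if not (f_new <= f and c_new <= c)]
    filt.append((f_new, c_new))
    return True
```

As the slide notes, the filter decides acceptance only; separate logic (such as the ratio test) is still needed to resize the trust region for accepted steps.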

Page 9

SBO Algorithm Components:
Merit Function Selections

Penalty:

Adaptive penalty: adapts r_p using monotonic increases in the iteration offset value in order to accept any iterate that reduces the constraint violation; mimics a (monotonic) filter

Lagrangian:

Augmented Lagrangian:

Multiplier estimation:

Multiplier estimation:

(derived from elimination of slacks)

(drives Lagrangian gradient to KKT)
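The equation images for these merit functions did not carry over; the standard forms these labels usually denote are as follows (a reconstruction, not taken verbatim from the slide, with ψ(x) a constraint-violation measure and r_p the penalty parameter):

```latex
% Penalty merit function
\Phi(x; r_p) = f(x) + r_p\, \psi(x)^T \psi(x)

% Lagrangian merit function
\ell(x, \lambda) = f(x) + \lambda_g^T g(x) + \lambda_h^T h(x)

% Augmented Lagrangian merit function
\Phi(x, \lambda; r_p) = f(x) + \lambda^T \psi(x) + r_p\, \psi(x)^T \psi(x)

% Multiplier estimation for the augmented Lagrangian (first-order update)
\lambda \leftarrow \lambda + 2\, r_p\, \psi(x)

% Multiplier estimation for the Lagrangian
% (least-squares fit driving the Lagrangian gradient toward the KKT condition)
\min_{\lambda}\; \bigl\| \nabla f(x) + \nabla g(x)^T \lambda_g + \nabla h(x)^T \lambda_h \bigr\|_2
```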

Page 10

SBO Algorithm Components:
Constraint Relaxation

Relax nonlinear constraints: approximate subproblem → relaxed subproblem

Composite step:
• Byrd-Omojokun
• Celis-Dennis-Tapia
• MAESTRO

Homotopy:
• Heuristic (IPTRSAO-like)
• Probability one

Estimate :

Leave a portion of the trust-region volume feasible for the relaxed subproblem

Page 11: Outline

Intro to Surrogate-Based Optimization
– Data fit case
– Model hierarchy case

Algorithmic Components
– Approximate subproblem
– Iterate acceptance logic
– Merit functions
– Constraint relaxation
– Corrections and convergence

Computational Experiments
– Barnes
– CUTE
– MEMS design

Page 12

Data Fit SBO Experiment #1
Barnes Problem: Algorithm Component Test Matrices

[Figure: contour plot of the Barnes function]

Surrogates
• Linear polynomial, quadratic polynomial, 1st-order Taylor series, TANA

Approximate subproblem
• Function: f(x), Lagrangian, augmented Lagrangian
• Constraints: g(x)/h(x), linearized, none

Iterate acceptance logic
• Trust region ratio, filter

Merit function
• Penalty, adaptive penalty, Lagrangian, augmented Lagrangian

Constraint relaxation
• Heuristic homotopy, none

Page 13

Data Fit SBO Experiment #2
Barnes Problem: Effect of Constraint Relaxation

Sample homotopy constraint relaxation; 106 starting points on uniform grid

Optimal/feasible directions differ by < 90°: relaxation beneficial to achieve balance.

Optimal/feasible directions opposed: relaxation harmful, since feasibility must ultimately take precedence.

Page 14

Data Fit SBO Experiment #3
Selected CUTE Problems: Test Matrix Summaries

Problem    Description                        #Var.  #Ineq. Constr.  #Eq. Constr.
AVGASA     Academic problem                     6          6               0
CSFI1      Continuous caster design             5          2               2
CSFI2      Continuous caster design             5          2               2
FCCU       Fluid catalytic cracker modeling    19          0               8
HEART8     Dipole model of the heart            8          0               8
HIMMELBK   Nonlinear blending                  24         14               0
HS073      Optimal cattle feeding               4          2               1
HS087      Electrical networks                  9          0               4
HS093      Transformer design                   6          2               0
HS100      Academic problem                     7          4               0
HS114      Alkylation                          10          8               3
HS116      Membrane separation                 13         15               0
LEWISPOL   Number theory                        6          0               9
PENTAGON   Applied geometry                     6         12               0

Surrogates
• TANA better than local

Approximate subproblem
• Function: f(x)
• Constraints: g(x)/h(x)

Iterate acceptance logic
• TR ratio varies considerably
• Filter more consistent/robust

Merit function
• No clear preference

Constraint relaxation
• No clear preference

Page 15

Multifidelity SBO Testing:
Bi-Stable MEMS Switch

13 design vars: Wi, Li, i

[Figure annotation: 38.6x]

Page 16

MEMS Design Results:
Single-Fidelity and Multifidelity

[Figure: optimization histories; single-fidelity runs use NPSOL and DOT]

Current best SBO approaches carried forward:
• Approximate subproblem: original function, original constraints
• Augmented Lagrangian merit function
• Trust region ratio acceptance

Note: Both single-fidelity runs fail with forward differences

Page 17

Concluding Remarks

Data Fit SBO Performance Comparison:
• Primary conclusions
– Approx. subproblem: direct surrogate (original obj., original constr.)
– Iterate acceptance: TR ratio more variable, filter more robust
• Secondary conclusions
– Merit function: augmented Lagrangian, adaptive penalty
– Constraint relaxation: homotopy; adaptive logic needed

Multifidelity SBO Results:
• Robustness: SBO less sensitive to gradient accuracy
• Expense: equivalent high-fidelity work reduced by ~3x

Future Work:
• Penalty-free, multiplier-free approaches (filter-based merit fn/TR resizing)
• Constraint relaxation linking optimal/feasible directions to
• Investigation of SBO performance for alternate subproblem formulations

Page 18

DAKOTA

Capabilities available in DAKOTA v4.0+:
• Available for download: http://www.cs.sandia.gov/DAKOTA

Paper containing more details:
• Available for download: http://www.cs.sandia.gov/~dmdunla/publications/Eldred_FSB_2006.pdf

Questions/comments:
• Email: [email protected]

Page 19

Extra Slides

Page 20

Data Fit SBO Experiment #1
Barnes Problem: Algorithm Component Test Matrices

Starting point (30,40)

Starting point (65,1)

Starting point (10,20)

Page 21

Data Fit SBO Experiment #1
Barnes Problem: Algorithm Component Test Matrices

Starting point (30,40)

Starting point (65,1)

Starting point (10,20)

Page 22

Data Fit SBO Experiment #3
Selected CUTE Problems: Test Matrix Summaries

3 top performers for each problem: SP Obj – SP Con – Surr – Merit – Accept – Con Relax

Page 23

SBO Algorithm Components:
Corrections and Convergence

• Corrections:
– Additive, multiplicative, combined
– Zeroth-order, first-order, second-order (full, FD, quasi)

• Soft convergence assessment:
– Diminishing returns

• Hard convergence assessment:
– Bound-constrained LLS for updating multipliers for the standard Lagrangian minimizes the KKT residual
– If feasible and KKT residual < tolerance, then hard convergence achieved
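The hard-convergence bullet can be illustrated with a bound-constrained linear least-squares multiplier estimate. A sketch using SciPy (function name and interface are illustrative, not DAKOTA's internal routine; inequality multipliers are constrained nonnegative):

```python
import numpy as np
from scipy.optimize import lsq_linear

def hard_convergence(grad_f, jac_g, g, tol=1e-6):
    """Hard-convergence check (sketch).

    Estimates inequality multipliers by a bound-constrained linear
    least-squares solve minimizing the KKT residual || grad_f + J_g^T lam ||
    subject to lam >= 0, then tests feasibility and residual size.
    """
    res = lsq_linear(jac_g.T, -grad_f, bounds=(0.0, np.inf))
    lam = res.x
    kkt = np.linalg.norm(jac_g.T @ lam + grad_f)   # KKT stationarity residual
    feasible = bool(np.all(g <= tol))              # g(x) <= 0 convention
    return feasible and kkt < tol, lam, kkt
```

Because the truth gradients appear only in this check (and in corrections), the surrogate iterations themselves stay cheap.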

Page 24

Local Correction Derivations

Exact additive/multiplicative:

Then:

where:

Approximate A,B with 2nd-order Taylor series centered at xc:
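Spelled out, the derivation these labels outline takes the following standard form (a reconstruction, since the slide's equations were not captured in the transcript; A and B denote the exact additive and multiplicative corrections):

```latex
% Exact corrections, defined so the corrected low-fidelity model is exact everywhere:
A(x) = f_{HF}(x) - f_{LF}(x), \qquad
B(x) = \frac{f_{HF}(x)}{f_{LF}(x)}

% Then:
\hat{f}_{HF}(x) = f_{LF}(x) + A(x)
\quad\text{or}\quad
\hat{f}_{HF}(x) = f_{LF}(x)\, B(x)

% Approximate A, B with 2nd-order Taylor series centered at x_c, e.g.:
A(x) \approx A(x_c) + \nabla A(x_c)^T (x - x_c)
       + \tfrac{1}{2}\,(x - x_c)^T \nabla^2 A(x_c)\,(x - x_c)

% so the corrected model matches f_HF in value, gradient, and (approximate)
% Hessian at the trust-region center x_c.
```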

Page 25

Multipoint Correction Derivation

Combined additive/multiplicative (notional):

Assume a weighted sum of the additive and multiplicative corrections which preserves consistency.

Enforce an additional matching condition at x_p, where x_p is the previous x_c or a rejected x*.

Can be used to preserve global accuracy while satisfying local consistency.
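One way to realize the weighted-sum construction is the following (a reconstruction, since the slide's equations were lost; γ is the combination weight, and the hatted models are the additively and multiplicatively corrected surrogates):

```latex
% Combined correction: convex combination of the corrected models
\hat{f}_{HF}(x) = \gamma\, \hat{f}_{\alpha}(x) + (1 - \gamma)\, \hat{f}_{\beta}(x),
\qquad \gamma \in [0, 1]

% Any fixed gamma preserves local consistency at x_c, since both corrected
% models already match f_HF there. Matching the truth at the additional
% point x_p (the previous x_c, or a rejected x^*) then fixes gamma:
\gamma = \frac{f_{HF}(x_p) - \hat{f}_{\beta}(x_p)}
              {\hat{f}_{\alpha}(x_p) - \hat{f}_{\beta}(x_p)}
```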

Page 26

Strategy: control of multiple iterators and models

[Diagram: a Strategy coordinating multiple Iterator/Model pairs]

Coordination:
• Nested
• Layered
• Cascaded
• Concurrent
• Adaptive/Interactive

Parallelism:
• Asynchronous local
• Message passing
• Hybrid
• 4 nested levels with master-slave/dynamic or peer/static scheduling

DAKOTA Framework

[Diagram: DAKOTA framework components]

Model:
• Parameters
– Design: continuous, discrete
– Uncertain: normal/logn, uniform/logu, triangular, beta/gamma, EV I/II/III, histogram, interval
– State: continuous, discrete
• Interface
– Application: system, fork, direct, grid
– Approximation: global (polynomial 1/2/3, NN, kriging, MARS, RBF), multipoint (TANA3), local (Taylor series), hierarchical
• Responses
– Functions: objectives, constraints, least sq. terms, generic
– Gradients: numerical, analytic
– Hessians: numerical, analytic, quasi

Strategy:
• Hybrid, SurrBasedOpt, OptUnderUnc, Branch&Bound/PICO, Pareto/MStart, 2ndOrderProb, UncOfOptima

Iterator:
• Optimizer: COLINY, NPSOL, DOT, OPT++, CONMIN, NLPQL, JEGA
• LeastSq: NLSSOL, NL2SOL, GN
• UQ: LHS/MC, Reliability, DSTE, SFEM, QMC/CVT
• ParamStudy: Vector, List, MultiD, Center
• DoE: DDACE, CCD/BB