TRANSCRIPT
Survey of optimization methods for local and global search
Raphael Haftka
DCMM Summer course
July 13, 2006
Unconstrained optimization
Direct constrained optimization
Constrained optimization transformed to unconstrained
Unconstrained optimization
Optimality conditions for local and global optimization
Survey of algorithms described in Chapter 4
X = FMINUNC(FUN,X0,OPTIONS) minimizes with the default optimization parameters replaced by values in the structure OPTIONS, an argument created with the OPTIMSET function.
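As a minimal illustration of this call (the quadratic test function and the option settings are assumptions for the sketch, not part of the slides):

fun = @(x) (x(1)-1)^2 + 3*(x(2)+2)^2;      % assumed quadratic test function
x0 = [0, 0];                               % starting point
options = optimset('Display','iter');      % OPTIONS structure created with OPTIMSET
[x, fval] = fminunc(fun, x0, options)      % x should approach [1, -2], fval approaches 0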
Optimality conditions
To minimize $f(x_1, \dots, x_n)$, a necessary condition is $\partial f/\partial x_i = 0$ for every $i$, i.e. $\nabla f = 0$
Sufficient condition for a minimum is that the matrix of second derivatives (the Hessian) is positive definite
Simplest way to check positive definiteness is the eigenvalues: all eigenvalues need to be positive
Necessary condition: the matrix is positive semi-definite, i.e. all eigenvalues are non-negative
Types of stationary points
Positive definite: Minimum
Positive semi-definite: possibly minimum
Indefinite: Saddle point
Negative semi-definite: possibly maximum
Negative definite: maximum
Example
$f = x_1^2 + x_1 x_2 + x_2^2$
$\dfrac{\partial f}{\partial x_1} = 2x_1 + x_2, \qquad \dfrac{\partial f}{\partial x_2} = x_1 + 2x_2$
Stationary point: $x_1 = x_2 = 0$
Hessian matrix: $Q = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$
Eigenvalues: $\lambda_{1,2} = 1,\, 3$, so the stationary point is a minimum
Change the cross term to $3x_1x_2$: $f = x_1^2 + 3x_1x_2 + x_2^2$, so $Q = \begin{pmatrix} 2 & 3 \\ 3 & 2 \end{pmatrix}$
Eigenvalues: $\lambda_{1,2} = -1,\, 5$, so the stationary point is a saddle point
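The eigenvalue check can be done directly in MATLAB (a quick sketch using the two Hessians above):

Q1 = [2 1; 1 2];    % Hessian of f = x1^2 + x1*x2 + x2^2
eig(Q1)             % 1 and 3: positive definite, so the stationary point is a minimum
Q2 = [2 3; 3 2];    % Hessian after the cross term is changed to 3*x1*x2
eig(Q2)             % -1 and 5: indefinite, so the stationary point is a saddle point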
Unconstrained local minimization
The necessity for one-dimensional searches
$\mathbf{x}_{k+1} = \mathbf{x}_k + \alpha_k \mathbf{s}_k$
The most intuitive choice of $\mathbf{s}_k$ is the direction of steepest descent, $\mathbf{s}_k = -\mathbf{g}_k = -\nabla f(\mathbf{x}_k)$
This choice, however, is very poor
Methods are based on the dictum that all functions of interest are locally quadratic
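A minimal steepest-descent sketch with a one-dimensional search along each direction (the starting point and the use of FMINBND for the line search are assumptions; the function is the quadratic from the earlier example):

f = @(x) x(1)^2 + x(1)*x(2) + x(2)^2;           % quadratic example function
g = @(x) [2*x(1)+x(2); x(1)+2*x(2)];            % its gradient
x = [2; 1];                                     % assumed starting point
for k = 1:20
    s = -g(x);                                  % steepest-descent direction s_k = -g_k
    alpha = fminbnd(@(a) f(x + a*s), 0, 2);     % one-dimensional search for the step length
    x = x + alpha*s;                            % x_{k+1} = x_k + alpha_k*s_k
end
x                                               % approaches the minimum at the origin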
Conjugate gradients
$\mathbf{s}_k = -\mathbf{g}_k + \beta_k \mathbf{s}_{k-1}, \qquad \beta_k = \dfrac{\mathbf{g}_k^T \mathbf{g}_k}{\mathbf{g}_{k-1}^T \mathbf{g}_{k-1}}$
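A sketch of the conjugate-gradient recurrence above in its Fletcher-Reeves form, again on the quadratic example (the exact line search via FMINBND and the starting point are assumptions):

f = @(x) x(1)^2 + x(1)*x(2) + x(2)^2;
grad = @(x) [2*x(1)+x(2); x(1)+2*x(2)];
x = [2; 1];  g = grad(x);  s = -g;          % first direction: steepest descent
for k = 1:5
    alpha = fminbnd(@(a) f(x + a*s), 0, 2); % one-dimensional search along s
    x = x + alpha*s;
    gnew = grad(x);
    beta = (gnew'*gnew)/(g'*g);             % beta_k = g_k'*g_k / g_{k-1}'*g_{k-1}
    s = -gnew + beta*s;                     % new conjugate direction
    g = gnew;
end
x                                           % converges to the origin in about two exact steps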
Newton and quasi-Newton methods
Newton: $\mathbf{x}_{k+1} = \mathbf{x}_k + \mathbf{s}_k, \qquad \mathbf{s}_k = -\mathbf{Q}_k^{-1}\mathbf{g}_k$, where $\mathbf{Q}_k$ is the Hessian
Quasi-Newton methods use successive evaluations of gradients to obtain an approximation to the Hessian or its inverse
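A sketch of a single Newton step for the quadratic example (the Hessian is constant here, so one step reaches the minimum; the starting point is an assumption):

grad = @(x) [2*x(1)+x(2); x(1)+2*x(2)];     % gradient of f = x1^2 + x1*x2 + x2^2
Q = [2 1; 1 2];                             % Hessian (constant for a quadratic)
x = [2; 1];                                 % assumed starting point
s = -(Q\grad(x));                           % Newton direction: solve Q*s = -g
x = x + s                                   % lands exactly on the minimum at the origin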
Optimization with Constraints
Standard formulation as in Elements of Structural Optimization
Treatment by derivative-based optimizers
Following constraint boundaries: gradient projection (Section 5.5)
Steering away from constraints: feasible directions (Section 5.6)
Using penalties to convert to an unconstrained problem (Section 5.7)
Combining penalty with Lagrange multipliers (Section 5.8)
Projected Lagrangian methods (Section 5.9)
Gradient projection and reduced gradient methods
Find a good direction in the space tangent to the active constraints
Move a distance along it, then restore the design to the constraint boundaries
The Feasible Directions method
Compromise between constraint avoidance and objective reduction
Penalty function methods
Quadratic penalty function
A gradual rise of the penalty parameter leads to a sequential unconstrained minimization technique (SUMT). Why is it important?
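For completeness (the slide's formula is not reproduced in this transcript), the quadratic penalty function for equality constraints $h_j(\mathbf{x}) = 0$ is usually written as

$\phi(\mathbf{x}, r) = f(\mathbf{x}) + r \sum_j \left[h_j(\mathbf{x})\right]^2$

and the penalty parameter r is increased from one unconstrained minimization to the next.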
Example 5.7.1
Minimize $f = 10x_1^2 + x_2^2$
such that $x_1 + x_2 = 4$
Augmented function: $\phi = 10x_1^2 + x_2^2 + r\,(4 - x_1 - x_2)^2$
Solution: $x_1 = \dfrac{4r}{10 + 11r}, \qquad x_2 = \dfrac{40r}{10 + 11r}$
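A minimal SUMT sketch of this example in MATLAB, using FMINSEARCH as the unconstrained minimizer (the penalty schedule and starting point are assumptions):

f = @(x) 10*x(1)^2 + x(2)^2;
h = @(x) 4 - x(1) - x(2);                         % equality constraint written as h(x) = 0
x = [1, 1];                                       % assumed starting point
for r = [1 10 100 1000]                           % gradually rising penalty parameter
    phi = @(x) f(x) + r*h(x)^2;                   % augmented (penalized) function
    x = fminsearch(phi, x);                       % unconstrained minimization
    fprintf('r = %5g   x = (%.4f, %.4f)\n', r, x(1), x(2));
end
% The iterates follow x1 = 4r/(10+11r), x2 = 40r/(10+11r) and approach (4/11, 40/11).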
Contours for r=1
[contour plot of the augmented function; axes span 1 to 4 and 0.2 to 0.8]
Contours for r=10 at 12.5:2.5:75
[contour plot; axes span 1 to 4 and 0.2 to 0.8]
Contours for r=100 at [15:5:150]
[contour plot; axes span 1 to 4 and 0.2 to 0.8]
Contours for r=1000 at [15:5:150]
[contour plot; axes span 1 to 4 and 0.2 to 0.8]
Multiplier methods
By adding Lagrange multipliers to the penalty term, one can avoid the ill-conditioning associated with a high penalty parameter
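For a single equality constraint $h(\mathbf{x}) = 0$, one common form of the multiplier (augmented Lagrangian) function is shown below; sign conventions vary between texts, so take it as indicative of the structure rather than the textbook's exact expression:

$\phi(\mathbf{x}, \lambda, r) = f(\mathbf{x}) - \lambda\, h(\mathbf{x}) + r\, [h(\mathbf{x})]^2$

With the multiplier term carrying part of the load, a moderate penalty parameter r suffices.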
Can estimate Lagrange multipliers
Stationarity of the augmented function at its minimizer
Compared with the stationarity condition without the penalty, this yields an estimate of the multipliers
So: iterate, updating the multiplier estimate after each unconstrained minimization
See example in textbook
5.9: Projected Lagrangian methods
Sequential quadratic programming
Convert to a sequence of quadratic programming subproblems (quadratic approximation of the Lagrangian, linearized constraints)
Find the step length alpha by minimizing a merit function along the resulting direction
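In recent MATLAB releases, FMINCON offers an SQP algorithm that can be selected through its options; a minimal sketch (the test problem is an assumption, chosen to resemble the ring example on a later slide, and the option interface is the newer OPTIMOPTIONS one):

options = optimoptions('fmincon','Algorithm','sqp');      % request the SQP algorithm
fun = @(x) x(1)^2 + 10*x(2)^2;                            % assumed objective
con = @(x) deal(10 - x(1)^2 - x(2)^2, []);                % c(x) <= 0, no equality constraints
[x, fval] = fmincon(fun, [1 10], [], [], [], [], [], [], con, options)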
Matlab fmincon
FMINCON attempts to solve problems of the form:
min F(X) subject to: A*X <= B, Aeq*X = Beq (linear constraints), C(X) <= 0, Ceq(X) = 0 (nonlinear constraints), LB <= X <= UB (bounds)
Algebraic Example
Quadratic function and constraint:
Minimize $f = x_1^2 + a x_2^2$, with $a > 1$
such that $r_i \le x_1^2 + x_2^2 \le r_o$

function f=quad2(x,a)
f=x(1)^2+a*x(2)^2;

function [c,ceq]=ring(x,ri,ro)
c(1)=ri-x(1)^2-x(2)^2;
c(2)=x(1)^2+x(2)^2-ro;
ceq=[];

x0=[1,10]; a=10;
r1=10.; r2=20;
[x,fval]=fmincon(@(x)quad2(x,a),x0,[],[],[],[],[],[],@(x)ring(x,r1,r2))

x = 3.1623   -0.0000
fval = 10.0000
Making life hard for fmincon
a=1.1; r1=10; r2=20;
[x,fval]=fmincon(@(x)quad2(x,a),x0,[],[],[],[],[],[],@(x)ring(x,r1,r2))
Maximum number of function evaluations exceeded;
increase OPTIONS.MaxFunEvals.
x = 2.2044    2.3249
fval = 10.8052
Increasing MaxFunEvals does not seem to help!
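For reference, the limits can be raised by passing an options structure as the tenth argument of FMINCON (a sketch; as the slide notes, on this problem raising the limit does not cure the difficulty):

opts = optimset('MaxFunEvals', 50000, 'MaxIter', 10000);   % loosen the iteration limits
[x,fval] = fmincon(@(x)quad2(x,a), x0, [],[],[],[],[],[], @(x)ring(x,r1,r2), opts)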
Global search algorithms
Local algorithms zoom in on optima based on known information
Global algorithms must also have a component of exploring new regions in the design space
The key to global optimization is therefore the balance between exploration and exploitation
Many algorithms accomplish that by working with a population of designs
DIRECT Algorithm
Jones, D.R., Perttunen, C.D. and Stuckman, B.E. (1993), Lipschitzian optimization without the Lipschitz constant, Journal of Optimization Theory and Applications, 79, 157-181.
Cox, S.E., Haftka, R.T., Baker, C.A., Grossman, B., Mason, W.H. and Watson, L.T., A Comparison of Global Optimization Methods for the Design of a High-speed Civil Transport, Journal of Global Optimization, 21: 415-433, 2001.
DIRECT Box Division
DIRECT
The function value at the middle of each box and its largest dimension are used to determine potentially optimal boxes
Each potentially optimal box is divided
Exploration vs. Exploitation
DIRECT uses the convex hull of box sizes to balance exploitation vs. exploration
With enough function evaluations, every region in the design space will be explored
This is clearly not feasible for high-dimensional spaces
Cox's paper compares DIRECT to repeated local optimization with random starts
Results
[bar chart: number of function calls (0 to 200,000) for the 5 DV, 10 DV, and 26 DV cases, comparing DOT, LFOPCV3, and DIRECT]
Results
[figure: 26 DV case, comparing DOT and DIRECT]
The Particle Swarm Optimization Algorithm
Jaco F. Schutte
Particle Swarm Optimizer
Introduced by Kennedy & Eberhart, 1995
Inspired by social behavior and movement dynamics of insects, birds, and fish
Global, gradient-less, stochastic search method
Suited to continuous-variable problems
Performance comparable to genetic algorithms
Has successfully been applied to a wide variety of problems (neural networks, structural optimization, shape and topology optimization)
Particle Swarm Optimizer
Advantages
Insensitive to scaling of design variables
Simple implementation
Easily parallelized for concurrent processing
Derivative free
Very few algorithm parameters
Very efficient global search algorithm
Disadvantages
Slow convergence in the refined search stage (weak local search ability)
Particle swarm optimization algorithm
Basic algorithm as proposed by Kennedy and Eberhart (1995)
- Particle position
- Particle velocity
- Best "remembered" individual particle position
- Best "remembered" swarm position
- Cognitive and social parameters
- Random numbers between 0 and 1
Position of individual particles updated as follows:
with the velocity calculated as follows:
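The update formulas themselves are not reproduced in this transcript; in the standard Kennedy-Eberhart notation (writing $x^i_k$, $v^i_k$, $p^i$, $p^g$, $c_1$, $c_2$, $r_1$, $r_2$ for the quantities listed above) they read:

$v^i_{k+1} = v^i_k + c_1 r_1\,(p^i - x^i_k) + c_2 r_2\,(p^g - x^i_k), \qquad x^i_{k+1} = x^i_k + v^i_{k+1}$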
Branin-Hoo example
$f(x, y) = \left(y - \dfrac{5.1}{4\pi^2}x^2 + \dfrac{5}{\pi}x - 6\right)^2 + 10\left(1 - \dfrac{1}{8\pi}\right)\cos(x) + 10$
$x \in [-5, 10], \quad y \in [0, 15]$
x y
-
8/4/2019 Survey Optimization Methods
38/61
PSO on structural sizing problems
Accommodation of constraints
Non-convex 10-bar truss
Genetic Algorithms
Like PSO, genetic algorithms imitate a natural optimization process: natural selection in evolution
The algorithm was developed by John Holland at the University of Michigan for machine learning in 1975
Similar algorithms were developed in Europe in the 1970s under the name evolutionary strategies
The main difference has been in the nature of the variables: discrete vs. continuous
The class is called evolutionary algorithms
See material in Chapter 5 of Design and Optimization of Laminated Composite Materials, Gurdal, Haftka and Hajela, Wiley, 1999.
Basic Scheme
Coding: replace design variables with a continuous string of digits or genes
Binary
Integer
Real
Population: create a population of design points
Selection: select parents based on fitness
Crossover: create child designs
Mutation: mutate child designs
Coding
Integer variables are easily coded as they are, or converted to binary digits
Real variables require more care
Key question is the resolution or interval
The number m of required digits is found from the relation below
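The relation is not reproduced in this transcript; the usual requirement (stated here as the standard binary-coding argument, not as the slide's exact formula) is that the m binary digits provide at least one code per interval:

$2^m \ge \dfrac{x_{\max} - x_{\min}}{\Delta x} + 1$

In MATLAB terms, m = ceil(log2((xmax - xmin)/dx + 1)).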
Fitness
Augmented objective: f* = f + p·v - b·m + sign(v)
v = maximum constraint violation
m = minimum constraint margin
Repair may be more efficient than penalty
Fitness is the normalized objective, or n_s - 1 - rank, where n_s is the population size
Selection
Roulette wheel and tournament-based selection
Elitist and super-elitist strategies
Selection pressure versus exploration
No-twin rule
Roulette wheel
Example fitnesses
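The roulette-wheel figure is not reproduced here; a minimal MATLAB sketch of the selection step, with assumed example fitnesses, is:

fitness = [40 100 30 70];                  % assumed example fitnesses
p = fitness / sum(fitness);                % selection probability proportional to fitness
parent = find(rand <= cumsum(p), 1)        % spin the wheel once; index of the chosen parent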
Single Point Crossover
Parent designs [0_4/45_2/90_2]_s and [45_4/0_2]_s
Parent 1: [1/1/2/2/3]
Parent 2: [2/2/2/2/1]
One child: [1/1/2/2/1]
That is: [0_4/45_2/0_2]_s
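A minimal MATLAB sketch of single-point crossover on the integer-coded parents above (the crossover point is chosen here to reproduce the child shown on the slide):

parent1 = [1 1 2 2 3];
parent2 = [2 2 2 2 1];
cut = 4;                                        % crossover point after the 4th gene
child = [parent1(1:cut), parent2(cut+1:end)]    % gives [1 1 2 2 1], the child on the slide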
Genetic operators
Crossover: portions of the strings of the two parents are exchanged
Mutation: the value of one bit (gene) is changed at random
Permutation: the order of a portion of the chromosome is reversed
Addition/deletion: one gene is added to / removed from the chromosome
Algorithm of standard GA
[flowchart: Create initial population → Calculate fitness (example fitnesses 40, 100, 30, 70) → Select parents → Create children → repeat]
Elitist Non-dominated Sorting Genetic Algorithm: NSGA-II
Tushar Goel
Multi-objective optimization problem
Problems with more than one objective, typically conflicting objectives
Cars: luxury vs. price
Mathematical formulation:
Minimize F(x), where F(x) = {f_i : i = 1, ..., M}, x = {x_j : j = 1, ..., N}
Subject to: C(x) <= 0, where C = {C_k : k = 1, ..., P}
H(x) = 0, where H = {H_l : l = 1, ..., Q}
Pareto optimal front
Many optimal solutions
Usual approaches: weighted sum strategy, ε-constraint modeling, multi-objective GA
Algorithm requirements: convergence, spread
[figure: Pareto front in the (f1, f2) plane, both objectives minimized]
Terminology
Non-domination criterion
Ranking
[figure: points in the (f1, f2) plane illustrating non-domination and ranking]
Terminology
Niching: parametric
Crowding distance c = a + b
Ends have infinite crowding distance
[figure: crowding distance of a point in the (f1, f2) plane as the sum of distances a and b to its neighboring points]
Flowchart of NSGA-II
[flowchart: Begin: initialize population (size N) → Evaluate objective functions → Rank population → Selection → Crossover → Mutation → Evaluate objective functions of the child population → Combine parent and child populations, rank population → Select N individuals (elitism) → if stopping criteria met, report final population and stop; otherwise repeat]
Example: Bicycle Frame Design
Objectives
Minimize area
Minimize max. deflection
Constraints
Component should be a valid geometry
Maximum stress < yield stress
Maximum deflection