MATLAB Optimization Toolbox
Presented byChin Pei
February 28, 2003
Presentation Outline
Introduction
Function Optimization
Optimization Toolbox: Routines / Algorithms available
Minimization Problems: Unconstrained, Constrained
Example: The Algorithm Description
Multiobjective Optimization: Optimal PID Control Example
Function Optimization
Optimization concerns the minimization or maximization of functions.
Standard Optimization Problem:
min_x f(x)

Subject to:
h_i(x) = 0              (Equality Constraints)
g_j(x) ≤ 0              (Inequality Constraints)
x_k^L ≤ x_k ≤ x_k^U     (Side Constraints)
Function Optimization
f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem, we minimize the function. Maximization is equivalent to minimizing the negative of the objective function.
x is a column vector of design variables, which can affect the performance of the system.
Function Optimization
Constraints – limitations on the design space. They can be linear or nonlinear, explicit or implicit functions.
h_i(x) = 0              (Equality Constraints)
g_j(x) ≤ 0              (Inequality Constraints)
x_k^L ≤ x_k ≤ x_k^U     (Side Constraints)
Note: most algorithms require the inequality constraints in "less than or equal to zero" form!!!
Optimization Toolbox
The Optimization Toolbox is a collection of functions that extends the capability of MATLAB. The toolbox includes routines for:
• Unconstrained optimization
• Constrained nonlinear optimization, including goal attainment, minimax, and semi-infinite minimization problems
• Quadratic and linear programming
• Nonlinear least squares and curve fitting
• Solving nonlinear systems of equations
• Constrained linear least squares
• Specialized algorithms for large-scale problems
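As a small illustration of the linear programming routine, linprog solves problems of the form min f'x subject to A·x ≤ b. This is a sketch; the problem data here are made up, not taken from the slides:

```matlab
% Hypothetical LP: minimize -x1 - 2*x2  subject to  x1 + x2 <= 4, x >= 0.
f  = [-1; -2];          % objective coefficients (minimization form)
A  = [1 1];  b = 4;     % linear inequality A*x <= b
lb = [0; 0];            % lower bounds on the design variables
x  = linprog(f, A, b, [], [], lb)   % optimum puts all weight on x2: x = [0; 4]
```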
Minimization Algorithm
Minimization Algorithm (Cont.)
Equation Solving Algorithms
Least-Squares Algorithms
Implementing Opt. Toolbox
Most of these optimization routines require the definition of an M-file containing the function, f, to be minimized.
Maximization is achieved by supplying the routines with -f.
Default optimization parameters can be changed through an options structure passed to the routines.
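Since the routines only minimize, maximizing a function g(x) means minimizing -g(x). A minimal sketch (the function g here is hypothetical, not from the slides):

```matlab
% Maximize g(x) = -(x-2)^2 + 3 by minimizing its negative.
g    = @(x) -(x - 2).^2 + 3;     % hypothetical function; maximum g = 3 at x = 2
negg = @(x) -g(x);               % hand the negated function to the minimizer
opts = optimset('LargeScale','off');
xmax = fminunc(negg, 0, opts);   % converges near x = 2, where g attains its maximum
```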
Unconstrained Minimization
Consider the problem of finding a set of values x = [x1 x2]^T that solves
min_x f(x) = e^(x1) (4 x1^2 + 2 x2^2 + 4 x1 x2 + 2 x2 + 1)

where x = [x1 x2]^T.
Steps
1. Create an M-file that returns the function value (the objective function); call it objfun.m.
2. Invoke the unconstrained minimization routine fminunc.
Step 1 – Obj. Function
function f = objfun(x)
f=exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);
Here x = [x1 x2]^T is the vector of design variables, and f is the objective function.
Step 2 – Invoke Routine
x0 = [-1,1];
options = optimset('LargeScale','off');
[xmin,feval,exitflag,output] = fminunc(@objfun,x0,options);
Input arguments: x0 is the starting guess; options holds the optimization parameter settings. The left-hand side collects the output arguments.
Results
xmin =
0.5000 -1.0000
feval =
1.3028e-010
exitflag =
1
output =
iterations: 7
funcCount: 40
stepsize: 1
firstorderopt: 8.1998e-004
algorithm: 'medium-scale: Quasi-Newton line search'
Minimum point of design variables
Objective function value
exitflag tells whether the algorithm converged. If exitflag > 0, a local minimum was found.
Some other information
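As a quick sanity check (a sketch, not part of the original slides), evaluating objfun at the reported minimum confirms the result: analytically the objective is exactly zero there, matching the tiny feval returned by fminunc.

```matlab
% Evaluate the objective at the reported minimum [0.5, -1].
% Analytically: exp(0.5)*(4*0.25 + 2*1 - 2 - 2 + 1) = exp(0.5)*0 = 0,
% consistent with feval = 1.3028e-010 from the run above.
objfun([0.5, -1])
```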
More on fminunc – Input
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
fun: The objective function (a function handle or name).
x0: The initial guess; a vector whose size is the number of design variables.
options: Sets some of the optimization parameters. (More after a few slides.)
P1,P2,…: Additional parameters passed on to the objective function.
Ref. Manual: Pg. 5-5 to 5-9
More on fminunc – Output
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
xmin: Vector of the minimum (optimal) point. Its size is the number of design variables.
feval: The objective function value at the optimal point.
exitflag: A value showing whether the optimization routine terminated successfully (converged if > 0).
output: A structure giving more details about the optimization.
grad: The gradient value at the optimal point.
hessian: The Hessian value at the optimal point.
Ref. Manual: Pg. 5-5 to 5-9
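The extra outputs can be retrieved directly for the earlier example. A sketch using the objfun defined above:

```matlab
% Request gradient and Hessian at the solution of the earlier example.
options = optimset('LargeScale','off');
[xmin, feval, exitflag, output, grad, hessian] = fminunc(@objfun, [-1, 1], options);
% At a local minimum, grad should be near zero and hessian positive definite.
```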
Options Setting – optimset
options = optimset('param1',value1,'param2',value2,…)
The routines in the Optimization Toolbox have a set of default optimization parameters.
However, the toolbox allows you to alter some of those parameters, for example: the tolerances, the step size, user-supplied gradient or Hessian values, the maximum number of iterations, etc.
There is also a list of features available, for example: displaying the values at each iteration, checking user-supplied gradients or Hessians, etc.
You can also choose the algorithm you wish to use.
Ref. Manual: Pg. 5-10 to 5-14
Options Setting (Cont.)
options = optimset('param1',value1,'param2',value2,…)
Type help optimset in the command window and a list of the available option settings will be displayed.
How to read? For example:
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
The default value is the one shown in { }. Here LargeScale is the parameter (param1) and on/off are its possible values (value1).
Options Setting (Cont.)
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
Since the default is on, if we would like to turn it off, we just type:

options = optimset('LargeScale','off')
and pass it to the input of fminunc.
Useful Option Settings
Display - Level of display [ off | iter | notify | final ]
MaxIter - Maximum number of iterations allowed [ positive integer ]
TolCon - Termination tolerance on the constraint violation [ positive scalar ]
TolFun - Termination tolerance on the function value [ positive scalar ]
TolX - Termination tolerance on X [ positive scalar ]
Ref. Manual: Pg. 5-10 to 5-14
Highly recommended to use!!!
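A typical combination of these settings, applied to the earlier unconstrained example (a sketch; the tolerance and iteration values are illustrative choices, not from the slides):

```matlab
options = optimset('LargeScale','off', ...
                   'Display','iter', ...       % print progress at each iteration
                   'MaxIter',200, ...          % cap the number of iterations
                   'TolFun',1e-6, 'TolX',1e-6); % termination tolerances
[xmin, feval] = fminunc(@objfun, [-1, 1], options);
```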
fminunc and fminsearch
fminunc uses gradient (and optionally Hessian) information. Two modes:
Large-Scale: interior-reflective Newton
Medium-Scale: quasi-Newton (BFGS)
It is not preferred for solving highly discontinuous functions.
This function may only give local solutions.
fminunc and fminsearch fminsearch is generally less efficient than fminunc for problems of order greater than two. However, when the problem is highly discontinuous, fminsearch may be more robust.
fminsearch is a direct search method that, unlike fminunc, does not use numerical or analytic gradients.
This function may only give local solutions.
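For comparison, the same objective from the earlier example can be handed to fminsearch, which needs no gradient information (a sketch reusing the objfun defined above):

```matlab
% Direct search (Nelder-Mead simplex) on the same objfun used with fminunc.
[xmin, feval] = fminsearch(@objfun, [-1, 1]);
% Should again converge near xmin = [0.5, -1] with feval near 0.
```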
Constrained Minimization
[xmin,feval,exitflag,output,lambda,grad,hessian] =
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)
lambda: Vector of Lagrange multipliers at the optimal point
Example
min_x f(x) = -x1 x2 x3

Subject to:
2 x1^2 + x2 ≤ 0            (nonlinear inequality)
-x1 - 2 x2 - 2 x3 ≤ 0      (linear inequalities,
x1 + 2 x2 + 2 x3 ≤ 72       encoded by A and B)
0 ≤ x1, x2, x3 ≤ 30        (side constraints, encoded by LB and UB)

A = [-1 -2 -2; 1 2 2],  B = [0; 72]
LB = [0 0 0]',  UB = [30 30 30]'
function f = myfun(x)
f=-x(1)*x(2)*x(3);
Example (Cont.)
For the nonlinear constraint 2 x1^2 + x2 ≤ 0, create a function called nonlcon which returns the two constraint vectors [C,Ceq].
function [C,Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);
Ceq = [];

Remember to return a null matrix if the constraint does not apply.
Example (Cont.)
x0=[10;10;10];
A=[-1 -2 -2;1 2 2];
B=[0 72]';
LB = [0 0 0]';
UB = [30 30 30]';
[x,feval]=fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)
Here A and B encode the linear inequalities -x1 - 2 x2 - 2 x3 ≤ 0 and x1 + 2 x2 + 2 x3 ≤ 72.
Initial guess (3 design variables)
CAREFUL!!! The arguments must appear in the order:
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)
LB and UB encode the side constraints 0 ≤ x1, x2, x3 ≤ 30.
Example (Cont.)

Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
Optimization terminated successfully:
Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
Active Constraints:
2
9
x =
0.00050378663220
0.00000000000000
30.00000000000000
feval =
-4.657237250542452e-035
Constraint numbering (sequence: A, B, Aeq, Beq, LB, UB, C, Ceq):
Const. 1: -x1 - 2 x2 - 2 x3 ≤ 0
Const. 2: x1 + 2 x2 + 2 x3 ≤ 72
Const. 3-5: 0 ≤ x1, 0 ≤ x2, 0 ≤ x3
Const. 6-8: x1 ≤ 30, x2 ≤ 30, x3 ≤ 30
Const. 9: 2 x1^2 + x2 ≤ 0
Multiobjective Optimization
Previous examples involved problems with a single objective function.
Now let us look at solving a problem with a multiobjective function using lsqnonlin.
The example designs an optimal PID controller for a plant.
Simulink Example
Goal: Optimize the control parameters in Simulink model optsim.mdl in order to minimize the error between the output and input.
Plant description:
• Third order, under-damped, with actuator limits.
• Actuation limits are a saturation limit and a slew rate limit.
• The saturation limit cuts off the input at +/- 2 units.
• The slew rate limit is 0.8 unit/sec.
Simulink Example (Cont.)
Initial PID Controller Design
Solving Methodology
Design variables are the gains in PID controller (KP, KI and KD) .
Objective function is the error between the output and input.
Solving Methodology (Cont.)
Let pid = [Kp Ki Kd]^T, and let the step input be unity; then the error is F = yout - 1.
Construct a function tracklsq for objective function.
Objective Function
function F = tracklsq(pid,a1,a2)
Kp = pid(1);
Ki = pid(2);
Kd = pid(3);
% Compute function value
opt = simset('solver','ode5','SrcWorkspace','Current');
[tout,xout,yout] = sim('optsim',[0 100],opt);
F = yout-1;
Getting the simulation data from Simulink:
The idea is to perform nonlinear least-squares minimization of the errors from time 0 to 100 at a time step of 1. So there are 101 objective functions (residuals) to minimize.
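The residual count can be checked directly (a sketch): a fixed step of 1 s over [0, 100] gives 101 sample times, hence one residual per sample in F = yout - 1.

```matlab
tout = (0:1:100)';              % fixed-step sample times used by the simulation
numResiduals = length(tout)     % = 101, one residual per sample
```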
The lsqnonlin
[X,RESNORM,RESIDUAL,EXITFLAG,OUTPUT,LAMBDA,JACOBIAN]= LSQNONLIN(FUN,X0,LB,UB,OPTIONS,P1,P2,..)
Invoking the Routine
clear all
optsim     % open the Simulink model
pid0 = [0.63 0.0504 1.9688];
a1 = 3; a2 = 43;
options = optimset('LargeScale','off','Display','iter','TolX',0.001,'TolFun',0.001);
pid = lsqnonlin(@tracklsq,pid0,[],[],options,a1,a2)
Kp = pid(1); Ki = pid(2); Kd = pid(3);
Results
Optimal gains
Results (Cont.)
Initial Design
Optimization Process Optimal Controller Result
Conclusion
Easy to use! But we do not know what is happening behind the routine. Therefore, it is still important to understand the limitations of each routine.
Basic steps:
1. Recognize the class of optimization problem
2. Define the design variables
3. Create the objective function
4. Recognize the constraints
5. Start with an initial guess
6. Invoke a suitable routine
7. Analyze the results (they might not make sense)
Thank You!
Questions & Suggestions?