© 2008 Solutions 4U Sdn Bhd. All Rights Reserved
Introduction to Optimization Methods
MATLAB Optimization Toolbox
Presentation Outline
• Introduction
• Function Optimization
• Optimization Toolbox
• Routines / Algorithms available
• Minimization Problems
• Unconstrained
• Constrained• Example
• The Algorithm Description
• Multiobjective Optimization
• Optimal PID Control Example
Optimization concerns the minimization or maximization of functions.

Standard Optimization Problem:

    min_x f(x)

Subject to:
    h_i(x) = 0              Equality Constraints
    g_j(x) ≤ 0              Inequality Constraints
    x_k^L ≤ x_k ≤ x_k^U     Side Constraints

Where:

f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem, we minimize the function. Maximization is equivalent to minimizing the negative of the objective function.

x is a column vector of design variables, which can affect the performance of the system.

Function Optimization
    g_j(x) ≤ 0              Inequality Constraints (most algorithms require the "less than or equal to" form!)
    h_i(x) = 0              Equality Constraints
    x_k^L ≤ x_k ≤ x_k^U     Side Constraints

Constraints – limitations on the design space. They can be linear or nonlinear, explicit or implicit functions.

Function Optimization (Cont.)
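The standard form above can be exercised on a small problem. Since the toolbox itself is MATLAB-only, here is a minimal sketch in Python with SciPy on a hypothetical problem (the objective and constraints below are invented for illustration, not taken from the slides):

```python
# Toy problem in the standard form:
#   min f(x) = x1^2 + x2^2
#   s.t. h(x) = x1 + x2 - 1 = 0        (equality)
#        g(x) = x1 - x2 - 0.8 <= 0     (inequality)
#        0 <= xk <= 2                  (side constraints)
from scipy.optimize import minimize

f = lambda x: x[0]**2 + x[1]**2
cons = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 1.0},     # h(x) = 0
    {"type": "ineq", "fun": lambda x: -(x[0] - x[1] - 0.8)},  # g(x) <= 0 (SciPy expects fun >= 0)
]
bounds = [(0.0, 2.0), (0.0, 2.0)]  # side constraints
res = minimize(f, x0=[2.0, 0.0], method="SLSQP", bounds=bounds, constraints=cons)
print(res.x)  # the symmetric optimum x1 = x2 = 0.5
```

Note the sign flip on the inequality: SciPy's "ineq" dictionaries mean fun(x) ≥ 0, while the standard form (and fmincon) use g(x) ≤ 0.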
Is a collection of functions that extends the capability of MATLAB. The toolbox includes routines for:

• Unconstrained optimization
• Constrained nonlinear optimization, including goal attainment problems, minimax problems, and semi-infinite minimization problems
• Quadratic and linear programming
• Nonlinear least squares and curve fitting
• Solving nonlinear systems of equations
• Constrained linear least squares
• Specialized algorithms for large-scale problems
Optimization Toolbox
Minimization Algorithm
Minimization Algorithm (Cont.)
Equation Solving Algorithms
Least-Squares Algorithms
Most of these optimization routines require the definition of an M-file containing the function, f, to be minimized. Maximization is achieved by supplying the routines with –f. Default optimization parameters can be changed through an options structure passed to the routines.
Implementing Opt. Toolbox
Consider the problem of finding a set of values [x1 x2]^T that solves

    min_x f(x) = e^{x1} (4x1^2 + 2x2^2 + 4x1x2 + 2x2 + 1)

where x = [x1 x2]^T.

Steps:
• Create an M-file that returns the function value (objective function). Call it objfun.m
• Then invoke the unconstrained minimization routine. Use fminunc

Unconstrained Minimization
Step 1 – Obj. Function

function f = objfun(x)
f = exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);

Objective function, with x = [x1 x2]^T
Step 2 – Invoke Routine

x0 = [-1,1];                              % starting guess
options = optimset('LargeScale','off');   % optimization parameter settings
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options);

Left of the = sign are the output arguments; inside fminunc( ) are the input arguments.
xmin =
0.5000 -1.0000
feval =
1.3028e-010
exitflag =
1
output =
iterations: 7
funcCount: 40
stepsize: 1
firstorderopt: 8.1998e-004
algorithm: 'medium-scale: Quasi-Newton line search'
Minimum point of design variables
Objective function value
exitflag tells whether the algorithm converged. If exitflag > 0, a local minimum was found.
Some other information
Results
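For readers without MATLAB, the same unconstrained minimization can be sketched with SciPy's BFGS, the same quasi-Newton method used by fminunc's medium-scale mode. This is an analogue, not the toolbox routine itself:

```python
import numpy as np
from scipy.optimize import minimize

def objfun(x):
    # same objective as objfun.m
    return np.exp(x[0]) * (4*x[0]**2 + 2*x[1]**2 + 4*x[0]*x[1] + 2*x[1] + 1)

res = minimize(objfun, x0=[-1.0, 1.0], method="BFGS")
print(res.x)    # approaches [0.5, -1.0], where f = 0
print(res.fun)  # objective value near zero, as in the MATLAB run
```

The reported minimizer matches the fminunc result xmin = [0.5, -1.0] on the slide.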
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)

More on fminunc – Input

fun : The objective function (name or function handle).
x0 : The initial guess. It must be a vector whose size equals the number of design variables.
options : Sets some of the optimization parameters. (More in a few slides.)
P1,P2,… : Pass additional parameters to the objective function.
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
xmin : Vector of the minimum point (optimal point). Its size equals the number of design variables.
feval : The objective function value at the optimal point.
exitflag : A value showing whether the optimization routine terminated successfully (converged if > 0).
output : A structure giving more details about the optimization.
grad : The gradient value at the optimal point.
hessian : The Hessian value at the optimal point.

More on fminunc – Output
options =
optimset('param1',value1, 'param2',value2,…)

Options Setting – optimset

The routines in the Optimization Toolbox have a set of default optimization parameters. However, the toolbox allows you to alter some of them, for example: the tolerances, the step size, the gradient or Hessian values, the maximum number of iterations, etc. There is also a list of features available, for example: displaying the values at each iteration, comparing user-supplied gradients or Hessians, etc. You can also choose the algorithm you wish to use.
options =
optimset('param1',value1, 'param2',value2,…)

Options Setting (Cont.)

Type help optimset in the command window and a list of available option settings will be displayed. How to read it? For example:

LargeScale - Use large-scale algorithm if possible [ {on} | off ]

Here LargeScale is the parameter (param1) and on/off are its possible values (value1); the default is the value shown in { }.
LargeScale - Use large-scale algorithm if possible [ {on} | off ]

Since the default is on, if we would like to turn it off, we just type:

options = optimset('LargeScale','off')

and pass options to the input of fminunc.

Options Setting (Cont.)
Display - Level of display [ off | iter | notify | final ]
MaxIter - Maximum number of iterations allowed [ positive integer ]
TolCon - Termination tolerance on the constraint violation [ positive scalar ]
TolFun - Termination tolerance on the function value [ positive scalar ]
TolX - Termination tolerance on X [ positive scalar ]
Highly recommended to use!!!
Useful Option Settings
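The same kinds of termination settings exist in SciPy, where they are passed as an options dict whose keys depend on the solver. A rough analogue of the optimset parameters above, sketched on a hypothetical quadratic (the parameter names here are SciPy's for Nelder-Mead, not optimset's):

```python
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0)**2 + (x[1] + 1.0)**2  # toy objective for illustration
res = minimize(
    f, x0=[0.0, 0.0], method="Nelder-Mead",
    options={"maxiter": 200,   # like MaxIter
             "xatol": 1e-6,    # like TolX
             "fatol": 1e-6,    # like TolFun
             "disp": False},   # like Display off
)
print(res.x)  # near the true minimizer [2, -1]
```

As with optimset, tightening the tolerances costs iterations; loosening them risks an inaccurate answer.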
fminunc and fminsearch

fminunc uses algorithms based on gradient and Hessian information. Two modes:
• Large-Scale: interior-reflective Newton
• Medium-Scale: quasi-Newton (BFGS)
It is not preferred for solving highly discontinuous functions, and it may only give local solutions.

fminsearch is a direct search method that does not use numerical or analytic gradients as fminunc does. It is generally less efficient than fminunc for problems of order greater than two; however, when the problem is highly discontinuous, fminsearch may be more robust. It too may only give local solutions.
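The trade-off can be sketched in SciPy, where method="BFGS" plays the role of fminunc's medium-scale quasi-Newton mode and method="Nelder-Mead" is the simplex direct search that fminsearch uses. A sketch on the slide's own test function:

```python
import numpy as np
from scipy.optimize import minimize

def objfun(x):  # the earlier unconstrained example's objective
    return np.exp(x[0]) * (4*x[0]**2 + 2*x[1]**2 + 4*x[0]*x[1] + 2*x[1] + 1)

grad_based = minimize(objfun, [-1.0, 1.0], method="BFGS")         # ~ fminunc
direct     = minimize(objfun, [-1.0, 1.0], method="Nelder-Mead")  # ~ fminsearch
print(grad_based.x, direct.x)        # both reach the minimum near [0.5, -1]
print(grad_based.nfev, direct.nfev)  # function-evaluation counts differ
```

On this smooth function both converge; the gradient-based method typically needs fewer function evaluations, which is the efficiency point made above.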
[xmin,feval,exitflag,output,lambda,grad,hessian] =
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

lambda : Vector of Lagrange multipliers at the optimal point

Constrained Minimization
    min_x f(x) = –x1·x2·x3

Subject to:
    2x1^2 + x2 ≤ 0
    0 ≤ x1 + 2x2 + 2x3 ≤ 72
    0 ≤ x1, x2, x3 ≤ 30

The two linear inequalities give

    A = [-1 -2 -2; 1 2 2],   B = [0; 72]

and the side constraints give

    LB = [0 0 0]',   UB = [30 30 30]'

function f = myfun(x)
f = -x(1)*x(2)*x(3);

Example
For the nonlinear constraint 2x1^2 + x2 ≤ 0:

Create a function called nonlcon which returns the two constraint vectors [C,Ceq]:

function [C,Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);
Ceq = [];

Remember to return a null matrix if the constraint does not apply.

Example (Cont.)
x0 = [10;10;10];          % initial guess (3 design variables)
A = [-1 -2 -2; 1 2 2];    % from 0 <= x1 + 2x2 + 2x3 <= 72
B = [0 72]';
LB = [0 0 0]';            % from 0 <= x1, x2, x3 <= 30
UB = [30 30 30]';
[x,feval] = fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)

CAREFUL!!! The arguments must follow the sequence:

fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
Optimization terminated successfully:
Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
Active Constraints:
2
9
x =
0.00050378663220
0.00000000000000
30.00000000000000
feval =
-4.657237250542452e-035
The nine constraints are numbered in the sequence A,B,Aeq,Beq,LB,UB,C,Ceq:

Const. 1–2:  -x1 - 2x2 - 2x3 ≤ 0  and  x1 + 2x2 + 2x3 ≤ 72   (linear inequalities A,B)
Const. 3–5:  0 ≤ x1,  0 ≤ x2,  0 ≤ x3                        (LB)
Const. 6–8:  x1 ≤ 30,  x2 ≤ 30,  x3 ≤ 30                     (UB)
Const. 9:    2x1^2 + x2 ≤ 0                                  (nonlinear C)

Example (Cont.)
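An equivalent sketch with SciPy's SLSQP, where fmincon's argument bundle becomes a bounds= list and a constraints= list. Note that the feasible set here collapses to x1 = x2 = 0 (since x2 ≥ 0 and 2x1² + x2 ≤ 0), so the optimum value is 0, consistent with the tiny feval on the slide:

```python
from scipy.optimize import minimize

myfun = lambda x: -x[0]*x[1]*x[2]
cons = [
    # 0 <= x1 + 2x2 + 2x3 <= 72  (the rows of A, B; SciPy "ineq" means fun >= 0)
    {"type": "ineq", "fun": lambda x: x[0] + 2*x[1] + 2*x[2]},
    {"type": "ineq", "fun": lambda x: 72 - (x[0] + 2*x[1] + 2*x[2])},
    # nonlinear constraint 2*x1^2 + x2 <= 0
    {"type": "ineq", "fun": lambda x: -(2*x[0]**2 + x[1])},
]
bounds = [(0, 30)] * 3  # LB, UB
res = minimize(myfun, x0=[10.0, 10.0, 10.0], method="SLSQP",
               bounds=bounds, constraints=cons)
print(res.x, res.fun)  # x1, x2 driven to 0, so f -> 0
```

As with fmincon, the solver starts from an infeasible guess and works its way onto the constraint boundary.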
Previous examples involved problems with a single objective function. Now let us look at solving a problem with a multiobjective function using lsqnonlin. The example is a data curve-fitting problem: the error at each data point is reduced simultaneously, producing a multiobjective (vector) function.

Multiobjective Optimization

lsqnonlin in MATLAB – Curve fitting

Fit the model

    R = 1 / (1 + e^{(1.0986/b1)(x - b2)})

to the data, i.e., find b1 and b2.

% recfit.m
clc
clear
global data
data = [ 0.6000 0.999
         0.6500 0.998
         0.7000 0.997
         0.7500 0.995
         0.8000 0.982
         0.8500 0.975
         0.9000 0.932
         0.9500 0.862
         1.0000 0.714
         1.0500 0.520
         1.1000 0.287
         1.1500 0.134
         1.2000 0.0623
         1.2500 0.0245
         1.3000 0.0100
         1.3500 0.0040
         1.4000 0.0015
         1.4500 0.0007
         1.5000 0.0003 ];  % experimental data: 1st column x, 2nd column R
x = data(:,1); Rexp = data(:,2);
plot(x,Rexp,'ro');  % plot the experimental data
hold on
b0 = [1.0 1.0];     % start values for the parameters
b = lsqnonlin('recfun',b0)  % run lsqnonlin with start value b0; returned parameter values stored in b
Rcal = 1./(1+exp(1.0986/b(1)*(x-b(2))));  % calculate the fitted values with parameter b
plot(x,Rcal,'b');   % plot the fitted values on the same graph

>> recfit
b =
    0.0603    1.0513

% recfun.m
function y = recfun(b)
global data
x = data(:,1); Rexp = data(:,2);
Rcal = 1./(1+exp(1.0986/b(1)*(x-b(2))));  % the calculated value from the model
%y = sum((Rcal-Rexp).^2);
y = Rcal - Rexp;  % the vector of differences between calculated and experimental values
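The same fit can be sketched with scipy.optimize.least_squares, which, like lsqnonlin, takes the residual vector rather than its sum of squares. The bounds keeping b1 positive are an added assumption (the MATLAB run above was unbounded):

```python
import numpy as np
from scipy.optimize import least_squares

data = np.array([
    [0.60, 0.999], [0.65, 0.998], [0.70, 0.997], [0.75, 0.995],
    [0.80, 0.982], [0.85, 0.975], [0.90, 0.932], [0.95, 0.862],
    [1.00, 0.714], [1.05, 0.520], [1.10, 0.287], [1.15, 0.134],
    [1.20, 0.0623], [1.25, 0.0245], [1.30, 0.0100], [1.35, 0.0040],
    [1.40, 0.0015], [1.45, 0.0007], [1.50, 0.0003]])
x, Rexp = data[:, 0], data[:, 1]

def recfun(b):
    # residual vector, one entry per data point (the "multiobjective" part)
    Rcal = 1.0 / (1.0 + np.exp(1.0986/b[0] * (x - b[1])))
    return Rcal - Rexp

fit = least_squares(recfun, x0=[1.0, 1.0], bounds=([1e-3, 0.0], [10.0, 5.0]))
print(fit.x)  # b1, b2 comparable to MATLAB's 0.0603, 1.0513
```

The solver minimizes the sum of squared residuals internally, so the user only supplies the per-point error vector, exactly as in recfun.m.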
Numerical Optimization

Newton–Raphson Method

1. Root solver – system of nonlinear equations (MATLAB Optimization Toolbox: fsolve)

    x_{i+1} = x_i - f(x_i) / f'(x_i)
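The scalar iteration can be sketched directly. A minimal example on the hypothetical equation f(x) = x² − 2 = 0:

```python
def newton_root(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson root iteration: x_{i+1} = x_i - f(x_i)/f'(x_i)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:  # stop when the correction is negligible
            break
    return x

root = newton_root(lambda x: x*x - 2.0, lambda x: 2.0*x, x0=1.0)
print(root)  # converges to sqrt(2) ~ 1.41421356
```

Convergence is quadratic near a simple root, which is why fsolve-style methods need so few iterations from a decent guess.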
2. One-dimensional optimizer (MATLAB Optimization Toolbox) – apply Newton–Raphson to f'(x) = 0:

    x_{i+1} = x_i - f'(x_i) / f''(x_i)
Steepest Gradient Ascent/Descent Methods

    x_{i+1} = x_i + h·d_i

where h·d_i sets the magnitude of the step along the search direction d_i. Choosing d_i = -∇f(x_i) achieves steepest descent; d_i = +∇f(x_i) achieves steepest ascent.
Steepest Gradient

Solve the following for two steps of steepest ascent:

    f(x,y) = -1.5x^2 + 2.25xy - 2y^2 + 1.75y

The partial derivatives can be evaluated at the initial guesses, x = 1 and y = 1:

    ∂f/∂x = -3x + 2.25y = -3(1) + 2.25(1) = -0.75
    ∂f/∂y = 2.25x - 4y + 1.75 = 2.25(1) - 4(1) + 1.75 = 0

Therefore, the search direction is -0.75i.

    f(1 - 0.75h, 1) = 0.5 + 0.5625h - 0.84375h^2

This can be differentiated and set equal to zero and solved for h* = 0.33333. Therefore, the result for the first iteration is x = 1 - 0.75(0.3333) = 0.75 and y = 1 + 0(0.3333) = 1. For the second iteration, the partial derivatives can be evaluated as:

    ∂f/∂x = -3(0.75) + 2.25(1) = 0
    ∂f/∂y = 2.25(0.75) - 4(1) + 1.75 = -0.5625

Therefore, the search direction is -0.5625j.

    f(0.75, 1 - 0.5625h) = 0.59375 + 0.316406h - 0.632813h^2

This can be differentiated and set equal to zero and solved for h* = 0.25. Therefore, the result for the second iteration is x = 0.75 + 0(0.25) = 0.75 and y = 1 + (-0.5625)(0.25) = 0.859375.
[Contour plot of f(x,y) over 0 ≤ x, y ≤ 1.2, showing the steepest-ascent steps approaching the maximum]
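The two hand iterations above can be checked numerically. A sketch that reproduces the optimal steps h* and the iterates, using the exact line search for a quadratic (h* = -|g|² / (gᵀHg), where H is the constant Hessian of f):

```python
def f(x, y):
    return -1.5*x**2 + 2.25*x*y - 2*y**2 + 1.75*y

def grad(x, y):
    return (-3*x + 2.25*y, 2.25*x - 4*y + 1.75)

def ascent_step(x, y):
    """One steepest-ascent step with exact line search along g = grad f."""
    gx, gy = grad(x, y)
    H = ((-3.0, 2.25), (2.25, -4.0))  # constant Hessian of f
    gHg = gx*(H[0][0]*gx + H[0][1]*gy) + gy*(H[1][0]*gx + H[1][1]*gy)
    h = -(gx*gx + gy*gy) / gHg        # optimal step size h*
    return x + gx*h, y + gy*h, h

x1, y1, h1 = ascent_step(1.0, 1.0)  # h* = 1/3  -> (0.75, 1)
x2, y2, h2 = ascent_step(x1, y1)    # h* = 0.25 -> (0.75, 0.859375)
print((x1, y1, h1), (x2, y2, h2))
```

Both steps reproduce the hand calculation exactly, including h* = 1/3 and h* = 0.25.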
% chapra14.5 ... Contd
clear
clc
clf
ww1 = 0:0.01:1.2; ww2 = ww1;
[w1,w2] = meshgrid(ww1,ww2);
J = -1.5*w1.^2 + 2.25*w2.*w1 - 2*w2.^2 + 1.75*w2;
cs = contour(w1,w2,J,70);
%clabel(cs);
hold
grid
w1 = 1; w2 = 1; h = 0;
for i = 1:10
    syms h
    dfw1 = -3*w1(i) + 2.25*w2(i);
    dfw2 = 2.25*w1(i) - 4*w2(i) + 1.75;
    fw1 = -1.5*(w1(i)+dfw1*h).^2 + 2.25*(w2(i)+dfw2*h).*(w1(i)+ ...
          dfw1*h) - 2*(w2(i)+dfw2*h).^2 + 1.75*(w2(i)+dfw2*h);
    J = -1.5*w1(i)^2 + 2.25*w2(i)*w1(i) - 2*w2(i)^2 + 1.75*w2(i)
    g = solve(fw1);  % the two roots of the quadratic in h
    h = sum(g)/2;    % their midpoint is the parabola's vertex, i.e. h*
    w1(i+1) = w1(i) + dfw1*h;
    w2(i+1) = w2(i) + dfw2*h;
    plot(w1,w2)
    pause(0.05)
end
MATLAB OPTIMIZATION TOOLBOX
% chaprafun.m
function J = chaprafun(x)
w1 = x(1); w2 = x(2);
J = -(-1.5*w1^2 + 2.25*w2*w1 - 2*w2^2 + 1.75*w2);  % negated so fminunc maximizes f

% startchapra.m
clc
clear
x0 = [1 1];
options = optimset('LargeScale','off','Display','iter','Maxiter', ...
    20,'MaxFunEvals',100,'TolX',1e-3,'TolFun',1e-3);
[x,fval] = fminunc(@chaprafun,x0,options)
Newton–Raphson – Four-Bar Mechanism

Sine and cosine angle components – all angles are referenced from the global x-axis. In the position equations, θ1 = 0 as link 1 lies along the x-axis, and the other three angles are time-varying. Differentiating the position equations once gives the angular-velocity relations.
• If the input is applied to link 2 (a DC motor), then ω2 would be the input to the system.

In matrix form:

2. Numerical solution for non-algebraic equations – the Newton–Raphson iteration:

    x̄_{i+1} = x̄_i - [f'(x̄_i)]^{-1} f(x̄_i)
The correction at each Newton–Raphson step:

    Δq̄ = - [f'(q̄_i)]^{-1} f(q̄_i)
The update written for the mechanism's angle vector:

    θ̄_{i+1} = θ̄_i - [f'(θ̄_i)]^{-1} f(θ̄_i)

Newton-Raphson
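The vector Newton-Raphson update can be sketched for the four-bar position equations, using the link lengths from fouropt.m and an analytic 2×2 Jacobian solved at each step. The start point [0.5, 1.5] is an assumption chosen so the Jacobian is nonsingular (at [0.1, 0.1] the Jacobian is exactly singular since θ3 = θ4; fsolve's damped trust-region steps cope with that, plain Newton does not):

```python
import numpy as np

r1, r2, r3, r4, the = 12.0, 4.0, 10.0, 7.0, 0.0

def f(q):
    # four-bar loop-closure equations in (theta3, theta4)
    t3, t4 = q
    return np.array([r2*np.cos(the) + r3*np.cos(t3) - r1 - r4*np.cos(t4),
                     r2*np.sin(the) + r3*np.sin(t3) - r4*np.sin(t4)])

def J(q):
    # Jacobian of f with respect to (theta3, theta4)
    t3, t4 = q
    return np.array([[-r3*np.sin(t3),  r4*np.sin(t4)],
                     [ r3*np.cos(t3), -r4*np.cos(t4)]])

q = np.array([0.5, 1.5])  # assumed initial guess with nonsingular Jacobian
for _ in range(50):
    dq = np.linalg.solve(J(q), f(q))
    q = q - dq            # q_{i+1} = q_i - J^{-1} f(q_i)
    if np.linalg.norm(dq) < 1e-12:
        break
print(np.degrees(q))      # theta3, theta4 in degrees
```

A handful of iterations drives the loop-closure residual to machine precision, illustrating the quadratic convergence of the matrix-form update.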
% fouropt.m
function f = fouropt(x)
the = 0; r1 = 12; r2 = 4; r3 = 10; r4 = 7;
f = -[r2*cos(the) + r3*cos(x(1)) - r1*cos(0) - r4*cos(x(2));
      r2*sin(the) + r3*sin(x(1)) - r1*sin(0) - r4*sin(x(2))];

% startfouropt.m
clc
clear
x0 = [0.1 0.1];
options = optimset('LargeScale','off','Display','iter','Maxiter', ...
    200,'MaxFunEvals',100,'TolX',1e-8,'TolFun',1e-8);
[x,fval] = fsolve(@fouropt,x0,options);
theta3 = x(1)*57.3   % convert rad to deg
theta4 = x(2)*57.3

See also: foursimmechm.m, foursimmech.mdl and possol4.m
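SciPy's fsolve mirrors the MATLAB call. A sketch of startfouropt.m with the same equations and the same start point:

```python
import numpy as np
from scipy.optimize import fsolve

def fouropt(x):
    # same loop-closure residuals as fouropt.m
    the, r1, r2, r3, r4 = 0.0, 12.0, 4.0, 10.0, 7.0
    return [-(r2*np.cos(the) + r3*np.cos(x[0]) - r1*np.cos(0) - r4*np.cos(x[1])),
            -(r2*np.sin(the) + r3*np.sin(x[0]) - r1*np.sin(0) - r4*np.sin(x[1]))]

sol = fsolve(fouropt, [0.1, 0.1], xtol=1e-8)
theta3, theta4 = np.degrees(sol)
print(theta3, theta4)  # assembly angles in degrees
```

As in MATLAB, the solver returns angles in radians, hence the conversion to degrees (the MATLAB script's factor 57.3 is the same conversion, rounded).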
Simulink PID Controller Optimization

Tuned gains: Kp = 72, Ki = 36, Kd = 160.2

Plant:

    Y(s)/U(s) = … / (s^2 - 0.2s - 0.1)    (the numerator is not legible on the slide)

[Simulink diagram: a Step input and the plant output y feed a Signal Constraint block and a Scope; the error e drives a PID Controller subsystem that sums a proportional path (gain Kp), an integral path (gain Ki through an integrator 1/s), and a band-limited derivative path (gain Kd through s/(0.1s+1)); the controller output u feeds the Plant.]

UNIM513tune1.mdl
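With the tuned gains, closed-loop stability can be checked by forming the characteristic polynomial of the unity-feedback loop. This sketch assumes a unity plant numerator (the slide's numerator is not legible) and uses the band-limited derivative s/(0.1s+1) from the diagram:

```python
import numpy as np

Kp, Ki, Kd = 72.0, 36.0, 160.2

# Plant denominator s^2 - 0.2s - 0.1 (open-loop unstable); numerator assumed 1.
plant_den = np.array([1.0, -0.2, -0.1])

# PID with band-limited derivative: C(s) = Kp + Ki/s + Kd*s/(0.1s+1)
# => C(s) = [(0.1*Kp + Kd) s^2 + (Kp + 0.1*Ki) s + Ki] / [0.1 s^2 + s]
C_num = np.array([0.1*Kp + Kd, Kp + 0.1*Ki, Ki])
C_den = np.array([0.1, 1.0, 0.0])

# Closed-loop characteristic polynomial: C_den*plant_den + C_num * 1
char_poly = np.polymul(C_den, plant_den)  # degree-4 polynomial
char_poly[-3:] += C_num                   # add the degree-2 numerator terms
poles = np.roots(char_poly)
print(poles)                              # all poles in the left half-plane
stable = bool(np.all(poles.real < 0))
```

Under the unity-numerator assumption, all four closed-loop poles have negative real parts, so the tuned PID stabilizes the open-loop-unstable plant.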