Fitness Inheritance in the Bayesian Optimization Algorithm
Fitness Inheritance in BOA

Martin Pelikan
Dept. of Math and CS
Univ. of Missouri at St. Louis

Kumara Sastry
Illinois GA Lab
Univ. of Illinois at Urbana-Champaign
Motivation

Bayesian optimization algorithm (BOA)
- Scales up on decomposable problems
- O(n) to O(n^2) evaluations until convergence

Expensive evaluations
- Real-world evaluations can be complex (FEA, simulation, ...)
- O(n^2) is often not enough

This paper
- Extend the probabilistic model to include fitness info
- Use the model to evaluate part of the population
Outline

- BOA basics
- Fitness inheritance in BOA: extend Bayesian networks with fitness; use the extended model for evaluation
- Experiments
- Future work
- Summary and conclusions
Bayesian Optimization Algorithm (BOA)

- Pelikan, Goldberg, and Cantu-Paz (1998)
- Similar to genetic algorithms (GAs)
- Replaces mutation + crossover by:
  - Building a Bayesian network to model the selected solutions
  - Sampling the Bayesian network to generate new candidate solutions
BOA

[Diagram: the BOA loop. Current population -> selection -> Bayesian network (built from the selected solutions, then sampled) -> new population, merged back via restricted tournament replacement]
Bayesian Networks (Bayesian Networks (BNsBNs))
2 components2 componentsStructureStructure
directed acyclic graphdirected acyclic graphnodes = variables (string positions)nodes = variables (string positions)Edges = dependencies between variablesEdges = dependencies between variables
ParametersParametersConditional probabilities Conditional probabilities p(X|Pp(X|Pxx), where ), where
X is a variableX is a variablePPxx are parents of X (variables that X depends on)are parents of X (variables that X depends on)
BN example

Structure: B -> A -> C (edges follow the tables: p(B), p(A|B), p(C|A))

p(B):
  B   p(B)
  0   0.25
  1   0.75

p(A|B):
  A   B   p(A|B)
  0   0   0.10
  0   1   0.60
  1   0   0.90
  1   1   0.40

p(C|A):
  C   A   p(C|A)
  0   0   0.80
  0   1   0.55
  1   0   0.20
  1   1   0.45
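The tables above can be checked in a few lines. This is a minimal sketch (Python; variable names are illustrative, and the B -> A -> C structure is assumed from which tables are conditional) that computes the joint distribution as p(B) p(A|B) p(C|A) and verifies it sums to 1:

```python
# CPTs from the example slide; chain structure B -> A -> C is assumed.
p_B = {0: 0.25, 1: 0.75}
p_A_given_B = {(0, 0): 0.10, (0, 1): 0.60, (1, 0): 0.90, (1, 1): 0.40}  # key: (a, b)
p_C_given_A = {(0, 0): 0.80, (0, 1): 0.55, (1, 0): 0.20, (1, 1): 0.45}  # key: (c, a)

def joint(a, b, c):
    """p(A=a, B=b, C=c) = p(B=b) * p(A=a | B=b) * p(C=c | A=a)."""
    return p_B[b] * p_A_given_B[(a, b)] * p_C_given_A[(c, a)]

# A valid BN must put probability 1 on the whole space.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
```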
Extending BNs with fitness info

Basic idea
- Don't work only with conditional probabilities
- Also store fitness info, for fitness estimation

Example (for p(A|B)):
  A   B   p(A|B)   f(A|B)
  0   0   0.10     -0.5
  0   1   0.60      0.5
  1   0   0.90      0.3
  1   1   0.40     -0.3

The fitness info attached to p(X | P_X) is denoted f(X | P_X): the contribution of X restricted by P_X,

  f(X = x | P_X = p_x) = f(X = x, P_X = p_x) - f(P_X = p_x)

where f(X = x, P_X = p_x) is the average fitness of solutions with X = x and P_X = p_x, and f(P_X = p_x) is the average fitness of solutions with P_X = p_x.
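The contribution f(X = x | P_X = p_x) is just a difference of two conditional averages over the evaluated solutions, so it is cheap to estimate. A hypothetical sketch (all names are illustrative, not the authors' code), assuming solutions are bit tuples paired with fitness values:

```python
def contribution(sample, x_val, parent_vals, x_idx, parent_idx):
    """f(X=x | P_X=p_x): average fitness of solutions with X=x and P_X=p_x,
    minus the average fitness of solutions with P_X=p_x alone.
    `sample` is a list of (bit-tuple, fitness) pairs."""
    with_parents = [f for s, f in sample
                    if all(s[j] == v for j, v in zip(parent_idx, parent_vals))]
    with_both = [f for s, f in sample
                 if s[x_idx] == x_val
                 and all(s[j] == v for j, v in zip(parent_idx, parent_vals))]
    return sum(with_both) / len(with_both) - sum(with_parents) / len(with_parents)
```

In a real implementation the empty-subset case would need handling (e.g. falling back to the population average); that is omitted here for brevity.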
Estimating fitness

Equation:

  f(X_1, X_2, ..., X_n) = f_avg + sum_{i=1}^{n} f(X_i | P_{X_i})

In words
- Fitness = avg. fitness + avg. contribution of each bit
- Avg. contributions taken w.r.t. the context from the BN
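The estimate is a single pass over the string: start from the average fitness and add one stored contribution per variable, keyed by that variable's value and its parents' values. A sketch with hypothetical data structures (not the authors' code):

```python
def estimate_fitness(x, f_avg, contributions, parents):
    """f(X_1..X_n) = f_avg + sum_i f(X_i | P_{X_i}).
    `contributions[i]` maps (x_i, tuple of parent values) -> stored contribution;
    `parents[i]` is the tuple of parent indices of X_i (empty if none)."""
    total = f_avg
    for i, table in enumerate(contributions):
        key = (x[i], tuple(x[j] for j in parents[i]))
        total += table[key]
    return total
```

For example, with X_1 parentless and X_2 depending on X_1, estimating a string only involves two table lookups, which is why model-based evaluation is so much cheaper than a real fitness call.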
BNs with decision trees

Local structures in BNs: a more efficient representation for p(X | P_X).

Example for p(A | B, C): a decision tree that first splits on B; the B = 0 branch ends in a leaf storing p(A | B = 0), while the B = 1 branch splits on C, with leaves storing p(A | B = 1, C = 0) and p(A | B = 1, C = 1).
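One way to encode such a tree is with nested nodes: internal nodes split on a variable, leaves store a probability. The tree below mirrors the slide's shape for p(A = 1 | B, C); the numeric leaf values are made-up placeholders (the slide gives none), and the encoding itself is only illustrative:

```python
# Decision tree for p(A=1 | B, C): split on B; if B=1, split on C.
# Leaf probabilities are hypothetical example values, not from the slides.
tree = ("split", "B", {
    0: ("leaf", 0.30),          # p(A=1 | B=0): C is irrelevant on this branch
    1: ("split", "C", {
        0: ("leaf", 0.70),      # p(A=1 | B=1, C=0)
        1: ("leaf", 0.40),      # p(A=1 | B=1, C=1)
    }),
})

def lookup(node, assignment):
    """Walk the tree with an assignment like {"B": 1, "C": 0} to a leaf probability."""
    if node[0] == "leaf":
        return node[1]
    _, var, children = node
    return lookup(children[assignment[var]], assignment)
```

The efficiency gain is visible here: a full table for p(A | B, C) needs four entries, while the tree needs only three leaves because the B = 0 branch ignores C.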
BNs with decision trees + fitness

Same idea: attach fitness info to each probability. In the example tree for p(A | B, C), the B = 0 leaf stores p(A | B = 0) and f(A | B = 0); the B = 1 leaves store p(A | B = 1, C = 0) with f(A | B = 1, C = 0), and p(A | B = 1, C = 1) with f(A | B = 1, C = 1).
Estimating fitness again

Same as before, because both BNs represent the same distribution.

Equation:

  f(X_1, X_2, ..., X_n) = f_avg + sum_{i=1}^{n} f(X_i | P_{X_i})

In words
- Fitness = avg. fitness + avg. contribution of each bit
- Avg. contributions taken w.r.t. the context from the BN
Where to learn fitness from?

- Evaluate the entire initial population.
- Choose an inheritance proportion p_i.
- After that:
  - Evaluate a (1 - p_i) proportion of the offspring.
  - Use the evaluated parents + evaluated offspring to learn.
  - Estimate the fitness of the remaining proportion p_i.
- Sample for learning: N(1 - p_i) to N + N(1 - p_i) solutions; often 2N(1 - p_i).
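The per-generation evaluation step above can be sketched as follows. This is a hypothetical outline (function names and the random per-individual split are assumptions, not the authors' implementation) of how a (1 - p_i) share of offspring gets real evaluations while the rest is estimated from the model:

```python
import random

def evaluate_offspring(offspring, evaluate, model_estimate, p_i):
    """Assign fitness to offspring with inheritance proportion p_i:
    each individual gets the expensive real evaluation with probability 1 - p_i,
    otherwise a cheap model-based estimate. Returns (individual, fitness,
    was_really_evaluated) triples; the evaluated ones feed the next model."""
    result = []
    for individual in offspring:
        if random.random() < 1.0 - p_i:
            fitness = evaluate(individual)        # expensive real evaluation
            evaluated = True
        else:
            fitness = model_estimate(individual)  # cheap estimate from the BN
            evaluated = False
        result.append((individual, fitness, evaluated))
    return result
```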
Simple example: Onemax

Onemax:

  f(X_1, X_2, ..., X_n) = sum_{i=1}^{n} X_i

What happens?
- Average fitness grows (as predicted by theory)
- No context is necessary
- Fitness contributions stay constant:

  f(X_i = 1) = +0.5
  f(X_i = 0) = -0.5
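The constant contributions of plus/minus 0.5 can be confirmed by brute force: enumerate every string of a small uniform population and apply the contribution definition (conditional average minus overall average, with no parents). A quick sketch:

```python
from itertools import product

n = 4  # small enough to enumerate all 2^n strings exactly

def onemax(x):
    return sum(x)

strings = list(product((0, 1), repeat=n))
f_avg = sum(onemax(s) for s in strings) / len(strings)  # = n / 2

def bit_contribution(i, v):
    """f(X_i = v) with no parents: average fitness of strings
    with X_i = v, minus the overall average fitness."""
    subset = [onemax(s) for s in strings if s[i] == v]
    return sum(subset) / len(subset) - f_avg
```

Fixing bit i to 1 raises the conditional average to 1 + (n - 1)/2, so every bit's contribution is exactly +0.5 (and -0.5 when fixed to 0), regardless of n.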
Experiments

Problems
- 50-bit onemax
- 10 traps of order 4
- 10 traps of order 5

Settings
- Inheritance proportion from 0 to 0.999
- Minimum population size for reliable convergence
- 300 runs for each setting

Output
- Speed-up (in terms of real fitness evaluations)
Onemax

[Figure: speed-up (w.r.t. no inheritance) vs. proportion inherited, for proportions 0 to 1; speed-up axis 0 to 35]
Trap-4

[Figure: speed-up (w.r.t. no inheritance) vs. proportion inherited, for proportions 0 to 1; speed-up axis 0 to 35]
Trap-5

[Figure: speed-up (w.r.t. no inheritance) vs. proportion inherited, for proportions 0 to 1; speed-up axis 0 to 60]
Discussion

Inheritance proportion
- High proportions of inheritance work great.

Speed-up
- Optimal speed-up of 30 to 53
- High speed-up for almost any setting
- The tougher the problem, the better the speed-up

Why so good?
- Learning the probabilistic model is difficult anyway, so accurate fitness info can be added at little extra cost.
Conclusions

- Fitness inheritance works great in BOA.
- Theory now exists (Sastry et al., 2004) that explains these results.
- High proportions of inheritance lead to high speed-ups.
- Challenging problems allow much speed-up.
- Useful for practitioners with computationally complex fitness functions.
Contact

Martin Pelikan
Dept. of Math and Computer Science, 320 CCB
University of Missouri at St. Louis
8001 Natural Bridge Rd.
St. Louis, MO 63121

E-mail: [email protected]
WWW: http://www.cs.umsl.edu/~pelikan/