Practical Guidelines for Using Evolutionary Algorithms

Darrell Whitley, Colorado State AI Lab
Colorado State University
CEC Edinburgh 2005
A Sample Set of Evolutionary Algorithms
• Simple Genetic Algorithm: Holland/Goldberg
• Genitor, Steady-State GAs: Whitley
• CHC: Eshelman
• Evolution Strategies: Schwefel/Rechenberg
• CMA Evolution Strategies: Hansen, Ostermeier
• Parallel Genetic Algorithms
– Island Model Genetic Algorithms
– Cellular Genetic Algorithms
• Other Algorithms:
– Tabu Search
– Pattern Search, Mesh Adaptive Direct Search
The Simple Genetic Algorithm (with Elitism)

RECOMBINATION: Let the following two binary strings represent an encoding of 5 parameters in a parameter optimization problem (spaces mark the crossover points):

    01011 100111101011011 011010001010101
    10010 101101100011101 010100101000010

Grouped by parameter (7 bits each), the same parents are:

    0101110 0111101 0110110 1101000 1010101
    1001010 1101100 0111010 1010010 1000010

Two-point crossover, exchanging the middle segment, produces the offspring:

    01011 101101100011101 011010001010101
    10010 100111101011011 010100101000010
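The recombination above can be sketched in code. This is a minimal illustration (the function name and cut positions are ours, not from the tutorial); it reproduces the parent and offspring strings from the example:

```python
def two_point_crossover(p1, p2, cut1, cut2):
    """Exchange the segment between cut1 and cut2 (0-indexed, cut2 exclusive)."""
    c1 = p1[:cut1] + p2[cut1:cut2] + p1[cut2:]
    c2 = p2[:cut1] + p1[cut1:cut2] + p2[cut2:]
    return c1, c2

# The parents from the slide, with cut points after bits 5 and 20.
p1 = "01011" "100111101011011" "011010001010101"
p2 = "10010" "101101100011101" "010100101000010"

c1, c2 = two_point_crossover(p1, p2, 5, 20)
assert c1 == "01011" "101101100011101" "011010001010101"
assert c2 == "10010" "100111101011011" "010100101000010"
```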
SIMPLE GENETIC ALGORITHM MODEL

[Figure: the generational cycle. Selection copies strings from the population at time t into an intermediate population; crossover and recombination then pair the selected strings (e.g., 1&2, 2&3) to produce Child 1 and Child 2, which form the population at time t+1.]
STOCHASTIC UNIVERSAL SAMPLING, ROULETTE WHEEL SELECTION

[Figure: a fitness-proportionate roulette wheel whose slots (1–8) are sized by fitness. Roulette wheel selection spins a single marker k times; stochastic universal sampling spins k equally spaced markers once.]
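The two schemes can be sketched as follows (a minimal, illustrative implementation; function names and details are ours). Roulette wheel selection makes k independent spins, while stochastic universal sampling makes a single spin with k equally spaced pointers, which bounds the variance in how many copies each member receives:

```python
import random

def roulette_wheel(fitness, k):
    """k independent spins: high variance in the number of copies per member."""
    total = sum(fitness)
    picks = []
    for _ in range(k):
        r = random.uniform(0, total)
        acc = 0.0
        for i, f in enumerate(fitness):
            acc += f
            if r <= acc:
                picks.append(i)
                break
    return picks

def stochastic_universal_sampling(fitness, k):
    """One spin of k equally spaced pointers: each member gets a number of
    copies that differs from its expected value by less than one."""
    total = sum(fitness)
    step = total / k
    start = random.uniform(0, step)
    picks, acc, i = [], fitness[0], 0
    for j in range(k):
        pointer = start + j * step
        while acc <= pointer:          # advance to the slot under this pointer
            i += 1
            acc += fitness[i]
        picks.append(i)
    return picks
```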
TOURNAMENT SELECTION

[Figure: pairs of strings are sampled from the population at time t; the better of each pair ("pick best of 2") is copied into the intermediate population.]
Tournament Selection with Variance Reduction

Assume population size = K. Let πk(i) be the ith element of a random permutation πk.

For i = 1 to K:
    Compare population member i against member πk(i). Keep the best.

Example tournaments (K = 7):

    1 vs 3    2 vs 7    3 vs 6    4 vs 5    5 vs 1    6 vs 2    7 vs 4

Every individual is in EXACTLY 2 tournaments.
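A sketch of this procedure (an illustrative implementation, not the tutorial's code; note that a random permutation may pair a member with itself, which the slide's example happens to avoid):

```python
import random

def tournament_with_variance_reduction(population, fitness):
    """Member i is compared against member perm[i] of a random permutation,
    so every individual enters exactly two tournaments: once as i, once as
    perm[j] for some j (or one self-tournament if perm has a fixed point)."""
    k = len(population)
    perm = random.sample(range(k), k)
    winners = []
    for i in range(k):
        j = perm[i]
        winners.append(population[i] if fitness[i] >= fitness[j] else population[j])
    return winners
```

Because the best member always wins the tournament it enters as `i`, it is guaranteed to survive, and no member can receive more than two copies.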
Genitor: A “Steady State” GA
• Rank Based Selection
• Two Point Crossover with Reduced Surrogates
• Randomly Choose One Offspring, Mutate
• Insert and Displace Worst
CHC
• Population-elitist selection: Truncation Selection
• Incest Prevention
• HUX
• Restarts
GENITOR MODEL

[Figure: rank-based selection picks Parent 1 and Parent 2 from the population at time t; selection, recombination and crossover produce Child 1 and Child 2; one child is inserted into the population, displacing the worst member, to give the population at time t+1.]
CHC MODEL

[Figure: parents are paired via random selection and HUX to produce children C(t) from the parent population P(t); the best n strings in P(t) + C(t) form the next population P(t+1).]
Parents:

    1 0 1 1 0 1 0 0 1 0
    1 1 0 0 0 1 1 0 0 1

ONE POINT CROSSOVER (cut after bit 6):

    1 0 1 1 0 1 1 0 0 1
    1 1 0 0 0 1 0 0 1 0

UNIFORM CROSSOVER (each bit taken from either parent at random):

    1 0 0 1 0 1 0 0 0 1
    1 1 1 0 0 1 1 0 1 0

HUX (the parents differ only in the positions shown; exactly half of the differing bits are exchanged):

    - 1 0 0 - - 1 - 0 1
    - 0 1 1 - - 0 - 1 0

    1 0 0 1 0 1 1 0 0 0
    1 1 1 0 0 1 0 0 1 1
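HUX can be sketched as follows (an illustrative implementation; the slides do not give code):

```python
import random

def hux(p1, p2):
    """Half Uniform Crossover: exchange exactly half of the bit positions in
    which the two parents differ (rounded down), chosen at random."""
    diff = [i for i in range(len(p1)) if p1[i] != p2[i]]
    swap = random.sample(diff, len(diff) // 2)
    c1, c2 = list(p1), list(p2)
    for i in swap:
        c1[i], c2[i] = c2[i], c1[i]
    return "".join(c1), "".join(c2)
```

Swapping only differing bits means positions where the parents agree are always inherited unchanged, and each child differs from each parent in exactly half of the differing positions.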
EVOLUTION STRATEGIES
• Uses Real-Valued Parameter Representation
• (µ, λ)-selection: the λ offspring replace the µ parents
• (µ + λ)-selection: truncation selection over the combined parents and offspring
• Self Adaptive Mutation and Rotation
• Blending Recombination
Note when λ > µ we generate extra offspring, then reduce back to µ.
[Figure: simple (axis-aligned) mutations vs. correlated mutation via rotation of the mutation distribution.]
CMA: Covariance Matrix Adaptation

Let Z(g+1) be the covariance of the µ best individuals.
Let P(g+1) be the covariance of the evolution path.

The new covariance matrix is:

    C(g+1) = (1 − ccov) C(g) + ccov ( αcov P(g+1) + (1 − αcov) Z(g+1) )

where ccov and αcov are constants that weight each input.
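The update can be sketched numerically. This is an illustrative, pure-Python rendering of just the covariance formula above, with made-up constant values; a full CMA-ES also adapts the global step size and maintains the evolution path itself:

```python
def cma_covariance_update(C, best_points, mean, evo_path, c_cov=0.2, alpha_cov=0.5):
    """One application of:
    C(g+1) = (1 - c_cov) C(g) + c_cov (alpha_cov P(g+1) + (1 - alpha_cov) Z(g+1)).
    Z is the covariance of the mu best individuals about the old mean;
    P is the rank-one covariance of the evolution path."""
    n, mu = len(mean), len(best_points)
    # Z(g+1): covariance of the mu best individuals.
    Z = [[sum((p[i] - mean[i]) * (p[j] - mean[j]) for p in best_points) / mu
          for j in range(n)] for i in range(n)]
    # P(g+1): outer product of the evolution path with itself.
    P = [[evo_path[i] * evo_path[j] for j in range(n)] for i in range(n)]
    return [[(1 - c_cov) * C[i][j]
             + c_cov * (alpha_cov * P[i][j] + (1 - alpha_cov) * Z[i][j])
             for j in range(n)] for i in range(n)]
```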
[Figure: two-dimensional illustration over [−2, 2] × [−2, 2].]
Parallel Island Model Genetic Algorithms

ISLAND MODEL WITH MIGRATION

[Figure: several subpopulations (islands), each running its own GA; individuals periodically migrate between islands.]
Parallel Cellular Genetic Algorithm

CELLULAR GENETIC ALGORITHM MODEL

[Figure: individuals are laid out on a grid; selection and recombination are restricted to local neighborhoods.]
Local Quad Search

[Figure: the four quadrants Q1–Q4 around the current point, with neighbors a, b, c and their reflections a', b', c'.]

"Quad Search" uses only 4 neighbors, and evaluates only 2. On unimodal functions it is proven to converge to the optimum in less than 2L evaluations.
A Hybrid: Genetic Quad Search

We used GENITOR, a steady-state GA, as the genetic algorithm. We used Quad Search to improve each member of the population.

Function     Algorithm      Mean     σ     Solved   Evals
Rana 10-D    CMA-ES        -388.0   15.0      0     500K
             Quad          -434.8    8.4      0     500K
             Genitor       -443.4   17.8      0     500K
             CHC           -495.5    5.5      0     500K
             Hybrid Quad   -510.3    2.5     26     268K
Ruffled by Ridges: How Evolutionary Algorithms Can Fail

• Direction – coordinate search cannot see improving points that fall between the axes.
• Precision – increasing precision generally decreases the number of false optima.

[Figure: contour plot over the region [200, 300] × [200, 300].]
The Temperature Inversion Problem

Researchers have created a forward model that relates 43 vertical temperature profiles (~x) to 2,000 observed measurements (~y).

• model(~x) −→ ~y
• An analytical inversion of this model is impossible.
• Formulate as an optimization problem:

    f(~x) = (~y_obs − model(~x))^T (~y_obs − model(~x))

• Sometimes first-order derivatives can be calculated analytically. In the general case, this is impossible.
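The objective can be written directly from the formula. The forward model below is a toy stand-in of our own (the real model maps 43 profile values to roughly 2,000 measurements and cannot be inverted analytically):

```python
def sum_of_squares_misfit(x, forward_model, y_obs):
    """f(x) = (y_obs - model(x))^T (y_obs - model(x)): the squared residual
    between the observed measurements and the forward model's prediction."""
    residual = [yo - ym for yo, ym in zip(y_obs, forward_model(x))]
    return sum(r * r for r in residual)

# Toy stand-in for the real forward model, for illustration only.
def toy_model(x):
    return [x[0] + x[1], x[0] * x[1]]

# The misfit is zero exactly when the model reproduces the observations.
assert sum_of_squares_misfit([1.0, 2.0], toy_model, [3.0, 2.0]) == 0.0
```

Minimizing this misfit with a black-box optimizer is what turns the inversion into a search problem.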
Why is the temperature problem so hard?

• Ridges in search space.

[Figure: surface and contour plots of the objective over [200, 300] × [200, 300], showing ridge structure.]
[Figure: best solutions found on the temperature inversion problem, plotted as profiles over indices 0–40 with values in 200–300. CMA − 162,405 evaluations; CHC − 436,212 evaluations.]
Gray vs Binary vs Real

[Figure: two 4-bit hypercube diagrams over the strings 0000–1111. In one, each string is labeled with the integer (0–15) it represents under a 4-bit Gray encoding; in the other, under a standard 4-bit Binary encoding.]

There are good arguments for Gray codes over Binary. But Gray codes are "blind" to ridges. Sometimes Binary is better.
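The standard reflected Gray encoding, for reference (a sketch; the slides do not specify the exact code used, though reflected Gray is the usual choice). Adjacent integers differing in exactly one bit is the usual argument for Gray codes:

```python
def binary_to_gray(n):
    """Reflected binary Gray code of integer n."""
    return n ^ (n >> 1)

def gray_to_binary(g):
    """Invert the Gray code by cumulative XOR of the shifted value."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Adjacent integers differ in exactly one bit under the Gray encoding,
# so small steps in value are always one-bit mutations away.
codes = [binary_to_gray(i) for i in range(16)]
for a, b in zip(codes, codes[1:]):
    assert bin(a ^ b).count("1") == 1
```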
Gray vs Binary vs Real
Comparing "Real-Valued" and "Bit" representations is much more complex than most of the literature suggests.

Genetic algorithms can be 10 to 100 times slower to converge using 20 bits of precision than using 10 bits.

Low precision might "miss" good solutions, but it aids exploration. High precision can result in low/slow exploration.
The Testing Problem

1. Test functions can be TOO EASY. E.g., ONEMAX, sphere functions.
2. Test functions can be TOO HARD. E.g., random job shop problems, N-K landscapes.
3. Test functions can be UNREALISTIC. E.g., deceptive functions and trap functions. (These have theoretical value, but ....)
4. Test functions can be TOO SPECIALIZED. E.g., MAXSAT: too many flat plateaus.

There are no easy answers.
Parameter Optimization and Test Problems
There are many common test functions. Not all are good test functions. Many are linearly separable:
F (x, y, z) = G(x) + G(y) + G(z)
Pairwise combinations can be used to introduce greater nonlinearity:
F (x, y, z) = G(x, y) + G(y, z) + G(z, x)
where G(x, y) is a nonlinear function of x and y.
Test functions can also be rotated to create nonlinearity.
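Both constructions can be sketched as follows (G and its two-variable variant are illustrative choices of ours, not functions from the tutorial):

```python
import math

def g2(x, y):
    """A simple nonlinear two-variable component (illustrative choice)."""
    return (x * x - y) ** 2 + math.sin(x + y)

def f_separable(xs, g1=lambda x: x * x):
    """Linearly separable: F(x, y, z, ...) = G(x) + G(y) + G(z) + ...
    Each parameter can be optimized independently of the others."""
    return sum(g1(x) for x in xs)

def f_pairwise(xs):
    """Pairwise composition: F(x, y, z) = G(x, y) + G(y, z) + G(z, x).
    Every parameter now interacts nonlinearly with its neighbors."""
    n = len(xs)
    return sum(g2(xs[i], xs[(i + 1) % n]) for i in range(n))
```

By construction `f_pairwise` is invariant under cyclic shifts of its arguments, while no parameter can be set optimally in isolation.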
[Figure 1: box plots for Rosenbrock and rotated Rosenbrock, for CHC10, CHC20, LS10, LS20, CMA200, CMA500, MADS2n, MADSn+1, GPS2n, GPSn+1, and Unifn+1. The midline = median; the gray box represents 50 percent of evaluations. The bars show max and min values, except for outliers (small circles).]
[Figure 2: box plots for f8f2 and rotated f8f2, for CHC10, CHC20, LS10, LS20, CMA200, CMA500, MADS2n, MADSn+1, GPS2n, GPSn+1, and Unifn+1. The midline = median; the gray box represents 50 percent of evaluations. The bars show max and min values, except for outliers (small circles).]
[Figure 3: box plots for f101 and rotated f101, for CHC10, CHC20, LS10, LS20, CMA200, CMA500, MADS2n, MADSn+1, GPS2n, GPSn+1, and Unifn+1. The midline = median; the gray box represents 50 percent of evaluations. The bars show max and min values, except for outliers (small circles).]
[Figure 4: box plots for Rana and rotated Rana, for CHC10, CHC20, LS10, LS20, CMA200, CMA500, MADS2n, MADSn+1, GPS2n, GPSn+1, and Unifn+1. The midline = median; the gray box represents 50 percent of evaluations. The bars show max and min values, except for outliers (small circles).]
NFL: No Free Lunch

All search algorithms are equivalent when compared over all possible discrete functions. Wolpert, Macready (1995)

Consider any algorithm Ai applied to function fj.

On(Ai, fj) outputs the order in which Ai visits the elements in the codomain of fj. Resampling is ignored. For every pair of algorithms Ai and Ak, and for any function fj, there exists a function fl such that

    On(Ai, fj) ≡ On(Ak, fl)

Consider a "BestFirst" versus a "WorstFirst" local search with restarts. For every j there exists an l such that

    On(BestFirst, fj) ≡ On(WorstFirst, fl)
Theorem: NFL holds for a set of functions IFF the set of functions forms a permutation set.

The "Permutation Set" is the closure of a set of functions with respect to a permutation operator. (Schumacher, Vose and Whitley, GECCO 2001.)

Two example permutation closures (left: all orderings of the values A, B, C; right: all placements of a single 1 in a four-bit string):

    F1: A B C        F1: 0 0 0 1
    F2: A C B        F2: 0 0 1 0
    F3: B A C        F3: 0 1 0 0
    F4: B C A        F4: 1 0 0 0
    F5: C A B
    F6: C B A
POSSIBLE ALGORITHMS      POSSIBLE FUNCTIONS

    A1: 1 2 3            F1: A B C
    A2: 1 3 2            F2: A C B
    A3: 2 1 3            F3: B A C
    A4: 2 3 1            F4: B C A
    A5: 3 1 2            F5: C A B
    A6: 3 2 1            F6: C B A
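The enumeration above can be checked by brute force. This sketch simplifies by using fixed (non-adaptive) visit orders as the "algorithms"; over the permutation closure of a function, every such order sees exactly the same multiset of value sequences:

```python
from itertools import permutations

domain = (0, 1, 2)
values = ("A", "B", "C")

# The permutation closure: every assignment of the values to the domain.
functions = [dict(zip(domain, p)) for p in permutations(values)]

def visit_order(algorithm, f):
    """On(A, f): the sequence of codomain values in the order A visits them.
    Here an 'algorithm' is a fixed visit order over the domain."""
    return tuple(f[x] for x in algorithm)

# Over the whole closure, every algorithm produces the same multiset of
# orderings, so no algorithm can outperform another in the aggregate.
algorithms = list(permutations(domain))
baseline = sorted(visit_order(algorithms[0], f) for f in functions)
for a in algorithms[1:]:
    assert sorted(visit_order(a, f) for f in functions) == baseline
```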
QUESTION:
How should we evaluate search algorithms?
Let β represent a set of benchmarks. P(β) is the permutation closure over β.

If algorithm S is better than algorithm T on β... then T is better than S on P(β) − β.

This is true in the aggregate, but not on average.
EXPERIMENT
Let's evolve Lisp programs à la Genetic Programming, but try different evolutionary algorithms.
The GAs use mutation and tree-based crossover. The ESs use mutation only.
We will evolve
• 11-Multiplexer
• Pole Balancing with Cart Centering
• Symbolic Regression
We will compare
• A Generational GA, Popsize = 1000, Tournament Size 2 and 7
• A Steady-State GA, Popsize = 1000, Tournament Size 2 and 7
• Evolution Strategies: (1+1)ES, (50+250)ES, (100,500)ES, (200,1000)ES
Pole Balancing

[Figure: best fitness (0.2–1.0) vs. evaluations (× 1000, 0–50). One panel: gen_t2_1000, gen_t7_1000, ss_t2_1000, ss_t7_1000. Other panel: es1+10, es50+250, es100+500, es200+1000.]
Symbolic Regression: x^6 − 2x^4 + x^2

[Figure: best fitness (0.0–3.0) vs. evaluations (× 1000, 0–100). One panel: gen_t2_1000, gen_t7_1000, ss_t2_1000, ss_t7_1000. Other panel: es1+10, es50+250, es100+500, es200+1000.]
The 11 Multiplexer

[Figure: best fitness (200–800) vs. evaluations (× 1000, 0–100). One panel: gen_t2_1000, gen_t7_1000, ss_t2_1000, ss_t7_1000. Other panel: es1+10, es50+250, es100+500, es200+1000.]