Selection and Recombination
Advanced Topics in Artificial Intelligence (Temi avanzati di Intelligenza Artificiale) - Lecture 4
Prof. Vincenzo Cutello
Department of Mathematics and Computer Science, University of Catania
Selection and Recombination
1. Review of the previous lecture
   (a) How a simple evolutionary algorithm works
   (b) Some crossover and mutation operators
2. Selection and reproduction
3. Recombination
4. A sample algorithm
Selection and Reproduction
Selection is not normally regarded as a search operator, although it influences the search significantly. Selection can be used either before or after the search operators. When selection is used before the search operators, the process of choosing the next generation from the union of all parents and offspring is sometimes called reproduction.
The generation gap of an EA refers to the overlap (i.e., individuals that did not go through any search operators) between the old and new generations.
The two extremes are generational EAs and steady-state EAs.
Elitism = copying the best individual to the next generation.
Fitness Proportional Selection
Also known as roulette wheel selection. It uses raw fitness in computing selection probabilities, so it does not allow negative fitness values.
Weakness: domination by super individuals in early generations, and slow convergence in later generations.
Fitness scaling was often used in the early days to combat these two problems.
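Roulette wheel selection can be sketched in a few lines. The following is a minimal illustration (not code from the lecture): each individual is selected with probability proportional to its raw, non-negative fitness.

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Select one individual with probability proportional to raw fitness.

    Fitness values must be non-negative, since negative values would
    break the probability interpretation.
    """
    total = sum(fitnesses)
    r = random.uniform(0.0, total)
    acc = 0.0
    for individual, f in zip(population, fitnesses):
        acc += f
        if acc >= r:
            return individual
    return population[-1]  # guard against floating-point round-off
```

With fitnesses 1, 2 and 7, the third individual is expected to be chosen about 70% of the time, which makes the "domination by super individuals" weakness easy to observe.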
Fitness Scaling
Simple scaling
The ith individual's fitness is defined as:

    f_scaled(t) = f_original(t) - f_worst(t)

where t is the generation number and f_worst(t) is the fitness of the worst individual so far.
Sigma scaling
The ith individual's fitness is defined as:

    f_scaled(t) = f_original(t) - (f*(t) - c·σ(t))   if f_original(t) - (f*(t) - c·σ(t)) > 0,
    f_scaled(t) = 0                                   otherwise,

where c is a constant, e.g., 2, f*(t) is the average fitness in the current population, and σ(t) is the standard deviation of the fitness in the current population.
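Sigma scaling is straightforward to implement. A minimal sketch (illustrative, not from the lecture), using the population standard deviation:

```python
import statistics

def sigma_scale(fitnesses, c=2.0):
    """Sigma scaling: f_scaled = f_original - (mean - c*std), floored at 0.

    Subtracting (mean - c*std) keeps moderate selection pressure when
    the population has nearly converged and raw fitnesses are similar.
    """
    mean = statistics.fmean(fitnesses)
    std = statistics.pstdev(fitnesses)  # population standard deviation
    return [max(0.0, f - (mean - c * std)) for f in fitnesses]
```

For example, with fitnesses [0, 10] and c = 2, the baseline mean - c·std = 5 - 10 = -5 is subtracted, giving scaled values [5, 15].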
Power scaling
The ith individual's fitness is defined as:

    f_scaled(t) = (f_original(t))^k

where k > 0.

Exponential scaling
The ith individual's fitness is defined as:

    f_scaled(t) = exp(f_original(t) / T)

where T > 0 is the temperature, which is gradually decreased towards zero.
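Both scalings are one-liners; the sketch below (illustrative, not from the lecture) shows how lowering the temperature T sharpens the selection pressure of exponential scaling:

```python
import math

def power_scale(fitnesses, k=2.0):
    """Power scaling: f_scaled = f_original ** k, with k > 0."""
    return [f ** k for f in fitnesses]

def exponential_scale(fitnesses, T=1.0):
    """Exponential (Boltzmann-style) scaling: f_scaled = exp(f / T), T > 0.

    As T decreases towards zero, differences between individuals are
    amplified, so selection becomes increasingly greedy.
    """
    return [math.exp(f / T) for f in fitnesses]
```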
Ranking
Linear ranking
Assume that the population size is λ, rank 0 indicates the worst individual and rank λ - 1 the best. The probability of selecting the ith individual is:

    P_linear(i) = [η⁻ + (η⁺ - η⁻) · rank(i)/(λ - 1)] / λ

where η⁻ (η⁺) indicates the expected number of offspring produced by the worst (best) individual. Note that:

    Σ_{i=0}^{λ-1} P_linear(i) = 1

Hence η⁻ + η⁺ = 2. So 1 ≤ η⁺ ≤ 2 and η⁻ = 2 - η⁺.

Power ranking

    P_power(i) = [α + (β - α) · (rank(i)/(λ - 1))^k] / C

where C is a normalising factor and 0 < α < β.
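Linear ranking can be sketched directly from the formula above. A minimal illustration (not from the lecture), parameterised only by η⁺ since η⁻ = 2 - η⁺:

```python
def linear_ranking_probs(lam, eta_plus):
    """Selection probabilities under linear ranking.

    lam      : population size (lambda), at least 2
    eta_plus : expected offspring count of the best individual, 1 <= eta_plus <= 2
    Rank 0 is the worst individual, rank lam-1 the best.
    Setting eta_minus = 2 - eta_plus guarantees the probabilities sum to 1.
    """
    eta_minus = 2.0 - eta_plus
    return [(eta_minus + (eta_plus - eta_minus) * rank / (lam - 1)) / lam
            for rank in range(lam)]
```

With η⁺ = 2 (maximum pressure) the worst individual gets probability 0 and the best gets 2/λ.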
Geometric ranking:

    P_geom(i) = α(1 - α)^(λ-1-rank(i)) / C

Exponential ranking:

    P_exp(i) = (1 - e^(-rank(i))) / C

Xin Yao ranking:

    P_XinYao(i) = (rank(i) + 1) / Σ_{j=0}^{λ-1} (j + 1)
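The rank-based weights above only need normalising by C. A minimal sketch (illustrative, not from the lecture; rank 0 = worst, as before) for the geometric and exponential variants:

```python
import math

def geometric_ranking_probs(lam, alpha=0.3):
    """Geometric ranking: weight alpha*(1-alpha)^(lam-1-rank), normalised.

    The best individual (rank lam-1) has exponent 0, hence the largest weight.
    """
    weights = [alpha * (1 - alpha) ** (lam - 1 - rank) for rank in range(lam)]
    C = sum(weights)
    return [w / C for w in weights]

def exponential_ranking_probs(lam):
    """Exponential ranking: weight 1 - exp(-rank), normalised.

    Note that the worst individual (rank 0) gets probability exactly 0.
    """
    weights = [1.0 - math.exp(-rank) for rank in range(lam)]
    C = sum(weights)
    return [w / C for w in weights]
```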
Recombination for Real-valued Representation
Discrete recombination does not change actual (gene) values. It is very similar to the crossover operators on binary strings.
Intermediate recombination does change actual (gene) values. It is usually based on some kind of average/mixture among multiple parents.
Discrete Recombination
Multi-point recombination: similar to that for the binary representation.
Global discrete recombination: similar to uniform crossover for the binary representation.
Geometric explanation.
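Global discrete recombination copies each gene from a randomly chosen parent, so offspring lie on the corners of the hyper-box spanned by the parents. A minimal sketch (illustrative, not from the lecture):

```python
import random

def global_discrete_recombination(parents):
    """Build one offspring by picking each gene from a random parent.

    This is the real-valued analogue of uniform crossover: gene values
    are copied, never averaged, so no new values are created.
    """
    n = len(parents[0])
    return [random.choice(parents)[j] for j in range(n)]
```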
Intermediate Recombination
With two parents
Given x1 and x2:

    x'_i = α_i · x_{1,i} + (1 - α_i) · x_{2,i},   α_i ∈ [0,1]

With more parents
Given x1, x2, ...:

    x'_i = α_1 · x_{1,i} + α_2 · x_{2,i} + ···

where each α_k ≥ 0 and Σ_k α_k = 1.
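Both variants are convex combinations, so each offspring gene lies between the smallest and largest parental values. A minimal sketch (illustrative, not from the lecture):

```python
import random

def intermediate_recombination(x1, x2):
    """Two-parent blend: x'_i = a_i*x1_i + (1 - a_i)*x2_i with a_i ~ U[0,1]."""
    return [(a := random.random()) * g1 + (1 - a) * g2
            for g1, g2 in zip(x1, x2)]

def multi_parent_intermediate(parents):
    """Multi-parent blend with random weights normalised to sum to 1."""
    raw = [random.random() for _ in parents]
    total = sum(raw)
    alphas = [r / total for r in raw]  # alphas >= 0 and sum to 1
    dim = len(parents[0])
    return [sum(a * p[j] for a, p in zip(alphas, parents)) for j in range(dim)]
```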
Other Recombination
Heuristic recombination
Assume x2 is no worse than x1:

    x' = u · (x2 - x1) + x2

where u is a uniformly distributed random number in [0,1].

Simplex recombination
Randomly select a group (> 2) of parents. Assume x1 is the best individual and x2 the worst in the group. Compute the centroid, c, of the group without x2. Then:

    x' = c + (c - x2)

Geometric recombination
Can be generalised to multiple parents.

    x' = (√(x_{11}·x_{21}), √(x_{12}·x_{22}), ...)
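The three operators above are each one line of vector arithmetic. A minimal sketch (illustrative, not from the lecture; geometric recombination assumes non-negative genes):

```python
import math
import random

def heuristic_recombination(x1, x2):
    """x' = u*(x2 - x1) + x2, with x2 no worse than x1 and u ~ U[0,1].

    The offspring extrapolates past the better parent, along the
    direction of improvement.
    """
    u = random.random()
    return [u * (g2 - g1) + g2 for g1, g2 in zip(x1, x2)]

def simplex_recombination_point(centroid, x_worst):
    """x' = c + (c - x_worst): reflect the worst parent through the centroid."""
    return [c + (c - g) for c, g in zip(centroid, x_worst)]

def geometric_recombination(x1, x2):
    """Per-gene geometric mean; genes must be non-negative."""
    return [math.sqrt(g1 * g2) for g1, g2 in zip(x1, x2)]
```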
Quadratic Recombination
Let x_{i,j} be the j-th component of the vector x_i, i = 1, 2, 3, j = 1, ..., n, where n is the dimensionality. We approximate the position of P4 using the quadratic interpolation method as follows. One offspring is generated from three parents.
    x_{4,j} = (1/2) · [ (x_{2,j}² - x_{3,j}²)·f(x1) + (x_{3,j}² - x_{1,j}²)·f(x2) + (x_{1,j}² - x_{2,j}²)·f(x3) ]
                    / [ (x_{2,j} - x_{3,j})·f(x1) + (x_{3,j} - x_{1,j})·f(x2) + (x_{1,j} - x_{2,j})·f(x3) ]
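Per coordinate, this formula places x_{4,j} at the turning point of the parabola through (x_{1,j}, f(x1)), (x_{2,j}, f(x2)), (x_{3,j}, f(x3)). A minimal sketch (illustrative, not from the lecture), taking the three parents and their scalar fitness values:

```python
def quadratic_recombination(x1, x2, x3, f1, f2, f3):
    """One offspring from three parents via quadratic interpolation.

    For each coordinate j, the offspring gene is the stationary point of
    the parabola through the three (gene value, fitness) pairs.
    Assumes the denominator is non-zero (the three points are not collinear).
    """
    offspring = []
    for a, b, c in zip(x1, x2, x3):
        num = (b**2 - c**2) * f1 + (c**2 - a**2) * f2 + (a**2 - b**2) * f3
        den = (b - c) * f1 + (c - a) * f2 + (a - b) * f3
        offspring.append(0.5 * num / den)
    return offspring
```

As a sanity check, for the one-dimensional function f(x) = x² with parents at 1, 2 and 3, the formula recovers the true minimiser x = 0.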
What Does Quadratic Recombination Mean?
Note that we are minimising "fitness" here: P4 is placed at the turning point of the parabola fitted through the three parents.
A Hybrid EA With Local Search
1. Initialize μ individuals at random.
2. Perform local search from each individual.
3. REPEAT
   (a) Generate 3 points P1, P2, P3 by global discrete recombination.
   (b) Perform a quadratic approximation using P1, P2, P3 to produce a point P4.
   (c) Perform a local search from P4 and update P4 with the search result. (Sounds Lamarckian.)
   (d) Place P1, P2, P3, P4 into the population and do a (μ + 4) truncation selection.
4. UNTIL termination criteria are met.
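The loop above can be sketched end to end. The following is a simplified illustration (not the authors' implementation): the local search is a naive hill climber, the quadratic step falls back to a midpoint when the interpolation degenerates, and the sphere function stands in for a benchmark.

```python
import random

def sphere(x):
    """Toy minimisation objective: f(x) = sum of squares."""
    return sum(g * g for g in x)

def local_search(x, f, step=0.1, iters=50):
    """Naive hill climbing, standing in for the local search step."""
    best, fbest = list(x), f(x)
    for _ in range(iters):
        cand = [g + random.gauss(0.0, step) for g in best]
        fc = f(cand)
        if fc < fbest:
            best, fbest = cand, fc
    return best

def global_discrete(pop):
    """Each gene copied from a random member of the population."""
    return [random.choice(pop)[j] for j in range(len(pop[0]))]

def quadratic_point(x1, x2, x3, f):
    """Per-coordinate quadratic interpolation of the three parents."""
    f1, f2, f3 = f(x1), f(x2), f(x3)
    out = []
    for a, b, c in zip(x1, x2, x3):
        den = (b - c) * f1 + (c - a) * f2 + (a - b) * f3
        if abs(den) < 1e-12:  # degenerate parabola: fall back to the mean
            out.append((a + b + c) / 3.0)
        else:
            num = (b**2 - c**2) * f1 + (c**2 - a**2) * f2 + (a**2 - b**2) * f3
            out.append(0.5 * num / den)
    return out

def hybrid_ea(f, dim=2, mu=10, generations=30):
    # Steps 1-2: random initialisation followed by local search.
    pop = [local_search([random.uniform(-5.0, 5.0) for _ in range(dim)], f)
           for _ in range(mu)]
    for _ in range(generations):
        p1, p2, p3 = (global_discrete(pop) for _ in range(3))  # step (a)
        p4 = local_search(quadratic_point(p1, p2, p3, f), f)   # steps (b)-(c), Lamarckian
        pop = sorted(pop + [p1, p2, p3, p4], key=f)[:mu]       # step (d): (mu + 4) truncation
    return pop[0]
```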
Local Search With Random Memorising
Store best solutions in a memory. Retrieve a random one (old best) when a new best solution is found. Search along the direction from old to new, i.e., the direction of:

    s := (x_new - x_old) / ||x_new - x_old||
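Computing the normalised direction is a small vector operation. A minimal sketch (illustrative, not from the lecture):

```python
import math

def search_direction(x_old, x_new):
    """Unit vector from a memorised old best towards the new best:
    s = (x_new - x_old) / ||x_new - x_old||."""
    diff = [n - o for n, o in zip(x_new, x_old)]
    norm = math.sqrt(sum(d * d for d in diff))
    return [d / norm for d in diff]
```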
Experimental Studies
18 multimodal benchmark functions. Population size: 30. Maximum number of function evaluations: 500,000. 50 independent runs for each function.
Benchmark Functions (f8 - f25)
Results on f8-f13
Results on f8-f13 with population size 50 and 60
Results on f14-f25
Results on f16-f23 with population size 10
Summary
Different problems require different search operators and selection schemes. There is no universally best one.
Using real vectors is usually more appropriate than binary strings for function optimisation.
Many search operators are heuristics-based. Domain knowledge can often be incorporated into search operators and representation.
References
T. Bäck, D. B. Fogel, and Z. Michalewicz (eds.), Handbook of Evolutionary Computation, IOP Publishing & Oxford University Press, 1997, Part C. (In the School library.)
K.-H. Liang, X. Yao and C. S. Newton, "Combining landscape approximation and local search in global optimization," Proc. of the 1999 Congress on Evolutionary Computation, Vol. 2, IEEE Press, Piscataway, NJ, USA, pp. 1514-1520, July 1999.