26/01/2017
1
(ITNPDB8) Evolutionary and Heuristic Optimisation
Lecture 3: Single Point Methods Gabriela Ochoa http://www.cs.stir.ac.uk/~goc/
Computing Science and Mathematics,
School of Natural Sciences
University of Stirling, Stirling, Scotland
Outline
1. Optimisation methods
– Heuristics and metaheuristics
– Single point (trajectory) algorithms
– Population-based (evolutionary) algorithms
Gabriela Ochoa, goc@stir.ac.uk 2
Video for TSP: https://www.youtube.com/watch?v=SC5CX8drAtU
Optimisation problems
• Definition (minimisation problem): a pair (S, f)
  – Given
    • a search space S of feasible solutions
    • an objective function f: S → R
  – Find s* in S such that f(s*) ≤ f(s) for all s in S
    • s* is the global optimum
• Large-scale/complex optimisation problems arise in many areas of science and industry: transportation, logistics, finance, design, biology, environment
Classical representations
Metaheuristics: From Design to Implementation, by El-Ghazali Talbi (2009)
Towards effective search techniques
• Effective search techniques provide a mechanism to balance exploration and exploitation
  – exploiting the best solutions found so far
  – exploring the search space
• Extreme examples:
  – Hill-climbing methods exploit the best available solution for possible improvement but neglect exploring a large portion of the search space
  – Random search (points in the search space are sampled with equal probability) explores the search space thoroughly but misses exploiting promising regions
Towards effective search techniques
• Aim: design search algorithms that can
  – escape local optima
  – balance exploration and exploitation
  – make the search independent of the initial configuration
• Successful single-point search algorithms:
  – Iterated local search
  – Simulated annealing
  – Tabu search
  – Other methods
Hill climbing algorithm (First Improvement, random mutation)
procedure hill-climbing
begin
  s = random initial solution
  repeat
    evaluate solution s
    s' = random neighbour of s
    if evaluation(s') is better than evaluation(s)
      s = s'
  until stopping-criteria satisfied
  return s
end
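As a sketch, the pseudocode above can be made concrete in Python. The objective f (minimise the number of 1-bits, so the all-zeros string is the global optimum), the bit-string length, and the evaluation budget are illustrative assumptions, not part of the slides:

```python
import random

def f(s):
    # Assumed toy objective: count the 1-bits; we minimise,
    # so the all-zeros string is the global optimum.
    return sum(s)

def hill_climbing(s, evaluations=1000):
    """First-improvement hill climbing with a random 1-flip neighbour."""
    for _ in range(evaluations):
        i = random.randrange(len(s))                # pick one position at random
        neighbour = s[:i] + [1 - s[i]] + s[i+1:]    # flip that bit
        if f(neighbour) < f(s):                     # accept only strict improvements
            s = neighbour
    return s

random.seed(1)
s = hill_climbing([random.randint(0, 1) for _ in range(20)])
print(f(s))  # 0: every 1-bit is eventually picked, and flipping it always improves
```

On this simple landscape there are no local optima other than the global one, so the climber always reaches it; the point of the later slides is what to do when that is not the case.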
Hill climbing algorithm (Best Improvement)
procedure best-hill-climbing
begin
  s = random initial solution
  repeat
    evaluate solution s
    s' = best solution in the neighbourhood of s
    if evaluation(s') is better than evaluation(s)
      s = s'
  until s is a local optimum
  return s
end
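The best-improvement variant scans the whole 1-flip neighbourhood each step and stops exactly when no neighbour is strictly better, i.e. at a local optimum. A minimal sketch under the same assumed toy objective (minimise the number of 1-bits):

```python
def f(s):
    # Assumed toy objective: minimise the number of 1-bits.
    return sum(s)

def best_hill_climbing(s):
    """Best-improvement hill climbing; terminates at a local optimum."""
    while True:
        # Enumerate the full 1-flip neighbourhood and keep the best neighbour.
        best = min(
            (s[:i] + [1 - s[i]] + s[i+1:] for i in range(len(s))),
            key=f,
        )
        if f(best) >= f(s):   # no strictly better neighbour: local optimum
            return s
        s = best

print(best_hill_climbing([1, 0, 1, 1, 0]))  # [0, 0, 0, 0, 0]
```

Unlike the first-improvement version, this one is deterministic and has a natural termination test, at the cost of evaluating the entire neighbourhood every step.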
Hill-climbing example
Binary representation, 1-flip move operator, best-improvement.
Multiple restart hill-climbing
• Simple strategy to improve hill-climbing
• Add an outer loop: repeat hill-climbing a number of times, each from a different starting solution
procedure multiple-restart-hill-climbing
begin
  s = randomly generated solution
  best = s
  repeat
    s = hill-climbing(s)
    if evaluation(s) is better than evaluation(best)
      best = s
    s = randomly generated solution
  until stopping-criteria satisfied
  return best
end
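A runnable sketch of the restart wrapper. The inner first-improvement hill climber, the toy objective (minimise the number of 1-bits), and the restart/budget parameters are illustrative assumptions:

```python
import random

def f(s):
    return sum(s)  # assumed toy objective: minimise the number of 1-bits

def hill_climbing(s, evaluations=200):
    """Inner first-improvement hill climber (random 1-flip moves)."""
    for _ in range(evaluations):
        i = random.randrange(len(s))
        neighbour = s[:i] + [1 - s[i]] + s[i+1:]
        if f(neighbour) < f(s):
            s = neighbour
    return s

def restart_hill_climbing(n=20, restarts=10):
    """Outer loop: rerun hill climbing from fresh random starting
    solutions and keep the best solution found over all restarts."""
    best = None
    for _ in range(restarts):
        s = hill_climbing([random.randint(0, 1) for _ in range(n)])
        if best is None or f(s) < f(best):
            best = s
    return best

random.seed(0)
best = restart_hill_climbing()
print(f(best))  # 0
```

Each restart is independent, so on multimodal landscapes the chance of hitting a good basin of attraction grows with the number of restarts.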
Iterated local search
procedure Iterated Local Search (ILS)
  determine initial candidate solution s
  perform local search on s
  while NOT termination criterion {
    r = s
    perform perturbation on s
    perform local search on s
    based on acceptance criterion, keep s or revert to s = r
  }
• Key idea: use two stages
  – Subsidiary local search, for efficiently reaching local optima (intensification)
  – Perturbation stage, for effectively escaping local optima (diversification)
• Acceptance criterion: controls diversification vs. intensification
The key idea was rediscovered several times under different names (1980s and 1990s). The term iterated local search was proposed in: H.R. Lourenço, O.C. Martin, T. Stützle (2003). Iterated local search. Handbook of Metaheuristics, 320–353, Springer. Google cites: 964
Iterated local search
• Simple strategy to improve hill-climbing
• Add an outer loop: repeat hill-climbing a number of times, each from a perturbed solution
procedure iterated-local-search
begin
  s = randomly generated solution
  s = hill-climbing(s)
  best = s
  repeat
    s = perturb(s)
    s = hill-climbing(s)
    if evaluation(s) is better than evaluation(best)
      best = s
  until stopping-criteria satisfied
  return best
end
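A runnable sketch of this pseudocode. The inner hill climber, the toy objective (minimise the number of 1-bits), and the perturbation strength k are illustrative assumptions; the acceptance criterion here is the simplest one, always continuing from the perturbed result, as in the pseudocode above:

```python
import random

def f(s):
    return sum(s)  # assumed toy objective: minimise the number of 1-bits

def hill_climbing(s, evaluations=200):
    """Subsidiary local search: first-improvement, random 1-flip moves."""
    for _ in range(evaluations):
        i = random.randrange(len(s))
        neighbour = s[:i] + [1 - s[i]] + s[i+1:]
        if f(neighbour) < f(s):
            s = neighbour
    return s

def perturb(s, k=3):
    """Perturbation: flip k distinct bits, a stronger move than the
    1-flip used inside the local search."""
    s = s.copy()
    for i in random.sample(range(len(s)), k):
        s[i] = 1 - s[i]
    return s

def iterated_local_search(n=20, iterations=10):
    s = hill_climbing([random.randint(0, 1) for _ in range(n)])
    best = s
    for _ in range(iterations):
        s = hill_climbing(perturb(s))   # restart from a perturbed solution
        if f(s) < f(best):              # keep the best solution seen
            best = s
    return best

random.seed(5)
best = iterated_local_search()
print(f(best))  # 0
```

Compared with random restarts, each new start here stays near the current solution, so good structure discovered so far is partly preserved.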
Move operators: binary representation
Hill-climbing stage
• 1-flip: solutions generated by flipping a single bit in the given bit string
  – Example: 1 1 0 0 1 → 0 1 0 0 1
Perturbation: a stronger move
• 2-flip: solutions generated by flipping two bits in the given bit string
  – Example: 1 1 0 0 1 → 0 1 0 1 1
• k-flip: can be generalised for larger k, k < n
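The family of moves above can be written as a single parameterised operator; a minimal sketch (the function name k_flip is illustrative):

```python
import random

def k_flip(s, k):
    """Return a copy of bit string s with k distinct positions flipped.
    k = 1 gives the hill-climbing move; k >= 2 gives the stronger
    perturbation moves described above (k must be smaller than len(s))."""
    assert 0 < k < len(s)
    s = s.copy()
    for i in random.sample(range(len(s)), k):  # k distinct positions
        s[i] = 1 - s[i]
    return s

random.seed(0)
print(k_flip([1, 1, 0, 0, 1], 1))  # exactly one bit differs from the input
print(k_flip([1, 1, 0, 0, 1], 2))  # exactly two bits differ from the input
```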
Travelling salesman
Example instances
Name     Cities  Description       Optimum  Year  Concorde (secs)
lin318   318     Drilling problem  42029    1980  9.74
att532   532     Cities in the US  27686    1982  109.52
TSP operators
Chained Lin-Kernighan
• An iterated local search approach
  – Perturbation kick: double-bridge move (a form of 4-exchange)
  – Local search: Lin-Kernighan local search heuristic
• Lin-Kernighan heuristic (1973)
  – Based on k-exchanges: remove k different links and reconnect them in a new way to achieve a legal tour
  – LK applies 2-, 3- and higher-order k-exchanges
  – Adaptive: k is increased until a stopping criterion is met
O.C. Martin, S.W. Otto, E.W. Felten (1992). Large-Step Markov Chains for the TSP Incorporating Local Search Heuristics. Operations Research Letters 11: 219–224.
Double-bridge move
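A minimal sketch of the double-bridge kick on a tour represented as a list of city indices (the segment labels in the comments are illustrative):

```python
import random

def double_bridge(tour):
    """Double-bridge kick: cut the tour at three random interior points,
    giving four segments A, B, C, D, and reconnect them as A-C-B-D.
    This 4-exchange cannot be undone by a single 2- or 3-exchange,
    which is what makes it an effective perturbation for Chained LK."""
    n = len(tour)
    i, j, k = sorted(random.sample(range(1, n), 3))
    # A = tour[:i], B = tour[i:j], C = tour[j:k], D = tour[k:]
    return tour[:i] + tour[j:k] + tour[i:j] + tour[k:]

random.seed(3)
tour = list(range(10))
print(double_bridge(tour))
```

The result is always a valid tour over the same cities, merely traversed in a different segment order.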
TSP state of the art
• TSP home and Concorde Solver, U. Waterloo, Canada
• TSPLIB, Gerhard Reinelt, Universität Heidelberg
• Lin-Kernighan-Helsgaun solver (LKH)
• World TSP images and animations
Progress in solving TSP instances (log scale)
• 1954 Dantzig, Fulkerson, Johnson (cutting-plane): 49
• 1971 Held, Karp: 57, 64
• 1975 Miliotis: 80
• 1977 Grötschel: 120
• 1980 Crowder, Padberg: 318
• 1987 Padberg, Rinaldi: 2,392
• 1990s: Concorde (Applegate, Bixby, Chvátal, Cook)
Simulated annealing
• Key idea: provides a mechanism to escape local optima by allowing moves that worsen the objective function value
• Annealing: the physical process of heating up a solid and then cooling it down (slowly) until it crystallizes
  – candidate solutions → states of the physical system
  – objective function → thermodynamic energy
  – globally optimal solutions → ground states
  – parameter T → physical temperature
Kirkpatrick, S., Gelatt, C. D., & Vecchi, M. (1983). Optimization by simulated annealing. Science, 220, 671–680.
Google Scholar citations: 31,477
Simulated annealing – Algorithm
1. Start with a random solution s
2. Choose some “nearby” solution s’
3. If the new solution is better (i.e. f(s') ≤ f(s)), take it as the current solution (accept it)
4. If it is worse, accept it anyway with a probability that depends on the deterioration f(s') − f(s) and a global parameter T (the temperature)
Metropolis acceptance criterion
Cooling schedule: a mechanism for reducing the temperature
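Steps 3 and 4 above are the Metropolis acceptance criterion: for a minimisation problem, a worsening move with deterioration Δ = f(s') − f(s) > 0 is accepted with probability exp(−Δ/T). A minimal sketch (the function name metropolis_accept is illustrative):

```python
import math
import random

def metropolis_accept(f_current, f_candidate, T):
    """Metropolis criterion for minimisation: always accept an improving
    (or equal) move; accept a worsening move with probability
    exp(-delta/T), where delta = f(candidate) - f(current) > 0."""
    delta = f_candidate - f_current
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / T)

# The same deterioration of 1 is accepted often when T is high
# and almost never when T is low:
print(math.exp(-1 / 10.0))   # ~0.905
print(math.exp(-1 / 0.1))    # ~4.5e-05
```

As T falls, the algorithm gradually shifts from near-random exploration to near-greedy hill climbing, which is exactly the exploration/exploitation trade-off discussed earlier.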
Simulated annealing – pseudo-code
procedure simulated-annealing
begin
  create an initial solution
  repeat
    evaluate solution
    if accepted then update current solution
    if time to change temperature then
      decrease temperature
    if stopping-criteria NOT satisfied then
      generate new solution
  until stopping-criteria satisfied
end
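Filling in the pseudocode above, a compact runnable version with Metropolis acceptance and a geometric cooling schedule. The toy objective (minimise the number of 1-bits), the 1-flip move operator, and all parameter values are illustrative assumptions:

```python
import math
import random

def simulated_annealing(s, f, neighbour, T0=10.0, alpha=0.95,
                        steps_per_T=20, T_min=1e-3):
    """Simulated annealing with geometric cooling T <- alpha * T.
    Worsening moves are accepted with probability exp(-delta/T)."""
    T = T0
    best = s
    while T > T_min:
        for _ in range(steps_per_T):
            cand = neighbour(s)
            delta = f(cand) - f(s)
            # Metropolis criterion: improving moves always accepted,
            # worsening moves accepted with probability exp(-delta/T).
            if delta <= 0 or random.random() < math.exp(-delta / T):
                s = cand
            if f(s) < f(best):      # track the best solution ever seen
                best = s
        T *= alpha                  # cool down
    return best

def one_flip(s):
    """Assumed move operator: flip one random bit."""
    i = random.randrange(len(s))
    return s[:i] + [1 - s[i]] + s[i+1:]

random.seed(2)
best = simulated_annealing([1] * 15, sum, one_flip)
print(sum(best))  # 0
```

At high T the run behaves almost like random search; by the time T is small it behaves like hill climbing, so the final phase settles into (hopefully good) local optima.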
Simulated annealing – design
Four choices to make:
1. Representation of solutions
2. Definition of cost function
3. Neighbourhood
4. Cooling schedule: four parameters must be specified
   • initial temperature
   • temperature decrement rule
   • number of iterations to be performed at each temperature
   • stopping criteria
The simplest and most common temperature decrement rule is Ti+1 = αTi, where α is a constant close to, but smaller than, 1. This exponential cooling scheme (ECS) was first proposed by Kirkpatrick et al., typically with α = 0.95.
Tabu search
• It behaves like a best-improvement hill-climbing algorithm
• But it accepts non-improving (worse) solutions in order to escape from local optima
  – when all the neighbouring solutions are non-improving (worse or equal)
• It is a deterministic algorithm
Tabu search
procedure Tabu Search (TS)
  determine initial candidate solution s
  while NOT termination criterion {
    determine set N' of non-tabu neighbours of s
    choose a best improving candidate solution s' in N'
    update tabu attributes based on s'
    s := s'
  }
• Associate tabu attributes with candidate solutions or solution components; maintain a list of prohibitions
• Forbid steps to search positions recently visited, based on tabu attributes (characteristics of the moves)
F. Glover (1989). Tabu Search – Part I. ORSA Journal on Computing 1 (2): 190–206. Google cites: 5,675
F. Glover (1990). Tabu Search – Part II. ORSA Journal on Computing 2 (1): 4–32. Google cites: 3,684
R. Battiti, G. Tecchiolli (1994). The reactive tabu search. ORSA Journal on Computing 6 (2): 126–140.
The word 'tabu' comes from Tongan, a language of Polynesia, used by the locals to indicate things that cannot be touched because they are sacred.
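A minimal tabu search sketch for bit strings, where the tabu attributes are recently flipped positions and the tabu list is a fixed-length queue. The toy objective (minimise the number of 1-bits), the tenure, and the iteration budget are illustrative assumptions:

```python
from collections import deque

def tabu_search(s, f, iterations=50, tenure=5):
    """Best-improvement step over non-tabu 1-flip neighbours; recently
    flipped positions are tabu for `tenure` iterations."""
    tabu = deque(maxlen=tenure)   # fixed-length prohibition list
    best = s
    for _ in range(iterations):
        # Evaluate every non-tabu 1-flip neighbour of s.
        candidates = [
            (f(s[:i] + [1 - s[i]] + s[i+1:]), i)
            for i in range(len(s)) if i not in tabu
        ]
        if not candidates:
            break
        _, i = min(candidates)    # best non-tabu move, even if it worsens s
        s = s[:i] + [1 - s[i]] + s[i+1:]
        tabu.append(i)            # forbid flipping this position again soon
        if f(s) < f(best):
            best = s
    return best

best = tabu_search([1, 0, 1, 1, 0, 1], sum)
print(sum(best))  # 0
```

Because the best non-tabu move is taken even when it is worsening, the search keeps moving after reaching a local optimum instead of stopping there; the tabu list prevents it from immediately undoing its last few moves.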
Escaping from local optima
Summary
• Aim: design search algorithms that can
  – escape local optima
  – balance exploration and exploitation
  – make the search independent of the initial configuration
• Successful single-point search algorithms:
  – Multiple restart hill-climbing
  – Iterated local search
  – Simulated annealing
  – Tabu search
  – Other methods
Next Lecture: Population-based, Evolutionary Algorithms