Post on 13-Jan-2016


Meta- Heuristic Algorithms

Ant Colony

Genetic Algorithm

Simulated Annealing

Ant Colony Optimization

ACO Concept
• Ants (blind) navigate from nest to food source
• Shortest path is discovered via pheromone trails
  – each ant moves at random
  – pheromone is deposited on the path
  – ants detect the lead ant's path and are inclined to follow
  – more pheromone on a path increases the probability of that path being followed

ACO System
• Virtual "trail" accumulated on path segments
• Starting node selected at random
• Path selected at random
  – based on amount of "trail" present on possible paths from starting node
  – higher probability for paths with more "trail"
• Ant reaches next node, selects next path
• Continues until it reaches the starting node
• Finished "tour" is a solution

ACO System, cont.
• A completed tour is analyzed for optimality
• "Trail" amount adjusted to favor better solutions
  – better solutions receive more trail
  – worse solutions receive less trail
  – higher probability of an ant selecting a path that is part of a better-performing tour
• New cycle is performed
• Repeated until most ants select the same tour on every cycle (convergence to solution)

ACO System, cont.
• Often applied to TSP (Travelling Salesman Problem): shortest path between n nodes
• Algorithm in pseudocode:
  – Initialize Trail
  – Do While (Stopping Criteria Not Satisfied) – Cycle Loop
    • Do Until (Each Ant Completes a Tour) – Tour Loop
      • Local Trail Update
    • End Do
    • Analyze Tours
    • Global Trail Update
  – End Do
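The cycle/tour loops above can be sketched in Python for the TSP case. This is an illustrative sketch, not the slides' own code: the parameter names (evaporation rate rho, deposit constant Q) and the simple trail-proportional path choice are assumptions.

```python
import random

def aco_tsp(dist, n_ants=10, n_cycles=100, rho=0.5, Q=1.0):
    """Cycle/Tour loops from the pseudocode, for a TSP distance matrix."""
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]              # Initialize Trail

    def tour_length(tour):
        return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

    best_tour, best_len = None, float("inf")
    for _ in range(n_cycles):                        # Cycle Loop
        tours = []
        for _ in range(n_ants):                      # Tour Loop
            start = random.randrange(n)              # starting node at random
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:                         # more trail -> higher probability
                i, cand = tour[-1], list(unvisited)
                j = random.choices(cand, [tau[i][l] for l in cand])[0]
                tour.append(j)
                unvisited.remove(j)
            tours.append(tour)
        # Analyze Tours + Global Trail Update
        for row in tau:                              # evaporation
            for j in range(n):
                row[j] *= (1.0 - rho)
        for tour in tours:                           # shorter tours deposit more
            L = tour_length(tour)
            if L < best_len:
                best_tour, best_len = tour, L
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += Q / L
                tau[b][a] += Q / L
    return best_tour, best_len
```

On a tiny symmetric instance (e.g. four cities on a unit square), the ants converge on the length-4 square tour within a few dozen cycles.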

Background

• Discrete optimization problems are difficult to solve
• "Soft computing techniques" developed in the past ten years:
  – Genetic algorithms (GAs)
    • based on natural selection and genetics
  – Ant Colony Optimization (ACO)
    • modeling ant colony behavior

Background, cont.

• Developed by Marco Dorigo (Milan, Italy), and others in the early 1990s
• Some common applications:
  – Quadratic assignment problems
  – Scheduling problems
  – Dynamic routing problems in networks
• Theoretical analysis is difficult
  – the algorithm is based on a series of random decisions (by artificial ants)
  – the probability of decisions changes on each iteration

Implementation

Ant Algorithms

• Can be used for both static and dynamic combinatorial optimization problems
• Convergence is guaranteed, although the speed is unknown
  – convergence in value
  – convergence in solution

The Algorithm

• Ant Colony algorithms are typically used to solve minimum cost problems.
• We usually have N nodes and A undirected arcs
• There are two working modes for the ants: forward and backward.
• Pheromones are deposited only in backward mode.

The Algorithm

• The ants' memory allows them to retrace the path they followed while searching for the destination node
• Before moving backward on their memorized path, the ants eliminate any loops from it. While moving backwards, they leave pheromones on the arcs they traversed.

The Algorithm

• The ants evaluate the cost of the paths they have traversed.
• Shorter paths receive a greater deposit of pheromones. An evaporation rule tied to the pheromones reduces the chance of poor quality solutions persisting.

The Algorithm

• At the beginning of the search process, a constant amount of pheromone is assigned to all arcs. When located at a node i, an ant k uses the pheromone trail to compute the probability of choosing j as the next node:

  p_ij^k = τ_ij / Σ_{l ∈ N_i^k} τ_il   if j ∈ N_i^k
  p_ij^k = 0                            if j ∉ N_i^k

• where N_i^k is the neighborhood of ant k when at node i.
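The transition rule above can be written as a small helper. The list-of-lists trail matrix and the function name are illustrative assumptions:

```python
def transition_prob(tau, i, neighborhood):
    """p_ij^k = tau_ij / sum(tau_il for l in N_i^k) for j in N_i^k, else 0."""
    total = sum(tau[i][l] for l in neighborhood)
    return {j: tau[i][j] / total for j in neighborhood}
```

Nodes outside the neighborhood simply do not appear in the returned mapping, i.e. their probability is 0.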

The Algorithm

• When the arc (i, j) is traversed, the pheromone value changes as follows:

  τ_ij ← τ_ij + Δτ^k

• By using this rule, the probability increases that forthcoming ants will use this arc.

The Algorithm

• After each ant k has moved to the next node, the pheromones on all the arcs evaporate by the following equation:

  τ_ij ← (1 − ρ) τ_ij ,   ∀ (i, j) ∈ A

• where ρ ∈ (0, 1] is a parameter. An iteration is a complete cycle involving the ants' movement, pheromone evaporation, and pheromone deposit.
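The deposit and evaporation rules might look like this in Python. The deposit constant Q and the symmetric update for undirected arcs are assumptions on top of the slides:

```python
def evaporate(tau, rho):
    """tau_ij <- (1 - rho) * tau_ij for all arcs (i, j)."""
    for row in tau:
        for j in range(len(row)):
            row[j] *= (1.0 - rho)

def deposit(tau, path, cost, Q=1.0):
    """Cheaper (shorter) paths receive a larger deposit Q / cost."""
    for i, j in zip(path, path[1:]):
        tau[i][j] += Q / cost
        tau[j][i] += Q / cost   # undirected arcs: update both directions
```

Evaporation shrinks every trail each iteration, so arcs that stop being reinforced fade away.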

Steps for Solving a Problem by ACO

1. Represent the problem in the form of sets of components and transitions, or by a set of weighted graphs, on which ants can build solutions
2. Define the meaning of the pheromone trails
3. Define the heuristic preference for the ants while constructing a solution
4. If possible, implement an efficient local search algorithm for the problem to be solved
5. Choose a specific ACO algorithm and apply it to the problem being solved
6. Tune the parameters of the ACO algorithm

Applications

Efficiently solves NP-hard problems
• Routing
  – TSP (Traveling Salesman Problem)
  – Vehicle Routing
  – Sequential Ordering
• Assignment
  – QAP (Quadratic Assignment Problem)
  – Graph Coloring
  – Generalized Assignment
  – Frequency Assignment
  – University Course Time Scheduling


Applications
• Scheduling
  – Job Shop
  – Open Shop
  – Flow Shop
  – Total tardiness (weighted/non-weighted)
  – Project Scheduling
  – Group Shop
• Subset
  – Multi-Knapsack
  – Max Independent Set
  – Redundancy Allocation
  – Set Covering
  – Weight Constrained Graph Tree Partition
  – Arc-weighted L-cardinality Tree
  – Maximum Clique

Applications

• Other
  – Shortest Common Sequence
  – Constraint Satisfaction
  – 2D-HP protein folding
  – Bin Packing
• Machine Learning
  – Classification Rules
  – Bayesian networks
  – Fuzzy systems
• Network Routing
  – Connection-oriented network routing
  – Connectionless network routing
  – Optical network routing

Genetic Algorithms

Part I: GA Theory

What are genetic algorithms?

How to design a genetic algorithm?

Genetic Algorithm Is Not...

...gene coding.

Genetic Algorithm Is...

...a computer algorithm that is based on principles of genetics and evolution.

Instead of Introduction...

• Hill climbing

[Figure: a single climber reaches a local peak while the global peak lies elsewhere]

Instead of Introduction…(2)

•Multi-climbers

Instead of Introduction…(3)

• Genetic algorithm

[Figure: several climbers on different peaks — "I am not at the top. My height is better!", "I am at the top", "Height is ...", "I will continue"]

Instead of Introduction…(4)

• Genetic algorithm - a few microseconds later

GA Concept
• A genetic algorithm (GA) introduces the principles of evolution and genetics into a search among possible solutions to a given problem.
• The idea is to simulate the process in natural systems.
• This is done by creating, within a machine, a population of individuals represented by chromosomes, in essence a set of character strings that are analogous to the DNA in our own chromosomes.

Survival of the Fittest
• The main principle of evolution used in GA is "survival of the fittest".
• Good solutions survive, while bad ones die.

Nature and GA...

Nature          Genetic algorithm
Chromosome      String
Gene            Character
Locus           String position
Genotype        Population
Phenotype       Decoded structure

The History of GA

• Cellular automata – John Holland, University of Michigan, 1975.
• Until the early 80s, the concept was studied theoretically.
• In the 80s, the first "real world" GAs were designed.

Algorithmic Phases

Initialize the population
  ↓
Select individuals for the mating pool
  ↓
Perform crossover
  ↓
Perform mutation
  ↓
Insert offspring into the population
  ↓
Stop? — no: back to selection / yes: The End

Designing GA...
• How to represent genomes?
• How to define the crossover operator?
• How to define the mutation operator?
• How to define the fitness function?
• How to generate the next generation?
• How to define stopping criteria?

Representing Genomes...

Representation               Example
string                       1 0 1 1 1 0 0 1
array of strings             http avala yubc net ~apopovic
tree (genetic programming)   [expression tree over a, b, c with or, xor, > nodes]

Crossover

• Crossover is a concept from genetics.
• Crossover is sexual reproduction.
• Crossover combines genetic material from two parents, in order to produce superior offspring.
• A few types of crossover:
  – One-point
  – Multiple point

One-point Crossover

Parent #1: 0 1 5 3 5 4 7 6
Parent #2: 7 6 2 4 2 3 0 1

[Figure: the two parent strings exchange their tails at the crossover point]
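One-point crossover of two equal-length digit strings, as illustrated above, can be implemented in a few lines. The function name and the optional fixed cut point are illustrative assumptions:

```python
import random

def one_point_crossover(parent1, parent2, point=None):
    """Swap the tails of two equal-length genomes at a cut point."""
    if point is None:
        point = random.randrange(1, len(parent1))   # cut strictly inside the string
    child1 = parent1[:point] + parent2[point:]      # head of #1, tail of #2
    child2 = parent2[:point] + parent1[point:]      # head of #2, tail of #1
    return child1, child2
```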


Mutation

• Mutation introduces randomness into the population.
• Mutation is asexual reproduction.
• The idea of mutation is to reintroduce divergence into a converging population.
• Mutation is performed on a small part of the population, in order to avoid entering an unstable state.

Mutation...

Parent: 1 1 0 1 0 1 0 0
Child:  0 1 0 1 0 1 0 1

[the first and last bits are flipped]
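Bit-flip mutation as illustrated above might be sketched as follows; the per-bit rate parameter (matching the 1-2% figure quoted later) is an assumption:

```python
import random

def mutate(genome, rate=0.02):
    """Flip each bit independently with a small probability."""
    return [1 - g if random.random() < rate else g for g in genome]
```

With rate 0 the genome is unchanged; with rate 1 every bit flips.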

About Probabilities...
• The average probability for an individual to crossover is, in most cases, about 80%.
• The average probability for an individual to mutate is about 1-2%.
• The probabilities of the genetic operators follow the probabilities in natural systems.
• The better solutions reproduce more often.

Fitness Function

• The fitness function is the evaluation function that determines which solutions are better than others.
• Fitness is computed for each individual.
• The fitness function is application dependent.

Selection
• The selection operation copies a single individual, probabilistically selected based on fitness, into the next generation of the population.
• There are a few possible ways to implement selection:
  – "Only the strongest survive"
    • Choose the individuals with the highest fitness for the next generation
  – "Some weak solutions survive"
    • Assign a probability that a particular individual will be selected for the next generation
    • More diversity
    • Some bad solutions might have good parts!

Selection - Survival of The Strongest

Previous generation: 0.93  0.51  0.72  0.31  0.12  0.64
Next generation:     0.93  0.72  0.64

Selection - Some Weak Solutions Survive

Previous generation: 0.93  0.51  0.72  0.31  0.12  0.64
Next generation:     0.93  0.72  0.64  0.12
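The "some weak solutions survive" scheme is essentially fitness-proportional (roulette-wheel) selection, which the Python standard library can express directly; the function name is an assumption:

```python
import random

def roulette_selection(population, fitnesses, k):
    """Pick k individuals with probability proportional to fitness,
    so weaker individuals still have a chance of surviving."""
    return random.choices(population, weights=fitnesses, k=k)
```

An individual with fitness 0 is never chosen; everyone else is chosen in proportion to its fitness.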

Mutation and Selection...

[Figure: solution distributions over the phenotype — selection concentrates the distribution, mutation spreads it]

Stopping Criteria
• The final problem is to decide when to stop execution of the algorithm.
• There are two possible solutions to this problem:
  – First approach:
    • Stop after production of a definite number of generations
  – Second approach:
    • Stop when the improvement in average fitness over two generations is below a threshold

GA Vs. Ad-hoc Algorithms

                Genetic Algorithm    Ad-hoc Algorithms
Speed           Slow *               Generally fast
Human work      Minimal              Long and exhaustive
Applicability   General              There are problems that cannot be solved analytically
Performance     Excellent            Depends

* Not necessarily!

Problems With GAs

• Sometimes a GA is extremely slow, and much slower than conventional algorithms

Advantages of GAs
• Concept is easy to understand.
• Minimum human involvement.
• The computer is not taught how to use an existing solution, but to find a new solution!
• Modular, separate from application
• Supports multi-objective optimization
• Always an answer; the answer gets better with time!
• Inherently parallel; easily distributed
• Many ways to speed up and improve a GA-based application as knowledge about the problem domain is gained
• Easy to exploit previous or alternate solutions

GA: An Example - Diophantine Equations

• Diophantine equation (n=4):

  a*x + b*y + c*z + d*q = s

• For given a, b, c, d, and s - find x, y, z, q

• Genome:

  ( x, y, z, q )

GA: An Example - Diophantine Equations (2)

• Crossover

  ( 1, 2, 3, 4 )  ×  ( 5, 6, 7, 8 )  →  ( 1, 6, 3, 4 ),  ( 5, 2, 7, 8 )

• Mutation

  ( 1, 2, 3, 4 )  →  ( 1, 2, 3, 9 )

GA: An Example - Diophantine Equations (3)
• The first generation is randomly generated from numbers lower than the sum (s).
• Fitness is defined as the absolute value of the difference between the total and the given sum:

  Fitness = abs ( total - sum )

• The algorithm enters a loop in which the operators are performed on genomes: crossover, mutation, selection.
• After a number of generations a solution is reached.
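Putting the pieces together, a minimal GA for the Diophantine example might look like the sketch below. Population size, mutation rate, and the elitist half-and-half replacement scheme are illustrative assumptions; fitness is abs(total - s), so lower is better and 0 means solved:

```python
import random

def solve_diophantine(coeffs, s, pop_size=100, generations=1000):
    """Search for positive integers g with sum(c * g for c, g) == s."""
    n = len(coeffs)

    def fitness(g):
        return abs(sum(c * v for c, v in zip(coeffs, g)) - s)

    # first generation: random numbers lower than the sum s
    pop = [[random.randint(1, s) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:
            return pop[0]                       # exact solution reached
        parents = pop[: pop_size // 2]          # survival of the fittest
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, n)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.1:           # occasional mutation
                child[random.randrange(n)] = random.randint(1, s)
            children.append(child)
        pop = parents + children
    pop.sort(key=fitness)
    return pop[0]
```

For example, solve_diophantine([1, 2, 3, 4], 30) returns a genome (x, y, z, q) with x + 2y + 3z + 4q = 30.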

Some Applications of GAs

• Internet search
• Data mining
• Software-guided circuit design
• Control systems design
• Stock price prediction
• Path finding
• Mobile robots
• Search
• Optimization
• Trend spotting

Part II: Applications of GAs

GA and the internet

GA and image segmentation

GA and system design

Simulated Annealing

Simulated Annealing(1)

• What is annealing?
• The process of slowly cooling down a compound or a substance
• Slow cooling lets the substance stay near thermodynamic equilibrium
• Molecules reach an optimum conformation
  (rapid cooling and contraction cause stress)

Simulated Annealing(2)

• What is simulated annealing?
  – A stochastic optimization method
  – Simulates the slow cooling of the annealing process
  – Implemented by the Metropolis algorithm, a Monte Carlo step
  – Analogy
    • Current thermodynamic state = solution
    • Energy equation = objective function
    • Ground state = global optimum
    • Temperature T = ???

Simulated Annealing(3)

• Characteristics
  – Takes some uphill steps to escape a local minimum
  – Instead of picking the best move, it picks a random move
  – If the move improves the situation, it is executed. Otherwise, the move is made with some probability less than 1.

Simulated Annealing(4)

• Simple Iterative Algorithm
  1. find a path p
  2. make p', a variation of p
  3. if p' is better than p, keep p' as p
  4. goto 2
• Metropolis Algorithm
  – 3': if (p' is better than p) or (within Prob), then keep p' as p
• Simulated Annealing
  – T is reduced to 0 by the schedule as time passes

Simulated Annealing(5)
• Algorithm

function Simulated-Annealing(problem, schedule) returns a solution state
  inputs: problem, a problem
  local variables: current, a node
                   next, a node
                   T, a "temperature" controlling the probability of downward steps

  current ← Make-Node(Initial-State[problem])
  for t ← 1 to infinity do
    T ← schedule[t]
    if T = 0 then return current
    next ← a randomly selected successor of current
    ΔE ← Value[next] − Value[current]
    if ΔE > 0 then current ← next
    else current ← next only with probability e^(ΔE/T)
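A direct Python transcription of the pseudocode above (Value is maximized, so ΔE > 0 is an improvement). Passing the schedule and successor as callables is an assumed interface, not part of the slides:

```python
import math
import random

def simulated_annealing(initial, successor, value, schedule):
    """Accept improving moves always; worse moves with probability e^(dE/T)."""
    current = initial
    t = 1
    while True:
        T = schedule(t)
        if T <= 0:                      # "if T = 0 then return current"
            return current
        nxt = successor(current)
        dE = value(nxt) - value(current)
        if dE > 0 or random.random() < math.exp(dE / T):
            current = nxt               # downward move taken with prob e^(dE/T)
        t += 1
```

A one-dimensional example: maximizing value(x) = -(x - 3)^2 with small random steps and a linearly decaying schedule drives x toward the global optimum at 3.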

Simulated Annealing(6)

• Schedule
  • Determines the rate at which the temperature is lowered
  • If T is lowered slowly enough, the algorithm will find a global optimum
• Temperature T: the control parameter
  • Used to determine the acceptance probability
  • High T: large changes
  • Low T: small changes

Simulated Annealing(7)

• Probability distribution

[Figure: acceptance probability p = e^(ΔE/T) as a function of ΔE — equal to 1 at ΔE = 0 and decaying toward 0 as ΔE becomes more negative]

Simulated Annealing(8)

• Schedule example

[Figure: temperature schedule T(n) decreasing from T0 toward Tf over the course of the moves]

Simulated Annealing(9)

• High T: higher probability of escaping local maxima
• Low T: little chance of escaping local maxima, but a local maximum may be good enough in practical problems.

Simulated Annealing(10)

• To avoid entrapment in local minima
  – Annealing schedule: by trial and error
    • Choice of the initial temperature
    • How many iterations are performed at each temperature
    • How much the temperature is decremented at each step as cooling proceeds

Simulated Annealing(11)

• Difficulties
  – Determination of parameters
    • Schedule T
    • # of moves at each temperature
  – If cooling is too slow
    • Too much time to get a solution
  – If cooling is too rapid
    • The solution may not be the global optimum

Simulated Annealing(12)

• Application

– VLSI layout

– Signal and image processing

– Task allocation

– Network design

– Molecular analysis

Application in CSP

• Can be solved by IIA (iterative improvement algorithms)
  – Allow states with unsatisfied constraints; operators reassign variable values
• Heuristic repair method
  – Assign values to all the variables
  – Apply modification operators which simply assign a different value to a variable
• Min-conflicts heuristic
  – Choose the value that violates the fewest constraints, i.e., hill-climb with h(n) = total number of violated constraints
  – Surprisingly effective (e.g., million-queens in less than 50 steps)

Min-conflicts heuristics

• Example: 4-Queens
  – States: 4 queens in 4 columns (4^4 = 256 states)
  – Operators: move a queen within its column
  – Goal test: no attacks
  – Evaluation: h(n) = number of attacks

[Figure: three boards with h = 5, h = 2, and h = 0]
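The h(n) evaluation and the min-conflicts repair loop for n-queens can be sketched as follows. Random tie-breaking among equally good rows is an added assumption (it helps avoid cycling on plateaus):

```python
import random

def attacks(state):
    """h(n): number of attacking pairs; state[c] is the row of column c's queen."""
    n = len(state)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if state[i] == state[j]                      # same row
               or abs(state[i] - state[j]) == j - i)        # same diagonal

def min_conflicts(n=4, max_steps=10000):
    """Repeatedly move a randomly chosen queen to a row with the fewest attacks."""
    state = [random.randrange(n) for _ in range(n)]         # assign all variables
    for _ in range(max_steps):
        if attacks(state) == 0:
            return state                                    # goal test: no attacks
        col = random.randrange(n)                           # pick a queen (column)
        hs = [attacks(state[:col] + [r] + state[col + 1:]) for r in range(n)]
        best = min(hs)                                      # fewest violated constraints
        state[col] = random.choice([r for r in range(n) if hs[r] == best])
    return None
```

For the 4-queens boards above, attacks([0, 1, 2, 3]) is 6 (all queens on one diagonal) and attacks([1, 3, 0, 2]) is 0 (a solution).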

Summary

• Iterative improvement algorithms
  – Keep only a single state in memory
  – Can get stuck on local maxima
  – Simulated annealing to escape local maxima
• Complete and optimal given a long enough cooling schedule
• For CSP
  – Variable and value ordering heuristics can provide huge performance gains
  – Current algorithms often solve very large problems very quickly
