Local Search

Search so far...
• A*, BFS, DFS, etc.
– Given a set of states, get to a goal state
– Need to know the path as well

Example: n-queens
• Put n queens on an n × n board with no two queens on the same row, column, or diagonal
• How would you represent the state space of this problem?
• How is the problem different from the 8-puzzle?

Local search algorithms
• In many optimization problems, the path to the goal is irrelevant; the goal state itself is the solution
• State space = set of "complete" configurations
• Find a configuration satisfying the constraints, e.g., n-queens
• In such cases, we can use local search algorithms
• Keep a single "current" state and try to improve it
– Assume access to a function Eval(X) that tells you how good X is

Hill-climbing search
• "Like climbing Everest in thick fog with amnesia"
• Problem: depending on the initial state, can get stuck in local maxima

Hill-climbing search: 8-queens problem
• h = number of pairs of queens that are attacking each other, either directly or indirectly
• h = 17 for the state shown on the slide [board figure omitted]
• A local minimum with h = 1 [board figure omitted]

Simulated annealing search
• Idea: escape local maxima by allowing some "bad" moves, but gradually decrease their frequency

Properties of simulated annealing search
• One can prove: if T decreases slowly enough, then simulated annealing search will find a global optimum with probability approaching 1
• Widely used in VLSI layout, airline scheduling, etc.

Local beam search
• Keep track of k states rather than just one
• Start with k randomly generated states
• At each iteration, all the successors of all k states are generated
• If any one is a goal state, stop; else select the k best successors from the complete list and repeat

Genetic Algorithms
• Variant of local beam search with sexual recombination
• View optimization by analogy with evolutionary theory: a simulation of natural selection
• View configurations as individuals in a population
• View Eval as a measure of fitness
• Let the least-fit individuals die off without reproducing
• Allow individuals to reproduce, with the best-fit ones selected more often
• Each generation should be overall better fit (higher value of Eval) than the previous one
• If we wait long enough, the population should evolve toward individuals with high fitness (i.e., a maximum of Eval)

Genetic Algorithms: Implementation
• Configurations are represented by strings, e.g., X = 10011001
• Analogy:
– The string is the chromosome representing the individual
– The string is made up of genes
– Configurations of genes are passed on to offspring
– Configurations of genes that contribute to high fitness tend to survive in the population
• Start with a random population of P configurations and apply two operations:
– Reproduction: choose 2 "parents" and produce 2 "offspring"
– Mutation: choose a random entry in one (randomly selected) configuration and change it

Genetic Algorithms: Reproduction
• Each offspring receives part of the genes from each of the parents
• Implemented by the crossover operation
• Parents: 10011001 and 10001101
• Select a random crossover point: 1001|1001 and 1000|1101
• Offspring: 10001001 and 10011101

Genetic Algorithms: Mutation
• Random change of one element in one configuration
• Implements random deviations from inherited traits
• Corresponds loosely to a "random walk": introduce random moves to avoid small local extrema
• Example: select a random individual, select a random entry, and change that entry (e.g., 10011111 → 10011011)

Basic GA Outline
• Create an initial population X = {X1, ..., XP}
• Iterate:
1. Select K random pairs of parents (X, X′)
2. For each pair of parents (X, X′):
2.1 Generate offspring (Y1, Y2) using the crossover operation
2.2 For each offspring Yi:
– Replace a randomly selected element of the population by Yi
– With probability µ, apply a random mutation to Yi
• Return the best individual in the population
• The stopping condition is not obvious
• Possible strategy: select the best rP individuals (r < 1) for reproduction and discard the rest; this implements selection of the fittest
• Variation: generate only one offspring

Genetic Algorithms: Selection
• Discard the least-fit individuals through a threshold on Eval or a fixed percentage of the population
• Select best-fit (larger Eval) parents in priority
• Example (fitness-proportionate): random selection of an individual based on the probability distribution
  Pr(X selected) = Eval(X) / Σ_{Y ∈ population} Eval(Y)
• Example (tournament): select a random small subset of the population and take its best-fit individual as a parent
• Implements "survival of the fittest"
• Corresponds loosely to the greedy part of hill-climbing (we try to move uphill)

GA and Hill Climbing
• The GA outline above combines two components:
– Hill-climbing component (selection and reproduction): try to move uphill as much as possible
– Random-walk component (mutation): move randomly to escape shallow local maxima
• How would you set up these problems to use GA search? TSP, SAT, n-queens
  [slide figures omitted: a TSP tour over cities A–E, a propositional formula with ¬ and ∧ connectives, and an n-queens board]

GA for n-queens
[board figure omitted]

TSP Example
• N = 13 cities, P = 100 elements in the population, µ = 4% mutation rate, r = 50% reproduction rate (K = rP)
• The optimal solution is reached at generation 35
• [plot omitted: cost vs. generation, showing the minimum cost and the average cost in the population]
• [figures omitted: initial population; the best rP elements in the population are candidates for reproduction; the best (lowest-cost) element in the population; population at generation 15; population at generation 35]

Another TSP Example
• The minimum cost stabilizes at generation 23; the population converges and remains stable after generation 23
• 0.4% difference in final cost: GA = 11.801, SA = 11.751
• But the number of operations (number of cost evaluations) is much smaller for GA (approx. 2500)
• [plot omitted: cost vs. generation, showing the minimum cost and the average cost in the population]
• [figure omitted: population at generation 40]

GA Discussion
• Many parameters to tweak: µ, P, r
• Many variations on the basic scheme. Examples:
– Multiple-point crossover
– Dynamic encoding
– Selection based on rank or relative fitness to the least-fit individual
– Multiple fitness functions
– Combine with a local optimizer (for example, local hill-climbing); deviates from "pure" GA
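The GA scheme described above (single-point crossover, random mutation with probability µ, and fitness-proportionate selection) can be sketched for the n-queens problem. This is a minimal illustration, not the slides' implementation: the fitness function (number of non-attacking pairs), the population size, and the mutation rate are assumed choices.

```python
import random

N = 8  # board size; a state is a tuple of queen rows, one per column

def eval_fn(state):
    """Eval(X): number of NON-attacking pairs, so higher is fitter.
    The maximum for 8 queens is C(8, 2) = 28 (no pair attacks)."""
    attacks = sum(1 for i in range(N) for j in range(i + 1, N)
                  if state[i] == state[j] or abs(state[i] - state[j]) == j - i)
    return N * (N - 1) // 2 - attacks

def crossover(x, y):
    """Single-point crossover: each offspring gets genes from both parents."""
    c = random.randrange(1, N)
    return x[:c] + y[c:], y[:c] + x[c:]

def mutate(state):
    """Mutation: change one randomly selected entry of the configuration."""
    s = list(state)
    s[random.randrange(N)] = random.randrange(N)
    return tuple(s)

def genetic_search(pop_size=100, mu=0.1, generations=1000):
    pop = [tuple(random.randrange(N) for _ in range(N)) for _ in range(pop_size)]
    best = max(pop, key=eval_fn)
    for _ in range(generations):
        if eval_fn(best) == N * (N - 1) // 2:   # stop once a solution is found
            break
        # fitness-proportionate selection: Pr(X) = Eval(X) / sum of Eval(Y)
        weights = [eval_fn(x) for x in pop]
        new_pop = []
        while len(new_pop) < pop_size:
            x, y = random.choices(pop, weights=weights, k=2)
            for child in crossover(x, y):
                if random.random() < mu:        # mutate with probability mu
                    child = mutate(child)
                new_pop.append(child)
        pop = new_pop[:pop_size]
        best = max(best, max(pop, key=eval_fn), key=eval_fn)
    return best

random.seed(0)
solution = genetic_search()
print(solution, eval_fn(solution))
```

Note the design choice of maximizing non-attacking pairs rather than minimizing h: fitness-proportionate selection needs larger Eval to mean fitter, matching the slides' convention that we move uphill on Eval.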
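For comparison, the simulated-annealing idea covered earlier (accept "bad" moves with a probability that shrinks as the temperature T decreases) can be sketched on the same 8-queens problem. The geometric cooling schedule and all parameter values here are illustrative assumptions, not taken from the slides.

```python
import math
import random

N = 8  # a state is a tuple of queen rows, one per column

def num_attacks(state):
    """h: number of pairs of queens attacking each other (0 = solution)."""
    return sum(1 for i in range(N) for j in range(i + 1, N)
               if state[i] == state[j] or abs(state[i] - state[j]) == j - i)

def random_neighbor(state):
    """Move one randomly chosen queen to a random row in its column."""
    s = list(state)
    s[random.randrange(N)] = random.randrange(N)
    return tuple(s)

def simulated_annealing(t0=10.0, cooling=0.999, t_min=1e-3):
    current = tuple(random.randrange(N) for _ in range(N))
    t = t0
    while t > t_min and num_attacks(current) > 0:
        nxt = random_neighbor(current)
        delta = num_attacks(current) - num_attacks(nxt)  # > 0 means improvement
        # Always take improving moves; take "bad" moves with prob e^(delta/T),
        # which decreases as T cools, just as the slides describe.
        if delta > 0 or random.random() < math.exp(delta / t):
            current = nxt
        t *= cooling  # slow geometric cooling; slower schedules raise the
                      # probability of reaching a global optimum
    return current

random.seed(1)
result = simulated_annealing()
print(result, num_attacks(result))
```

A run is not guaranteed to reach h = 0 before the temperature bottoms out; in practice one restarts or cools more slowly, which is the trade-off behind the slides' "if T decreases slowly enough" guarantee.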