CORNELL CS 472 - Foundations of Artificial Intelligence

Foundations of Artificial Intelligence
Local Search
CS472 – Fall 2007
Filip Radlinski

Scaling Up
• So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A*, etc.).
• The current best such algorithms (RBFS / SMA*) can handle search spaces of up to 10^100 states → roughly 500 binary-valued variables.
• But search spaces for some real-world problems might be much bigger, e.g. 10^30,000 states.
• Here, a completely different kind of search is needed → Local Search Methods.

Example
(figures omitted from this preview)

Optimization Problems
• We're interested in the goal state, not in how to get there.
• Optimization problem:
  - State: a vector of variables
  - Objective function: f: state → ℜ
  - Goal: find a state that maximizes or minimizes the objective function
• Examples: VLSI layout, job scheduling, map coloring, N-Queens.

Local Search Methods
• Applicable to optimization problems.
• Basic idea:
  - use a single current state
  - don't save the paths followed
  - generally move only to successors/neighbors of that state
• Generally require a complete state description.

Hill-Climbing Search
function HILL-CLIMBING(problem) returns a solution state
  inputs: problem, a problem
  static: current, a node
  current ← MAKE-NODE(INITIAL-STATE[problem])
  loop do
    next ← a highest-valued successor of current
    if VALUE[next] < VALUE[current] then return current
    current ← next
  end
(figure omitted: evaluation plotted against the current state)

Hill-Climbing Pathologies
(figure omitted: objective-function value over the state space, annotated with the global maximum, a local maximum, a "flat" local maximum, and a shoulder)

Local Maximum Example
(figure omitted from this preview)

Improvements to Basic Local Search
Issue: How to move more quickly to successively higher plateaus and avoid getting "stuck" at local maxima.
Idea: Introduce downhill moves ("noise") to escape from long plateaus (or true local maxima).
Strategies:
• Random-restart hill-climbing ⇒ multiple runs from randomly generated initial states
• Tabu search
• Simulated annealing
• Genetic algorithms

Variations on Hill-Climbing
Random restarts: Simply restart at a new random state after a pre-defined number of steps.
Local beam search: Run the random starting points in parallel, always keeping the k most promising states.
  current ← k initial states
  for t ← 1 to infinity do
    new ← expand every state in current
    if f(best-in-new) < f(best-in-current) then return best-in-current
    current ← best k states in new

Ridges
(figure omitted from this preview)

Simulated Annealing
Idea: Use conventional hill-climbing techniques, but occasionally take a step in a direction other than that in which the rate of change is maximal. As time passes, the probability that a downhill step is taken is gradually reduced, and the size of any downhill step taken is decreased.
Kirkpatrick et al. 1982; Metropolis et al. 1953.

Simulated Annealing Algorithm
  current ← initial state
  for t ← 1 to infinity do
    T ← schedule[t]
    if T = 0 then return current
    next ← randomly selected successor of current
    ΔE ← f(next) – f(current)
    if ΔE > 0 then current ← next
    else current ← next only with probability e^(ΔE/T)

Genetic Algorithms
• Approach mimics evolution.
• Usually presented using a rich (and different) vocabulary: fitness, populations, individuals, genes, crossover, mutations, etc.
• Still, can be viewed quite directly in terms of standard local search.

Genetic Algorithms
Inspired by biological processes that produce genetic change in populations of individuals.
Genetic algorithms (GAs) are local search procedures that usually use the following basic elements:
• A Darwinian notion of fitness: the most fit individuals have the best chance of survival and reproduction.
• "Crossover" operators:
  - Parents are selected.
  - Parents pass their genetic material to children.
• Mutation: individuals are subject to random changes in their genetic material.

Features of Evolution
• High degree of parallelism (many individuals in a population)
• New individuals ("next state / neighboring states"):
  - derived by combining "parents" ("crossover operation")
  - random changes also happen ("mutations")
• Selection of next generation: based on survival of the fittest; the most fit parents tend to be used to generate new individuals.

General Idea
• Maintain a population of individuals (states / strings / candidate solutions).
• Each individual is evaluated using a fitness function, i.e. an objective function. The fitness scores force individuals to compete for the privilege of survival and reproduction.
• Generate a sequence of generations:
  - From the current generation, select pairs of individuals (based on fitness) to generate new individuals, using crossover.
• Introduce some noise through random mutations.
• Hope that average and maximum fitness (i.e. the value to be optimized) increase over time.

GA: High-level Algorithm
(figure omitted: the 8-queens example — (a) initial population of digit strings with fitness scores, (b) fitness function, (c) selection, (d) cross-over, (e) mutation)

Genetic algorithms as search
• Genetic algorithms are local heuristic search algorithms.
• Especially good for problems that have large and poorly understood search spaces.
• Genetic algorithms use a randomized parallel beam search to explore the state space.
• You must be able to define a good fitness function, and of course, a good state representation.

GA(Fitness, Fitness_threshold, p, r, m)
• P ← randomly generate p individuals
• For each i in P, compute Fitness(i)
• While [max_i Fitness(i)] < Fitness_threshold:
  1. Probabilistically select (1-r)·p members of P to add to PS.
  2. Probabilistically choose (r·p)/2 pairs of individuals from P. For each pair, apply crossover and add the offspring to PS.
  3. Mutate m·p random members of PS.
  4. P ← PS
  5. For each i in P, compute Fitness(i)
• Return the individual in P with the highest fitness.

Selecting Most Fit Individuals
Individuals are chosen probabilistically for survival and crossover based on fitness-proportionate selection:
  Pr(i) = Fitness(i) / Σ_{j=1..p} Fitness(j)
Other selection methods include:
• Tournament selection: 2 individuals are selected at random. With probability p, the more fit of the two is selected; with probability (1-p), the less fit is selected.
• Rank selection: The individuals are sorted by fitness, and the probability of selecting an individual is proportional to its rank in this ordering.
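The hill-climbing pseudocode from these notes can be sketched in Python. The toy objective and successor function below are illustrative assumptions; note that the sketch stops on "no strict improvement" (the slides' strict "<" test would loop forever on a plateau).

```python
def hill_climbing(start, neighbors, value):
    """Greedy local search: repeatedly move to the best-valued neighbor.

    Stops (returning a local maximum) as soon as no neighbor strictly
    improves on the current state.
    """
    current = start
    while True:
        best = max(neighbors(current), key=value, default=current)
        if value(best) <= value(current):
            return current  # local maximum or plateau
        current = best

# Hypothetical toy problem: maximize f(x) = -(x - 3)^2 over the integers,
# with neighbors x - 1 and x + 1.
f = lambda x: -(x - 3) ** 2
step = lambda x: [x - 1, x + 1]
print(hill_climbing(0, step, f))  # climbs 0 → 1 → 2 → 3
```

With this convex toy objective the local maximum found is also global; on the pathologies shown above (local maxima, plateaus, ridges), the same loop would stop early, which is what motivates the restart and annealing strategies.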

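The simulated-annealing loop above translates almost line for line into Python. The objective, successor function, and linear cooling schedule below are toy assumptions, not part of the original notes.

```python
import itertools
import math
import random

def simulated_annealing(start, successor, f, schedule):
    """Simulated annealing as in the notes: always accept uphill moves,
    accept a downhill move with probability exp(dE / T)."""
    current = start
    for t in itertools.count(1):
        T = schedule(t)
        if T <= 0:
            return current
        nxt = successor(current)
        dE = f(nxt) - f(current)
        if dE > 0 or random.random() < math.exp(dE / T):
            current = nxt

# Hypothetical toy run: maximize f(x) = -x^2 over the integers with +/-1
# moves and a linear cooling schedule that reaches T = 0 at t = 1000.
random.seed(0)
f = lambda x: -x * x
succ = lambda x: x + random.choice([-1, 1])
sched = lambda t: max(0.0, 2.0 - 0.002 * t)
print(simulated_annealing(20, succ, f, sched))
```

Early on (large T), downhill moves are accepted often, letting the search escape plateaus and local maxima; as T falls the acceptance probability exp(ΔE/T) for ΔE < 0 shrinks toward zero and the loop behaves like plain hill climbing.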

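The GA(Fitness, Fitness_threshold, p, r, m) loop and the fitness-proportionate rule Pr(i) = Fitness(i) / Σ_j Fitness(j) can be sketched together. The bit-string "OneMax" problem, the single-point crossover operator, and all parameter values below are illustrative assumptions.

```python
import random

def ga(fitness, threshold, p, r, m, length, generations=200):
    """Sketch of the GA loop from the notes: fitness-proportionate
    selection, single-point crossover, and point mutation on bit strings."""
    random.seed(1)  # fixed seed so the sketch is reproducible
    P = [[random.randint(0, 1) for _ in range(length)] for _ in range(p)]
    for _ in range(generations):
        scores = [fitness(i) for i in P]
        if max(scores) >= threshold:
            break
        # Survivors: (1-r)*p members, chosen with Pr(i) proportional to fitness.
        PS = [list(ind) for ind in
              random.choices(P, weights=scores, k=round((1 - r) * p))]
        # Crossover: (r*p)/2 pairs, also chosen fitness-proportionately.
        for _ in range(round(r * p / 2)):
            a, b = random.choices(P, weights=scores, k=2)
            cut = random.randrange(1, length)
            PS += [a[:cut] + b[cut:], b[:cut] + a[cut:]]
        # Mutation: flip one random bit in m*p random members of PS.
        for _ in range(round(m * p)):
            ind = random.choice(PS)
            pos = random.randrange(length)
            ind[pos] = 1 - ind[pos]
        P = PS
    return max(P, key=fitness)

# Toy "OneMax" problem: fitness = number of 1 bits, optimum = all ones.
best = ga(fitness=sum, threshold=20, p=30, r=0.6, m=0.2, length=20)
print(sum(best))
```

`random.choices` with `weights=scores` implements the fitness-proportionate formula directly; swapping in tournament or rank selection from the last slide only changes how parents and survivors are drawn.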