Genetic Algorithms

• Evolutionary computation
• Prototypical GA
• An example: GABIL
• Genetic Programming
• Individual learning and population evolution

Evolutionary Computation

1. Computational procedures patterned after biological evolution
2. Search procedures that probabilistically apply search operators to a set of points in the search space

Biological Evolution

Lamarck and others:
• Species "transmute" over time
Darwin and Wallace:
• Consistent, heritable variation among individuals in a population
• Natural selection of the fittest
Mendel and genetics:
• A mechanism for inheriting traits
• genotype → phenotype mapping

GA(Fitness, Fitness_threshold, p, r, m)

• Initialize: P ← p random hypotheses
• Evaluate: for each h in P, compute Fitness(h)
• While max_h Fitness(h) < Fitness_threshold:
  1. Select: probabilistically select (1 − r)·p members of P to add to P_S, where
       Pr(h_i) = Fitness(h_i) / Σ_{j=1}^{p} Fitness(h_j)
  2. Crossover: probabilistically select (r·p)/2 pairs of hypotheses from P. For each pair ⟨h1, h2⟩, produce two offspring by applying the Crossover operator. Add all offspring to P_S.
  3. Mutate: invert a randomly selected bit in m·p random members of P_S
  4. Update: P ← P_S
  5. Evaluate: for each h in P, compute Fitness(h)
• Return the hypothesis from P that has the highest fitness.

Representing Hypotheses

Represent
  (Outlook = Overcast ∨ Rain) ∧ (Wind = Strong)
by
  Outlook  Wind
  011      10
Represent
  IF Wind = Strong THEN PlayTennis = yes
by
  Outlook  Wind  PlayTennis
  111      10    10

Operators for Genetic Algorithms

Single-point crossover:
  Initial strings:  11101001000   00001010101
  Crossover mask:   11111000000
  Offspring:        11101010101   00001001000
Two-point crossover:
  Crossover mask:   00111110000
  Offspring:        11001011000   00101000101
Uniform crossover:
  Crossover mask:   10011010011
  Offspring:        10001000100   01101011001
Point mutation:
  11101001000 → 11101011000

Selecting Most Fit Hypotheses

Fitness-proportionate selection:
  Pr(h_i) = Fitness(h_i) / Σ_{j=1}^{p} Fitness(h_j)
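The GA loop above translates directly into code. The following is a minimal sketch, assuming a fixed-length bitstring representation and a one-max fitness for illustration; the parameter defaults and helper names are hypothetical, not from the slides:

```python
import random

def ga(fitness, threshold, p=20, r=0.6, m=0.05, length=10):
    # Prototypical GA: p = population size, r = fraction replaced by
    # crossover, m = mutation rate (illustrative defaults).
    P = [[random.randint(0, 1) for _ in range(length)] for _ in range(p)]
    while max(fitness(h) for h in P) < threshold:
        total = sum(fitness(h) for h in P)

        def select():
            # fitness-proportionate ("roulette wheel") selection:
            # Pr(h_i) = Fitness(h_i) / sum_j Fitness(h_j)
            pick = random.uniform(0, total)
            acc = 0.0
            for h in P:
                acc += fitness(h)
                if acc >= pick:
                    return h
            return P[-1]

        # Select: (1 - r)*p members carried into the next generation
        Ps = [list(select()) for _ in range(int((1 - r) * p))]
        # Crossover: r*p/2 pairs, two offspring each (single-point crossover)
        for _ in range(int(r * p / 2)):
            h1, h2 = select(), select()
            cut = random.randint(1, length - 1)
            Ps.append(h1[:cut] + h2[cut:])
            Ps.append(h2[:cut] + h1[cut:])
        # Mutate: invert one randomly chosen bit in m*p random members
        for h in random.sample(Ps, int(m * p)):
            i = random.randrange(length)
            h[i] = 1 - h[i]
        P = Ps  # Update; re-Evaluate happens at the top of the loop
    return max(P, key=fitness)

# one-max: fitness = number of 1 bits; stop at 9 of a possible 10
best = ga(fitness=sum, threshold=9)
```

Stopping at a threshold below the maximum keeps the run short; the carried-over members are copied (`list(select())`) so that in-place mutation cannot alias two population slots.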
This can lead to crowding.
Tournament selection:
• Pick h1, h2 at random with uniform probability
• With probability p, select the more fit
Rank selection:
• Sort all hypotheses by fitness
• Probability of selection is proportional to rank

GABIL [DeJong et al. 1993]

Learns disjunctive sets of propositional rules; competitive with C4.5.
Fitness:
  Fitness(h) = (correct(h))²
Representation:
  IF a1 = T ∧ a2 = F THEN c = T;  IF a2 = T THEN c = F
represented by
  a1  a2  c    a1  a2  c
  10  01  1    11  10  0
Genetic operators: ???
• want variable-length rule sets
• want only well-formed bitstring hypotheses

Crossover with Variable-Length Bitstrings

Start with
       a1  a2  c    a1  a2  c
  h1:  10  01  1    11  10  0
  h2:  01  11  0    10  01  0
1. Choose crossover points for h1, e.g., after bits 1, 8
2. Now restrict the points in h2 to those that produce bitstrings with well-defined semantics, e.g., ⟨1, 3⟩, ⟨1, 8⟩, ⟨6, 8⟩.
If we choose ⟨1, 3⟩, the result is
       a1  a2  c
  h3:  11  10  0
       a1  a2  c    a1  a2  c    a1  a2  c
  h4:  00  01  1    11  11  0    10  01  0

GABIL Extensions

Add new genetic operators, also applied probabilistically:
1. AddAlternative: generalize the constraint on a_i by changing a 0 to 1
2. DropCondition: generalize the constraint on a_i by changing every 0 to 1
And add a new field to the bitstring to determine whether to allow these operators:
  a1  a2  c    a1  a2  c    AA  DC
  01  11  0    10  01  0    1   0
So now the learning strategy also evolves!

GABIL Results

Performance of GABIL is comparable to the symbolic rule/tree learning methods C4.5, ID5R, and AQ14.
Average performance on a set of 12 synthetic problems:
• GABIL without AA and DC operators: 92.1% accuracy
• GABIL with AA and DC operators: 95.2% accuracy
• symbolic learning methods ranged from 91.2% to 96.6%

Schemas

How can we characterize the evolution of the population in a GA?
Schema = string containing 0, 1, * ("don't care")
• Typical schema: 10**0*
• Instances of the above schema: 101101, 100000, ...
Characterize the population by the number of instances representing each possible schema
• m(s, t) = number of instances of schema s in the population at time t

Consider Just Selection

• f̄(t) = average fitness of the population at time t
• m(s, t) = number of instances of schema s in the population at time t
• û(s, t) = average fitness of the instances of s at time t
Probability of selecting h in one selection step:
  Pr(h) = f(h) / Σ_{i=1}^{n} f(h_i) = f(h) / (n · f̄(t))
Probability of selecting an instance of s in one step:
  Pr(h ∈ s) = Σ_{h ∈ s ∩ p_t} f(h) / (n · f̄(t)) = (û(s, t) / (n · f̄(t))) · m(s, t)
Expected number of instances of s after n selections:
  E[m(s, t+1)] = (û(s, t) / f̄(t)) · m(s, t)

Schema Theorem

  E[m(s, t+1)] ≥ (û(s, t) / f̄(t)) · m(s, t) · (1 − p_c · d(s)/(l − 1)) · (1 − p_m)^{o(s)}
• m(s, t) = number of instances of schema s in the population at time t
• f̄(t) = average fitness of the population at time t
• û(s, t) = average fitness of the instances of s at time t
• p_c = probability of the single-point crossover operator
• p_m = probability of the mutation operator
• l = length of the individual bit strings
• o(s) = number of defined (non-"*") bits in s
• d(s) = distance between the leftmost and rightmost defined bits in s

Genetic Programming

Population of programs, each represented by a tree, e.g.,
  sin(x) + √(x² + y)
[figure: the corresponding expression tree]

Crossover

[figure: crossover exchanges randomly chosen subtrees between two parent program trees]

Block Problem

[figure: blocks labeled u, n, i, v, e, r, s, a, l]
Goal: spell UNIVERSAL
Terminals:
• CS ("current stack") = name of the top block on the stack, or F
• TB ("top correct block") = name of the topmost correct block on the stack
• NN ("next necessary") = name of the next block needed above TB in the stack

Block Problem (cont.)

Primitive functions:
• (MS x) ("move to stack"): if block x is on the table, moves x to the top of the stack and returns the value T. Otherwise, does nothing and returns the value F.
• (MT x) ("move to table"): if block x is somewhere in the stack, moves the block at the top of the stack to the table and returns the value T.
Otherwise, returns F.
• (EQ x y) ("equal"): returns T if x equals y, and returns F otherwise
• (NOT x): returns T if x = F, else returns F
• (DU x y) ("do until"): executes the expression x repeatedly until the expression y returns the value T

Learned Program

Trained to fit 166 test problems. Using a population of 300 programs, GP found this program after 10 generations:
  (EQ (DU (MT CS) (NOT CS)) (DU (MS NN) (NOT NN)))

Genetic Programming

A more interesting example: designing electronic filter circuits
• Individuals are programs that transform a beginning circuit into a final circuit by adding/subtracting components and connections
• Used a population of 640,000, run on a 64-node parallel processor
• Discovers circuits competitive with the best human designs

GP for Classifying Objects in Images [Teller and Veloso, 1997]

Fitness: based on coverage and accuracy
Representation:
• Primitives include Add, Sub, Mult, Div, Not, Max, Min,
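Returning to the block problem: the learned program can be traced with a small simulator. This is a minimal sketch, not GP infrastructure; the class and helper names are my own, F is modeled as Python False, DU is bounded to avoid infinite loops, and TB is omitted because the learned program never uses it:

```python
GOAL = list("universal")  # target spelling, bottom to top

class BlockWorld:
    # Hypothetical helper class: a stack of blocks plus loose blocks on a table.
    def __init__(self, stack, table):
        self.stack = list(stack)   # bottom ... top
        self.table = set(table)

    def _correct_prefix(self):
        # length of the prefix of the stack that already matches GOAL
        n = 0
        while n < len(self.stack) and self.stack[n] == GOAL[n]:
            n += 1
        return n

    # --- terminals (F modeled as False) ---
    def CS(self):  # name of the top block on the stack, or F
        return self.stack[-1] if self.stack else False

    def NN(self):  # next block needed above the correct prefix, or F
        n = self._correct_prefix()
        if n == len(self.stack) and n < len(GOAL):
            return GOAL[n]
        return False

    # --- primitive functions ---
    def MS(self, x):  # move block x from the table to the top of the stack
        if x in self.table:
            self.table.remove(x)
            self.stack.append(x)
            return True
        return False

    def MT(self, x):  # if x is in the stack, move the TOP block to the table
        if x in self.stack:
            self.table.add(self.stack.pop())
            return True
        return False

def DU(x, y, limit=100):
    # (DU x y): execute x repeatedly until y returns T (bounded)
    for _ in range(limit):
        if y():
            return True
        x()
    return False

def learned_program(w):
    # (EQ (DU (MT CS) (NOT CS)) (DU (MS NN) (NOT NN)))
    left = DU(lambda: w.MT(w.CS()), lambda: not w.CS())   # unstack everything
    right = DU(lambda: w.MS(w.NN()), lambda: not w.NN())  # rebuild the goal
    return left == right                                  # EQ

w = BlockWorld(stack="nau", table="iversl")
ok = learned_program(w)
```

Tracing it shows why the program works: the first DU empties the stack (MT moves the top block down until CS = F), and the second DU stacks the next necessary block until NN = F, i.e., until UNIVERSAL is spelled.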


WSU CSE 6363 - Genetic Algorithms
