
CAP6938 Neuroevolution and Developmental Encoding
Basic Concepts
Dr. Kenneth Stanley
August 23, 2006

We Care About Evolving Complexity
So Why Neural Networks?
•Historical origin of ideas in evolving complexity
•Representative of a broad class of structures
•Illustrative of general challenges
•Clear beneficiary of high complexity

How Do NNs Work?
[Diagram: networks mapping a layer of inputs to a layer of outputs]

How do NNs Work? Example
•Inputs (sensors): Front, Left, Right, Back
•Outputs (effectors/controls): Forward, Left, Right

What Exactly Happens Inside the Network?
•Network activation
[Diagram: inputs X1, X2 connected to hidden neurons H1, H2 by weights w11, w21, w12, w22, leading to outputs out1, out2]
•Activation of neuron j: H_j = Σ_{i=1..n} x_i · w_ij

Recurrent Connections
•Recurrent connections are backward connections in the network
•They allow feedback
•Recurrence is a type of memory
[Diagram: X1, X2 feed hidden neuron H via w11, w21; H connects to out via w_H-out, and out connects back to H via the recurrent weight w_out-H]

Activating Networks of Arbitrary Topology
•The standard method makes no distinction between feedforward and recurrent connections: H_j(t) = Σ_{i=1..n} w_ij · x_i(t−1)
•The network is then usually activated once per time tick
•The number of activations per tick can be thought of as the speed of thought
•Thinking fast is expensive

Arbitrary Topology Activation Controversy
•The standard method is not necessarily the best
•It allows "delay-line" memory and a very simple activation algorithm with no special case for recurrence
•However, "all-at-once" activation utilizes the entire net in each tick with no extra cost
•This issue is unsettled

The Big Questions
•What is the topology that works?
•What are the weights that work?

Problem Dimensionality
•Each connection (weight) in the network is a dimension in a search space
•The space you're in matters: optimization is not the only issue!
•Topology defines the space
[Diagram: a 21-dimensional space vs. a 3-dimensional space]

High Dimensional Space is Hard to Search
•3-dimensional: easy
•100-dimensional: need a good optimization method
•10,000-dimensional: very hard
•1,000,000-dimensional: very, very hard
•100,000,000,000,000-dimensional: forget it

Bad News
•Most interesting solutions are high-dimensional:
–Robotic maid
–World-champion Go player
–Autonomous automobile
–Human-level AI
–Great composer
•We need to get into high-dimensional space

A Solution (preview)
•Complexification: instead of searching directly in the space of the solution, start in a smaller, related space and build up to the solution
•Complexification is inherent in countless examples of social and biological progress

So how do computers optimize those weights anyway?
•Depends on the type of problem:
–Supervised learning: learn from input/output examples
–Reinforcement learning: sparse feedback
–Self-organization: no teacher
•In general, the more feedback you get, the easier the learning problem
•Humans learn language without supervision

Significant Weight Optimization Techniques
•Backpropagation: changes weights based on their contribution to error
•Hebbian learning: changes weights based on firing correlations between connected neurons

Homework:
–Fausett pp. 39–80 (in Chapter 2)
–Fausett pp. 289–316 (in Chapter 6)
–Online intro chapter on RL
–Optional RL
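The neuron-activation formula on the "What Exactly Happens Inside the Network?" slide, H_j = Σ x_i · w_ij, can be sketched in Python for the 2-input, 2-hidden-neuron net the slides draw. The `activate_layer` helper and the weight values are illustrative assumptions, not from the slides (which also leave the choice of squashing function unspecified, so none is applied here):

```python
# Weighted-sum activation H_j = sum_i x_i * w_ij for one layer.
# Weight values below are made up for illustration.

def activate_layer(inputs, weights):
    """weights[i][j] is the weight from input i to neuron j."""
    n_out = len(weights[0])
    return [sum(x * row[j] for x, row in zip(inputs, weights))
            for j in range(n_out)]

x = [1.0, 0.5]                    # X1, X2
w_in_hidden = [[0.2, -0.4],       # w11, w12
               [0.7,  0.1]]       # w21, w22
hidden = activate_layer(x, w_in_hidden)
# hidden[0] = 1.0*0.2 + 0.5*0.7 = 0.55
# hidden[1] = 1.0*(-0.4) + 0.5*0.1 = -0.35
```

In a real network the same routine would be applied again to `hidden` with the hidden-to-output weights to produce out1 and out2.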
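The arbitrary-topology rule H_j(t) = Σ w_ij · x_i(t−1) can likewise be sketched. The tiny topology and weights below are assumptions for illustration; the point is that every neuron reads only previous-tick values, so feedforward and recurrent connections need no special-casing, and a signal takes one tick per connection to propagate, giving the "delay-line" memory the slides mention:

```python
# One activation tick of a network of arbitrary topology:
# each neuron j computes H_j(t) = sum_i w_ij * x_i(t-1),
# reading only the previous tick's values.

def tick(state, weights, inputs):
    """state maps neuron name -> value at tick t-1; weights maps
    (src, dst) -> weight; inputs pins the sensor neurons' values."""
    new_state = {}
    for dst in state:
        if dst in inputs:
            new_state[dst] = inputs[dst]
        else:
            new_state[dst] = sum(w * state[src]
                                 for (src, d), w in weights.items() if d == dst)
    return new_state

weights = {("X1", "H"): 0.5, ("X2", "H"): -0.5,
           ("H", "out"): 1.0, ("out", "H"): 0.25}   # (out -> H) is recurrent
state = {"X1": 0.0, "X2": 0.0, "H": 0.0, "out": 0.0}
for t in range(3):                                  # three time ticks
    state = tick(state, weights, {"X1": 1.0, "X2": 0.0})
# The input reaches H on tick 2 and out only on tick 3: delay-line memory.
```

Running the network for more ticks per time step (the "speed of thought" on the slides) would simply mean calling `tick` several times before reading the outputs.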



UCF CAP 6938 - Neuroevolution and Developmental Encoding
