CS 561, Sessions 24-25

Contents: Final Exam • Artificial Neural Networks and AI • Converging Frameworks • Vision, AI and ANNs • Major Functional Areas • Interconnect • More on Connectivity • Remember? Neurons & synapses • Electron Micrograph of a Real Neuron • Remember? Transmembrane Ionic Transport • Approaches to neural modeling • The Cable Equation • The Hodgkin-Huxley Model • Detailed Neural Modeling • The "basic" biological neuron • Warren McCulloch and Walter Pitts (1943) • Excitatory and Inhibitory Synapses • From Logical Neurons to Finite Automata • Increasing the Realism of Neuron Models • Leaky Integrator Neuron • Leaky Integrator Model • Hopfield Networks • "Energy" of a Neural Network • si: 0 to 1 transition • si: 1 to 0 transition • Minimizing Energy • Associative Memories • Associative memory with Hopfield nets • Self-Organizing Feature Maps • Applications: Classification • Applications: Modelling • Applications: Forecasting • Applications: Novelty Detection • Multi-layer Perceptron Classifier • Classifiers • Example: face recognition • Training • Learning rate • Testing / Evaluation • Capabilities and Limitations of Layered Networks • Optimal Network Architectures • For further information

Slide 1: Final Exam
• Thursday, December 11, 8:00am – 10:00am
• Rooms: pending…
• No books, no questions, work alone; everything seen in class.

Slide 2: Artificial Neural Networks and AI
Artificial Neural Networks provide:
• A new computing paradigm
• A technique for developing trainable classifiers, memories, dimension-reducing mappings, etc.
• A tool to study brain function

Slide 3: Converging Frameworks
• Artificial intelligence (AI): build a "packet of intelligence" into a machine
• Cognitive psychology: explain human behavior by interacting processes (schemas) "in the head" but not localized in the brain
• Brain theory: interactions of components of the brain
  - Computational neuroscience
  - Neurologically constrained models
• Abstracting from them, as both artificial intelligence and cognitive psychology do:
  - Connectionism: networks of trainable "quasi-neurons" providing "parallel distributed" models little constrained by neurophysiology
  - Abstract (computer program or control system) information-processing models

Slide 4: Vision, AI and ANNs
• 1940s: beginning of Artificial Neural Networks
  - Weighted-sum neurons, Σi wi xi (McCulloch & Pitts, 1943)
  - Perceptron learning rule (Rosenblatt, 1962)
  - Backpropagation
  - Hopfield networks (1982)
  - Kohonen self-organizing maps
  - …

Slide 5: Vision, AI and ANNs
• 1950s: beginning of computer vision
  - Aim: give machines the same or better vision capability as ours
  - Drive: AI, robotics applications and factory automation
• Initially: a passive, feedforward, layered and hierarchical process that was just going to provide input to higher reasoning processes (from AI)
• But soon it was realized that this could not handle real images
• 1980s: active vision, making the system more robust by allowing the vision to adapt with the ongoing recognition/interpretation

Slides 6–7: (figures)

Slide 8: Major Functional Areas
• Primary motor: voluntary movement
• Primary somatosensory: tactile, pain, pressure, position, temperature, movement
• Motor association: coordination of complex movements
• Sensory association: processing of multisensory information
• Prefrontal: planning, emotion, judgement
• Speech center (Broca's area): speech production and articulation
• Wernicke's area: comprehension of speech
• Auditory: hearing
• Auditory association: complex auditory processing
• Visual: low-level vision
• Visual association: higher-level vision

Slide 9: Interconnect
(Figure: cortical connectivity, Felleman & Van Essen, 1991)

Slide 10: More on Connectivity
• Which brain area is connected to which other one, and in which directions?

Slide 11: Remember? Neurons & synapses
• Key terms: axon, dendrites, synapses, soma (cell body)

Slide 12: Electron Micrograph of a Real Neuron
(figure)
Slide 13: Remember? Transmembrane Ionic Transport
• Ion channels act as gates that allow or block the flow of specific ions into and out of the cell.

Slide 14: Approaches to neural modeling
• Biologically realistic, detailed models
  - E.g., the cable equation, multi-compartment models
  - The Hodgkin-Huxley model
  - Simulators like NEURON (Yale) or GENESIS (Caltech)
• More abstract models, still keeping realism in mind
  - E.g., the integrate & fire model: simple and low-detail, but preserves spiking behavior
• Highly abstract models, neurons as operators
  - E.g., the McCulloch & Pitts model
  - Classical "neural nets" modeling

Slide 15: The Cable Equation
• See http://diwww.epfl.ch/~gerstner/SPNM/SPNM.html for excellent additional material (some reproduced here).
• Even a piece of passive dendrite can yield complicated differential equations, which have been extensively studied by electrical engineers in the context of coaxial cables (TV antenna cable).

Slide 16: The Hodgkin-Huxley Model
(Figure: example spike trains obtained…)

Slide 17: Detailed Neural Modeling
• A simulator called "NEURON" has been developed at Yale to simulate the Hodgkin-Huxley equations, as well as other membranes/channels/etc. See http://www.neuron.yale.edu/

Slide 18: The "basic" biological neuron
• The soma and dendrites act as the input surface; the axon carries the outputs.
• The tips of the branches of the axon form synapses upon other neurons or upon effectors (though synapses may occur along the branches of an axon as well as at the ends). The arrows indicate the direction of "typical" information flow from inputs to outputs.
(Figure: dendrites, soma, axon with branches and synaptic terminals)

Slide 19: Warren McCulloch and Walter Pitts (1943)
• A McCulloch-Pitts neuron operates on a discrete time scale, t = 0, 1, 2, 3, …, with time tick equal to one refractory period.
• At each time step, an input or output is on or off: 1 or 0, respectively.
• Each connection, or synapse, from the output of one neuron to the input of another has an attached weight wi.
(Figure: McCulloch-Pitts neuron, with inputs x1(t), x2(t), …, xn(t), weights w1, w2, …, wn, and output y(t+1) carried along the axon)

Slide 20: Excitatory and Inhibitory Synapses
• We call a synapse excitatory if wi > 0, and inhibitory if wi < 0.
• We also associate a threshold θ with each neuron.
• A neuron fires (i.e., has value 1 on its output line) at time t+1 if the weighted sum of its inputs at time t reaches the threshold: y(t+1) = 1 iff Σi wi xi(t) ≥ θ.
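The McCulloch-Pitts neuron described above (binary inputs, weighted sum, threshold) can be sketched in a few lines of Python. The particular weights, threshold, and input patterns below are made up for illustration, not taken from the slides:

```python
def mcculloch_pitts_step(inputs, weights, threshold):
    """One discrete time step of a McCulloch-Pitts neuron:
    output 1 iff the weighted input sum reaches the threshold."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# Example: two excitatory synapses (w > 0) and one inhibitory synapse (w < 0).
weights = [1.0, 1.0, -2.0]
threshold = 2.0

print(mcculloch_pitts_step([1, 1, 0], weights, threshold))  # 1: both excitatory inputs on, neuron fires
print(mcculloch_pitts_step([1, 1, 1], weights, threshold))  # 0: inhibition pulls the sum below threshold
```

With these values the inhibitory synapse acts as a veto: whenever its input line is on, the weighted sum cannot reach the threshold, matching the excitatory/inhibitory distinction on the slide.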
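The integrate & fire model listed under "Approaches to neural modeling" adds the temporal dynamics that the McCulloch-Pitts unit abstracts away, while staying far simpler than Hodgkin-Huxley. A minimal leaky integrate-and-fire sketch; all parameter values (time constant, threshold, reset, input current) are chosen arbitrarily for illustration:

```python
def simulate_lif(current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: the membrane potential v leaks toward 0
    with time constant tau, integrates the input current, and emits a
    spike (then resets) whenever it crosses the threshold."""
    v = 0.0
    spikes = []
    for t, i_in in enumerate(current):
        v += dt * (-v / tau + i_in)  # forward-Euler leaky integration
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
    return spikes

# A constant suprathreshold current produces a regular spike train.
print(simulate_lif([0.15] * 50))  # → [10, 21, 32, 43]
```

This preserves the spiking behavior the slide attributes to the model: a constant drive yields periodic spikes, and a weaker drive that never lifts v to the threshold yields none.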