Berkeley COMPSCI 182 - Section Notes

CS 182, Sections 101 - 104
Created by Eva Mok; modified by JGM 2/2/05

Q: What did the hippocampus say during its retirement speech?
A: "Thanks for the memories."

Q: What happens when a neurotransmitter falls in love with a receptor?
A: You get a binding relationship.

Q: What did the Hollywood film director say after he finished making a movie about myelin?
A: "That's a wrap!"

(jokes from http://faculty.washington.edu/chudler/jokes.html)

Announcements
•a2 is out, due next Monday at 11:59pm
  –play with tlearn
  –you can run it on the inst machines, or download it and run it on your own PC (though this may give you some headaches...)
•Quiz on Thursday

Where we stand
•Last week
  –Basic idea of learning, Hebb's rule
  –Psycholinguistics experiments
•This week
  –Spreading activation, triangle nodes
  –Connectionist representations
•Coming up
  –Backprop (review your calculus!)

Quiz!
•What does the Stroop effect show? What was the point of the eye-tracking experiment?
•Why is Hebb's rule not the complete story for the learning that goes on in the brain?
•What's a McCulloch-Pitts neuron? How does it work?
•What does the "They all rose" experiment show?
How can you explain the results computationally?

Two ways of looking at memory (1):
•Memory
  –Declarative
    –Episodic: facts about a situation
    –Semantic: general facts
  –Non-Declarative
    –Procedural: skills

Stroop effect
•it takes longer to say what color a word is printed in if the word names a different color
•suggests an interaction of form and meaning (as opposed to an encapsulated 'language module')

'Word superiority effect'
•it's easier to remember letters if they are seen in the context of a word
•militates against a purely 'bottom-up' model, in which word recognition is built up from letters
•suggestion: there are top-down and bottom-up processes which interact

Eye-tracking Experiment
•Three hypotheses for the eye-tracking results:
  –Cohort theory
  –Neighborhood activation model
  –TRACE (McClelland & Elman)

Two ways of looking at memory (2):
•Memory
  –Short Term Memory: electrical changes
  –Long Term Memory: structural changes

LTP and Hebb's Rule
•Hebb's Rule: neurons that fire together wire together
•Long Term Potentiation (LTP) is the biological basis of Hebb's Rule
•Calcium channels are the key mechanism
[Figure: letter-grid stimuli A X P T / G Q N L / W R V S; connections between co-active units strengthen, the others weaken]

Why is Hebb's rule incomplete?
•here's a contrived example:
[Diagram: tastebud tastes rotten / eats food / drinks water → gets sick]
•should you "punish" all the connections?

The McCulloch-Pitts Neuron
•yj: output from unit j
•wij: weight on the connection from unit j to unit i
•xi: weighted sum of inputs to unit i
•ti: target

  xi = ∑j wij yj
  yi = f(xi)

Let's try an example: the OR function
•Assume you have a threshold function centered at the origin
•What should you set w01, w02, and w0b to, so that you get the right answers for y0?

  i1  i2  y0
  0   0   0
  0   1   1
  1   0   1
  1   1   1

(inputs i1 and i2, bias input b = 1, output y0)

Many answers would work
•y = f(w01 i1 + w02 i2 + w0b b)
•recall the threshold function: the separation happens when w01 i1 + w02 i2 + w0b b = 0
•move things around and you get the separating line in the (i1, i2) plane:

  i2 = −(w01/w02) i1 − (w0b b / w02)

"They all rose"
•triangle nodes: when two of the neurons fire, the third also fires
•a model of spreading activation

How can we model the triangle node with McCulloch-Pitts Neurons?
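The McCulloch-Pitts unit, the OR example, and the triangle-node question above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the course: the function names are invented here, and the particular weights (w01 = w02 = 1, w0b = −0.5 for OR; weight 1 with a −1.5 bias for the triangle node) are just one of the many answers that would work.

```python
def mcp_unit(inputs, weights):
    """McCulloch-Pitts unit: weighted sum followed by a hard threshold at 0.
    xi = sum_j wij * yj ;  yi = 1 if xi > 0 else 0."""
    x = sum(w * y for w, y in zip(weights, inputs))
    return 1 if x > 0 else 0

# --- The OR function: one weight setting that gives the truth table ---
w01, w02, w0b = 1.0, 1.0, -0.5   # bias input b is fixed at 1

def or_unit(i1, i2, b=1):
    return mcp_unit((i1, i2, b), (w01, w02, w0b))

for i1, i2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(i1, i2, "->", or_unit(i1, i2))   # reproduces the y0 column

# --- Triangle node: when two of the three units fire, the third fires too ---
def triangle_step(a, b, c):
    """One step of spreading activation over units A, B, C.
    Each unit receives the other two with weight 1 and bias weight -1.5,
    so it fires exactly when both of the others are active."""
    fire = lambda u, v: mcp_unit((u, v, 1), (1.0, 1.0, -1.5))
    return (max(a, fire(b, c)), max(b, fire(a, c)), max(c, fire(a, b)))

print(triangle_step(1, 1, 0))   # A and B active -> C fires: (1, 1, 1)
```

For these OR weights, the separating line from the "Many answers would work" slide comes out as i2 = −i1 + 0.5, which puts (0, 0) below the line and the other three input pairs above it.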
[Diagram: a triangle node over units A, B, C, and its realization with three McCulloch-Pitts units]

Anonymous Feedback: Lectures
•feel free to comment on each instructor separately
•How is the pace? Do you find the material interesting? Too dense? Too slow?
•What will be most helpful to you in getting the most out of lectures?
•Any particularly confusing topic?

Anonymous Feedback: Sections
•Have sections been useful? Any feedback on our styles of presentation?
•How will sections be most helpful to you?
•Any other

