CORNELL CS 472 - Knowledge-Based Systems

Knowledge-Based Systems
CS472/CS473 – Fall 2005

History of AI: 1943–1969, The Beginnings
• 1943 McCulloch and Pitts show that networks of neurons can compute and learn any function
• 1950 Shannon (1950) and Turing (1953) write chess programs
• 1951 Minsky and Edmonds build the first neural network computer (SNARC)
• 1952 Samuel’s checkers player
• 1956 Dartmouth Conference – Newell and Simon bring a reasoning program, “The Logic Theorist,” which proved theorems
• 1958 McCarthy designs LISP, helps invent time-sharing, and creates the Advice Taker (a domain-independent reasoning system)
• 1960s Microworlds – solving limited problems: SAINT (1963), ANALOGY (1968), STUDENT (1967); blocks world invented
• 1962 Perceptron Convergence Theorem is proved

Example ANALOGY problem (figure not included in this text preview)

History of AI: 1966–1974, Recognizing the Lack of Knowledge
• Herb Simon (1957): a computer chess program will be world chess champion within 10 years.
• Intractable problems, lack of computing power (Lighthill Report, 1973)
• Machine translation
• Limitations in knowledge representation (Minsky and Papert, 1969)
⇒ Knowledge-poor programs

Knowledge Representation
• Human intelligence relies on a lot of background knowledge – the more you know, the easier many tasks become
  – “Knowledge is power”
  – E.g., the SEND + MORE = MONEY puzzle
• Natural language understanding
  – Time flies like an arrow.
  – Fruit flies like a banana.
  – The spirit is willing but the flesh is weak. (English)
  – The vodka is good but the meat is rotten. (Russian)
• Or: plan a trip to L.A.

Knowledge-Based Systems/Agents
• Key components:
  – Knowledge base: a set of sentences expressed in some knowledge representation language
  – Inference/reasoning mechanisms to query what is known and to derive new information or make decisions
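The SEND + MORE = MONEY puzzle mentioned above is a cryptarithm: assign a distinct digit to each letter so that the addition holds. The notes do not give a solver; the following is a minimal brute-force sketch in Python (the one-step deduction that M = 1, since M is the carry out of adding two four-digit numbers, is our own shortcut to keep the search small).

```python
# Brute-force solver for SEND + MORE = MONEY (a sketch, not from the notes).
# Each letter gets a distinct digit; leading digits may not be zero.
from itertools import permutations

def solve_send_more_money():
    # M is the carry out of adding two four-digit numbers, so M must be 1.
    # Permute the remaining seven letters over the remaining nine digits.
    for s, e, n, d, o, r, y in permutations([0, 2, 3, 4, 5, 6, 7, 8, 9], 7):
        if s == 0:  # S is a leading digit
            continue
        send = 1000 * s + 100 * e + 10 * n + d
        more = 1000 * 1 + 100 * o + 10 * r + e
        money = 10000 * 1 + 1000 * o + 100 * n + 10 * e + y
        if send + more == money:
            return send, more, money
    return None

print(solve_send_more_money())  # (9567, 1085, 10652)
```

The puzzle illustrates the “knowledge is power” point: the single observation that M = 1 shrinks the search by a factor of ten, and further carry reasoning (S = 9, O = 0, ...) lets humans solve it with almost no search at all.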
• Natural candidate:
  – a logical language (propositional/first-order)
  – combined with a logical inference mechanism
• How close is this to human thought?
  – In any case, it appears to be a reasonable strategy for machines.

Example: Autonomous Car
State: a k-tuple (PersonInFrontOfCar, Policeman, Policecar, Slippery, YellowLight, RedLight)
Actions: Brake, Accelerate, TurnLeft, etc.
Knowledge base describing when the car should brake:
( PersonInFrontOfCar ⇒ Brake )
((( YellowLight ∧ Policeman ) ∧ ( ¬Slippery )) ⇒ Brake )
( Policecar ⇒ Policeman )
( Snow ⇒ Slippery )
( Slippery ⇒ ¬Dry )
( RedLight ⇒ Brake )

Logic as a Knowledge Representation
• Components of a formal logic:
  – syntax
  – semantics (the link to the world)
  – logical reasoning
    • entailment: α ⊨ β if, in every model in which α is true, β is also true
  – inference algorithm
    • KB ⊢ α, i.e., α is derived from KB
    • should be sound and complete

Soundness and Completeness
Soundness: an inference algorithm that derives only entailed sentences is called sound or truth-preserving.
  KB ⊢ α implies KB ⊨ α
Completeness: an inference algorithm is complete if it can derive any sentence that is entailed.
  KB ⊨ α implies KB ⊢ α
Why are soundness and completeness important?
→ They allow the computer to ignore semantics and “just push symbols”!

Propositional Logic: Syntax
• Propositional symbols: A, B, C, …
• Connectives: ∧, ∨, ¬, ⇒, ⇔
• Sentences
  – Atomic sentence: True, False, or a propositional symbol
  – Complex sentence:
    • ( ¬ Sentence )
    • ( Sentence ∨ Sentence )
    • ( Sentence ∧ Sentence )
    • ( Sentence ⇒ Sentence )
    • ( Sentence ⇔ Sentence )

Example: Autonomous Car
Knowledge base describing when the car should brake:
( PersonInFrontOfCar ⇒ Brake )
∧ ((( YellowLight ∧ Policeman ) ∧ ( ¬Slippery )) ⇒ Brake )
∧ ( Policecar ⇒ Policeman )
∧ ( Snow ⇒ Slippery )
∧ ( Slippery ⇒ ¬Dry )
∧ ( RedLight ⇒ Brake )
Observation from sensors:
YellowLight ∧ ¬RedLight ∧ ¬Snow ∧ Dry ∧ Policecar ∧ ¬PersonInFrontOfCar

Propositional Logic: Semantics
• Model: an assignment of truth values to the propositional symbols
  – Example: m = { P=True, Q=False }
• Note: often called
“assignment” instead of “model”, and “model” is then reserved for an assignment under which the sentence evaluates to true.
• Entailment:
  – α ⊨ β if and only if, in every model in which α is true, β is also true.
• Model checking:
  – To test whether α entails β, enumerate all models and check the truth of α and β in each one.
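The model-checking procedure above can be run directly on the autonomous-car example: encode the knowledge base plus the sensor observation as a Python function of a truth assignment, enumerate all 2^9 models, and check whether Brake holds in every model that satisfies the KB. This is a sketch, not code from the notes; encoding p ⇒ q as (not p) or q is standard.

```python
# Model checking for the car KB: does (KB ∧ observation) entail Brake?
from itertools import product

SYMBOLS = ["PersonInFrontOfCar", "Policeman", "Policecar", "Slippery",
           "YellowLight", "RedLight", "Snow", "Dry", "Brake"]

def implies(p, q):
    # p ⇒ q is equivalent to (not p) or q
    return (not p) or q

def kb(m):
    """True iff model m satisfies the car KB plus the sensor observation."""
    return (implies(m["PersonInFrontOfCar"], m["Brake"])
            and implies(m["YellowLight"] and m["Policeman"]
                        and not m["Slippery"], m["Brake"])
            and implies(m["Policecar"], m["Policeman"])
            and implies(m["Snow"], m["Slippery"])
            and implies(m["Slippery"], not m["Dry"])
            and implies(m["RedLight"], m["Brake"])
            # observation from sensors:
            and m["YellowLight"] and not m["RedLight"] and not m["Snow"]
            and m["Dry"] and m["Policecar"] and not m["PersonInFrontOfCar"])

def entails(alpha):
    """KB ⊨ alpha iff alpha is true in every model in which the KB is true."""
    for values in product([False, True], repeat=len(SYMBOLS)):
        m = dict(zip(SYMBOLS, values))
        if kb(m) and not alpha(m):
            return False
    return True

print(entails(lambda m: m["Brake"]))  # True: the car should brake
print(entails(lambda m: m["Snow"]))   # False
```

The chain of reasoning the enumeration implicitly reproduces: Policecar forces Policeman, Dry forces ¬Slippery (contrapositive of Slippery ⇒ ¬Dry), and then YellowLight ∧ Policeman ∧ ¬Slippery forces Brake.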


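The recursive grammar on the syntax slide maps directly onto a small recursive evaluator. The nested-tuple encoding below is our own illustration (not from the notes): a string is an atomic sentence, and complex sentences are tuples whose first element names the connective.

```python
# Evaluator for the recursive sentence grammar (our tuple encoding, assumed):
#   symbol:               "P"
#   ( ¬ Sentence ):       ("not", s)
#   ( S1 ∧ S2 ):          ("and", s1, s2)
#   ( S1 ∨ S2 ):          ("or", s1, s2)
#   ( S1 ⇒ S2 ):          ("implies", s1, s2)
#   ( S1 ⇔ S2 ):          ("iff", s1, s2)

def evaluate(sentence, m):
    """Truth value of sentence in model m (a dict: symbol -> bool)."""
    if isinstance(sentence, str):
        return m[sentence]
    op, *args = sentence
    vals = [evaluate(a, m) for a in args]  # evaluate subsentences recursively
    if op == "not":
        return not vals[0]
    if op == "and":
        return vals[0] and vals[1]
    if op == "or":
        return vals[0] or vals[1]
    if op == "implies":
        return (not vals[0]) or vals[1]
    if op == "iff":
        return vals[0] == vals[1]
    raise ValueError(f"unknown connective: {op}")

# ( RedLight ⇒ Brake ) in a model where RedLight is true and Brake is false:
print(evaluate(("implies", "RedLight", "Brake"),
               {"RedLight": True, "Brake": False}))  # False
```

Because the semantics is defined compositionally, the evaluator needs one case per connective and one for atomic sentences; this is exactly what lets model checking “ignore semantics and just push symbols.”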