Uncertainty
Russell and Norvig: Chapter 13
CMSC 424, Fall 2003
Based on material from Jean-Claude Latombe, Daphne Koller and Nir Friedman

Uncertain Agent
[Diagram: an agent connected to its environment through sensors and actuators, with question marks over its sensors, actuators, and internal model of the world.]

An Old Problem …

Types of Uncertainty
• Uncertainty in prior knowledge
  E.g., some causes of a disease are unknown and are not represented in the background knowledge of a medical-assistant agent.
• Uncertainty in actions
  E.g., actions are represented with relatively short lists of preconditions, while these lists are in fact arbitrarily long. For example, to drive my car in the morning:
  • It must not have been stolen during the night
  • It must not have flat tires
  • There must be gas in the tank
  • The battery must not be dead
  • The ignition must work
  • I must not have lost the car keys
  • No truck should obstruct the driveway
  • I must not have suddenly become blind or paralytic
  Etc. Not only would it not be possible to list all of them, but would trying to do so be efficient?
• Uncertainty in perception
  E.g., sensors do not return exact or complete information about the world; a robot never knows exactly its position.
  [Robot photo courtesy R. Chatila]

Sources of uncertainty:
1. Ignorance
2. Laziness (efficiency?)

What we call uncertainty is a summary of all that is not explicitly taken into account in the agent's KB.

Questions
• How to represent uncertainty in knowledge?
• How to perform inferences with uncertain knowledge?
• Which action to choose under uncertainty?

How do we deal with uncertainty?
• Implicit:
  • Ignore what you are uncertain of when you can
  • Build procedures that are robust to uncertainty
• Explicit:
  • Build a model of the world that describes uncertainty about its state, dynamics, and observations
  • Reason about the effect of actions given the model

Handling Uncertainty
Approaches:
1. Default reasoning
2. Worst-case reasoning
3. Probabilistic reasoning

Default Reasoning
Creed: The world is fairly normal. Abnormalities are rare.
So, an agent assumes normality until there is evidence to the contrary.
E.g., if an agent sees a bird x, it assumes that x can fly, unless it has evidence that x is a penguin, an ostrich, a dead bird, a bird with broken wings, …

Representation in Logic
BIRD(x) ∧ ¬ABF(x) ⇒ FLIES(x)
PENGUIN(x) ⇒ ABF(x)
BROKEN-WINGS(x) ⇒ ABF(x)
BIRD(Tweety)
…
Default rule: unless ABF(Tweety) can be proven True, assume it is False.
But what to do if several defaults are contradictory? Which ones to keep? Which one to reject?
This was a very active research field in the '80s → non-monotonic logics: defaults, circumscription, closed-world assumption. Applications to databases.
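A minimal procedural sketch of this default rule (not from the slides; the knowledge-base layout, the helper functions, and the fact set are illustrative only). ABF(x) is assumed False unless it can be derived from the known facts, so adding a fact can withdraw an earlier conclusion, which is exactly the non-monotonic behavior described above.

# Hypothetical mini knowledge base: ground facts as (predicate, constant) pairs.
facts = {("BIRD", "Tweety")}

# Predicates whose presence lets us derive ABF(x), mirroring
# PENGUIN(x) => ABF(x) and BROKEN-WINGS(x) => ABF(x).
ABF_PREMISES = ["PENGUIN", "BROKEN-WINGS"]

def abnormal(x):
    """ABF(x): provable only if some abnormality premise is a known fact."""
    return any((pred, x) in facts for pred in ABF_PREMISES)

def flies(x):
    """BIRD(x) and not ABF(x) => FLIES(x), with ABF(x) False by default."""
    return ("BIRD", x) in facts and not abnormal(x)

print(flies("Tweety"))            # True: no evidence that Tweety is abnormal
facts.add(("PENGUIN", "Tweety"))  # new evidence arrives
print(flies("Tweety"))            # False: the default conclusion is withdrawn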
Worst-Case Reasoning
Creed: Just the opposite! The world is ruled by Murphy's Law.
Uncertainty is defined by sets, e.g., the set of possible outcomes of an action, or the set of possible positions of a robot.
The agent assumes the worst case, and chooses the action that maximizes a utility function under that worst case.
Example: adversarial search.

Probabilistic Reasoning
Creed: The world is not divided between "normal" and "abnormal", nor is it adversarial. Possible situations have various likelihoods (probabilities).
The agent has probabilistic beliefs (pieces of knowledge with associated probabilities, or strengths) and chooses its actions to maximize the expected value of some utility function.
(The two creeds are contrasted in the small decision sketch at the end of these notes.)

How do we represent uncertainty?
We need to answer several questions:
• What do we represent and how do we represent it? What language do we use to represent our uncertainty? What are the semantics of our representation?
• What can we do with the representations? What queries can be answered? How do we answer them?
• How do we construct a representation? Can we ask an expert? Can we learn from data?

Probability
A well-known and well-understood framework for uncertainty:
• Clear semantics
• Provides principled answers for: combining evidence, predictive and diagnostic reasoning, incorporation of new evidence
• Intuitive (at some level) to human experts
• Can be learned
• Axioms of probability

Notion of Probability
• The probability of a proposition A is a real number P(A) between 0 and 1
• P(True) = 1 and P(False) = 0
• P(A ∨ B) = P(A) + P(B) - P(A ∧ B)

Example: You drive on Rt 1 to UMD often, and you notice that 70% of the time there is a traffic slowdown at the intersection of Paint Branch & Rt 1. The next time you plan to drive on Rt 1, you will believe that the proposition "there is a slowdown at the intersection of PB & Rt 1" is True with probability 0.7.

From the axioms:
P(A ∨ ¬A) = P(A) + P(¬A) - P(A ∧ ¬A)
P(True) = P(A) + P(¬A) - P(False)
1 = P(A) + P(¬A)
So: P(A) = 1 - P(¬A)
(A small numerical check of these identities appears at the end of these notes.)

Frequency Interpretation
Draw a ball from an urn containing n balls of the same size, r red and s yellow.
The probability that the proposition A = "the ball is red" is true corresponds to the relative frequency with which we expect to draw a red ball → P(A) = ?
(See the simulation sketch at the end of these notes.)

Subjective Interpretation
There are many situations in which there is no objective frequency interpretation:
• On a windy day, just before paragliding from the top of El Capitan, you say "there is probability 0.05 that I am going to die"
• You have worked hard on your AI class and you believe that the probability that you will get an A is 0.9

Bayesian
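The axioms on the Notion of Probability slide, and the derivation of P(A) = 1 - P(¬A), can be checked numerically on a small, fully enumerated sample space. This is a sketch, not part of the slides; the two-coin world and the events A and B are arbitrary choices made only for illustration.

from itertools import product

# A tiny, fully enumerated world: two fair coin flips, each outcome equally likely.
outcomes = list(product(["H", "T"], repeat=2))
p = 1.0 / len(outcomes)                       # uniform probability of each outcome

def prob(event):
    """P(event): sum the probabilities of the outcomes where the event holds."""
    return sum(p for o in outcomes if event(o))

A = lambda o: o[0] == "H"                     # "the first flip is heads"
B = lambda o: o[1] == "H"                     # "the second flip is heads"

# P(A v B) = P(A) + P(B) - P(A ^ B)
lhs = prob(lambda o: A(o) or B(o))
rhs = prob(A) + prob(B) - prob(lambda o: A(o) and B(o))
print(lhs, rhs)                               # 0.75 0.75

# P(A) = 1 - P(not A), as derived on the slide
print(prob(A), 1 - prob(lambda o: not A(o)))  # 0.5 0.5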

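For the Frequency Interpretation slide, the relative-frequency reading of P(A) can be made concrete with a simulation. Again a sketch, not from the slides: the counts r = 3 and s = 7 are made up. The empirical frequency of drawing a red ball approaches r/n as the number of draws grows, which is the quantity the slide's question points toward.

import random

r, s = 3, 7                        # hypothetical urn: 3 red balls, 7 yellow balls
urn = ["red"] * r + ["yellow"] * s
n = len(urn)

trials = 100_000
reds = sum(1 for _ in range(trials) if random.choice(urn) == "red")

print(reds / trials)               # empirical frequency of "the ball is red" ...
print(r / n)                       # ... close to the relative frequency r/n = 0.3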

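Finally, the contrast between worst-case reasoning and probabilistic reasoning can be illustrated on the Rt 1 slowdown example. In this sketch only the 0.7 slowdown probability comes from the slides; the alternative route and all travel times are invented numbers, and minimizing travel time plays the role of maximizing a (negated) utility.

# (probability, travel time in minutes) for each candidate route.
routes = {
    "Rt 1":      [(0.7, 35), (0.3, 20)],   # slowdown at Paint Branch vs clear road
    "back road": [(1.0, 33)],              # longer but predictable
}

def expected_time(outcomes):
    """Probabilistic reasoning: weight each outcome by its probability."""
    return sum(p * t for p, t in outcomes)

def worst_time(outcomes):
    """Worst-case reasoning: assume Murphy's Law and look only at the worst outcome."""
    return max(t for _, t in outcomes)

best_expected = min(routes, key=lambda r: expected_time(routes[r]))
best_worst = min(routes, key=lambda r: worst_time(routes[r]))

print(best_expected)   # 'Rt 1'      (expected 30.5 min vs 33 min)
print(best_worst)      # 'back road' (worst case 33 min vs 35 min)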