Uncertain Reasoning
CPSC 315 – Programming Studio
Spring 2009
Project 2, Lecture 6

Reasoning in Complex Domains or Situations
- Reasoning often involves moving from evidence about the world to decisions
- Systems almost never have access to the whole truth about their environment
- Reasons for lack of knowledge
  - Cost/benefit trade-off in knowledge engineering
    - Less likely, less influential factors are often not included in the model
  - No complete theory of the domain
    - Complete theories are few and far between
  - Incomplete knowledge of the situation
    - Acquiring all knowledge of the situation is impractical

Forms of Uncertain Reasoning
- Partially-believed domain features
  - E.g. chance of rain = 80%
  - Probability (the focus of today's lecture)
  - Other (we will return to this)
- Partially-true domain features
  - E.g. cloudy = 0.8
  - Fuzzy logic (outside the scope of this class)

Making Decisions to Meet Goals
- Decision theory = Probability theory + Utility theory
- Decisions – the outcome of the system's reasoning, the actions to take or avoid
- Probability – how the system reasons
- Utility – the system's goals / preferences

Quick Question
- You go to the doctor and are tested for a disease. The test is 98% accurate if you have the disease. 3.6% of the population has the disease, while 4% of the population tests positive.
- How likely is it that you have the disease?

Quick Question 2
- You go to the doctor and are tested for a disease. The test is 98% accurate if you have the disease. 3.6% of the population has the disease, while 7% of the population tests positive.
- How likely is it that you have the disease?

Basics of Probability
- Unconditional or prior probability
  - Degree of belief in something being true in the absence of any information
  - P(cavity = true) = 0.1, or P(cavity) = 0.1
  - Implies P(not cavity) = 0.9

Basics of Probability
- Unconditional or prior probability
  - Can be defined over a set of values
    - P(Weather = sunny) = 0.7
    - P(Weather = rain) = 0.2
    - P(Weather = cloudy) = 0.08
    - P(Weather = snow) = 0.02
  - Note: Weather can have only a single value – the system must know that rain and snow imply clouds

Basics of Probability
- Conditional or posterior probability
  - Degree of belief in something being true given knowledge about the situation
  - P(cavity | toothache) = 0.8
- Mathematically, we know
  - P(a | b) = P(a ^ b) / P(b)
- Requires the system to know the unconditional probability of combinations of features
  - This knowledge grows exponentially with the size of the feature set

Bayes' Rule
- Remember: P(a | b) = P(a ^ b) / P(b)
- Can be rewritten as
  - P(a ^ b) = P(a | b) * P(b)
- Swapping the a and b features yields
  - P(a ^ b) = P(b | a) * P(a)
- Thus
  - P(b | a) * P(a) = P(a | b) * P(b)
- Rewriting, we get Bayes' Rule
  - P(b | a) = P(a | b) * P(b) / P(a)

Reasoning with Bayes' Rule
- Bayes' Rule
  - P(b | a) = P(a | b) * P(b) / P(a)
- Example
  - P(disease) = 0.036
  - P(test) = 0.04
  - P(test | disease) = 0.98
  - P(disease | test) = ?

Reasoning with Bayes' Rule
- Bayes' Rule
  - P(b | a) = P(a | b) * P(b) / P(a)
- Example
  - P(disease) = 0.036
  - P(test) = 0.04
  - P(test | disease) = 0.98
  - P(disease | test) = P(test | disease) * P(disease) / P(test) = 0.98 * 0.036 / 0.04 = 88.2%
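The Quick Question answers can be checked with a few lines of code. Below is a minimal Python sketch; only the probabilities come from the slides, while the function and variable names are assumptions chosen for illustration.

def bayes_posterior(p_evidence_given_h, p_h, p_evidence):
    """Bayes' Rule: P(h | evidence) = P(evidence | h) * P(h) / P(evidence)."""
    return p_evidence_given_h * p_h / p_evidence

# Numbers from the Quick Question slides
p_disease = 0.036            # P(disease): prior probability of having the disease
p_test = 0.04                # P(test): fraction of the population testing positive
p_test_given_disease = 0.98  # P(test | disease): test accuracy for those with the disease

print(bayes_posterior(p_test_given_disease, p_disease, p_test))  # ~0.882 (88.2%)
print(bayes_posterior(p_test_given_disease, p_disease, 0.07))    # ~0.504 (50.4%), Quick Question 2
print(bayes_posterior(0.90, p_disease, p_test))                  # 0.81 (81%), the more-false-negatives case below

The same three calls reproduce the worked examples on the next two slides.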
Reasoning with Bayes' Rule
- What if the test has more false positives?
  - Still 98% accurate for those with the disease
- Example
  - P(disease) = 0.036
  - P(test) = 0.07
  - P(test | disease) = 0.98
  - P(disease | test) = P(test | disease) * P(disease) / P(test) = 0.98 * 0.036 / 0.07 ≈ 50.4%

Reasoning with Bayes' Rule
- What if the test has more false negatives?
  - Now only 90% accurate for those with the disease
- Example
  - P(disease) = 0.036
  - P(test) = 0.04
  - P(test | disease) = 0.90
  - P(disease | test) = P(test | disease) * P(disease) / P(test) = 0.90 * 0.036 / 0.04 = 81%

Combining Evidence
- What happens when we have more than one piece of evidence?
- Example: toothache and the tool catches on the tooth
  - P(cavity | toothache ^ catch) = ?
- Problem: toothache and catch are not independent
  - If someone has a toothache, there is a greater chance they will have a catch, and vice versa

Independence of Events
- Independence of features / events
  - Features / events cannot be used to predict each other
  - Example: the values rolled on two separate dice
  - Example: hair color and food preference
- Probabilistic reasoning works because systems divide the domain into independent sub-domains
  - The system does not need the exponentially increasing data to understand interactions
  - Unfortunately, non-independent sub-domains can still be huge (have many interacting features)

Conditional Independence
- What happens when we have more than one piece of evidence?
- Example: toothache and the tool catches on the tooth
  - P(cavity | toothache ^ catch) = ?
- Conditional independence
  - Assume an indirect relationship
  - Example: toothache and catch are both caused by the cavity but not by any other feature
  - Then P(toothache ^ catch | cavity) = P(toothache | cavity) * P(catch | cavity)

Conditional Independence
- This lets us say
  - P(toothache ^ catch | cavity) = P(toothache | cavity) * P(catch | cavity)
- P(cavity | toothache ^ catch)
  - = P(toothache ^ catch | cavity) * P(cavity) / P(toothache ^ catch)
  - = P(toothache | cavity) * P(catch | cavity) * P(cavity) / P(toothache ^ catch)
- Avoids requiring the system to have data on all permutations (see the code sketch at the end of this section)
- Difficulty: how true is the conditional-independence assumption?
  - What about a chipped or cracked tooth?

Human Reasoning
- Studies show that people, without training and prompting, do not reason probabilistically
- People make incorrect inferences when confronted with probabilities like those of the last few slides
- If asked for all prior and posterior probabilities, they will posit systems with rather large inconsistencies

Human Reasoning
- Studies show that people, without training, do not reason probabilistically
- Some systems have used non-probabilistic forms of uncertain reasoning
  - Qualitative categories rather than numbers
    - Must be true, highly likely, likely, some chance, unlikely, virtually impossible, impossible
  - Rules for how these combine based on human
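To make the conditional-independence trick concrete, here is a minimal Python sketch of combining two pieces of evidence (toothache and catch) about one hypothesis (cavity). Only P(cavity) = 0.1 comes from the slides; the function, its parameter names, and the remaining probabilities are hypothetical values chosen for illustration.

def posterior_given_two_effects(p_e1_given_h, p_e2_given_h,
                                p_e1_given_not_h, p_e2_given_not_h,
                                p_h):
    """P(h | e1 ^ e2), assuming e1 and e2 are conditionally independent
    given h (and given not-h)."""
    p_not_h = 1.0 - p_h
    # Conditional independence: P(e1 ^ e2 | h) = P(e1 | h) * P(e2 | h)
    unnorm_h = p_e1_given_h * p_e2_given_h * p_h
    unnorm_not_h = p_e1_given_not_h * p_e2_given_not_h * p_not_h
    # Normalizing over h / not-h plays the role of dividing by P(e1 ^ e2)
    return unnorm_h / (unnorm_h + unnorm_not_h)

# Cavity example: P(cavity) = 0.1 is from the slides; the four conditional
# probabilities below are made-up illustrative values.
print(posterior_given_two_effects(0.9, 0.8, 0.1, 0.15, 0.1))  # ~0.84

Note that the system only needs the per-evidence conditionals P(e | h), not a table over every combination of evidence, which is exactly the saving the slides describe.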