Pitt CS 2710 - Uncertainty


Uncertainty
Chapter 13

Outline
• Uncertainty
• Probability
• Syntax and Semantics
• Inference
• Independence and Bayes' Rule

Uncertainty
Let action At = leave for airport t minutes before the flight.
Will At get me there on time?

Problems:
1. partial observability (road state, other drivers' plans, etc.)
2. noisy sensors (traffic reports)
3. uncertainty in action outcomes (flat tire, etc.)
4. immense complexity of modeling and predicting traffic

Hence a purely logical approach either
1. risks falsehood: "A25 will get me there on time", or
2. leads to conclusions that are too weak for decision making:
   "A25 will get me there on time if there's no accident on the bridge and it doesn't rain and my tires remain intact, etc."
(A1440 might reasonably be said to get me there on time, but I'd have to stay overnight in the airport …)

But…
• A decision must be made!
• No intelligent system can afford to consider all eventualities, wait until all the data is in and complete, or try all possibilities to see what happens.

Quick Overview of Reasoning Systems
• Logic: true or false, nothing in between. No uncertainty.
• Non-monotonic logic: true or false, but new information can change it.
• Probability: degree of belief, but in the end it's either true or false.
• Fuzzy logic: degree of belief; allows overlapping of true and false states.

Examples
• Logic: all birds fly.
• Non-monotonic: Tweety flies, since he's a bird and there is no evidence that he doesn't fly.

Probability
Probabilistic assertions summarize effects of
– laziness: failure to enumerate exceptions, qualifications, etc.
– ignorance: lack of relevant facts, initial conditions, etc.

Subjective probability:
• Probabilities relate propositions to the agent's own state of knowledge,
  e.g., P(A25 | no reported accidents) = 0.06.
  These are not assertions about the world.
• Probabilities of propositions change with new evidence,
  e.g., P(A25 | no reported accidents, 5 a.m.) = 0.15.
Making decisions under uncertainty
Suppose I believe the following:
P(A25 gets me there on time | …)   = 0.04
P(A90 gets me there on time | …)   = 0.70
P(A120 gets me there on time | …)  = 0.95
P(A1440 gets me there on time | …) = 0.9999
• Which action to choose? It depends on my preferences for missing the flight vs. time spent waiting, etc.
– Utility theory is used to represent and infer preferences.
– Decision theory = probability theory + utility theory.

Syntax
• Basic element: random variable.
• Similar to propositional logic: possible worlds are defined by assignments of values to random variables.
• Boolean random variables, e.g., Cavity (do I have a cavity?).
• Discrete random variables, e.g., Weather is one of <sunny, rainy, cloudy, snow>.
• Domain values must be exhaustive and mutually exclusive.
• Elementary propositions are constructed by assigning a value to a random variable, e.g., Weather = sunny, Cavity = false (abbreviated as ¬cavity).
• Complex propositions are formed from elementary propositions and standard logical connectives, e.g., Weather = sunny ∨ Cavity = false.

Syntax
• Atomic event: a complete specification of the state of the world about which the agent is uncertain.
E.g., if the world consists of only two Boolean variables, Cavity and Toothache, then there are 4 distinct atomic events:
Cavity = false ∧ Toothache = false
Cavity = false ∧ Toothache = true
Cavity = true ∧ Toothache = false
Cavity = true ∧ Toothache = true
• Atomic events are mutually exclusive and exhaustive.

Axioms of probability
• For any propositions A, B:
– 0 ≤ P(A) ≤ 1
– P(true) = 1 and P(false) = 0
– P(A ∨ B) = P(A) + P(B) − P(A ∧ B)

Prior probability
• Prior or unconditional probabilities of propositions, e.g., P(Cavity = true) = 0.1 and P(Weather = sunny) = 0.72, correspond to belief prior to the arrival of any (new) evidence.
• A probability distribution gives values for all possible assignments:
P(Weather) = <0.72, 0.1, 0.08, 0.1> (normalized, i.e., sums to 1)
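The axioms can be checked mechanically by summing atomic-event probabilities. Below is a minimal Python sketch over the four atomic events of the two Boolean variables Cavity and Toothache; the individual entries are hypothetical numbers, chosen only so that P(Cavity = true) = 0.1 as above and the total is 1.

```python
# Atomic events of a tiny world (Cavity, Toothache) with hypothetical probabilities.
joint = {
    (True,  True):  0.04,  # cavity ∧ toothache
    (True,  False): 0.06,  # cavity ∧ ¬toothache
    (False, True):  0.01,
    (False, False): 0.89,
}
# Atomic events are exhaustive and mutually exclusive, so P(true) = 1:
assert abs(sum(joint.values()) - 1.0) < 1e-9

def P(holds):
    """Probability of a proposition = sum over the atomic events where it holds."""
    return sum(p for event, p in joint.items() if holds(*event))

p_cavity    = P(lambda cav, tooth: cav)            # 0.04 + 0.06 = 0.10
p_toothache = P(lambda cav, tooth: tooth)          # 0.04 + 0.01 = 0.05
p_and       = P(lambda cav, tooth: cav and tooth)
p_or        = P(lambda cav, tooth: cav or tooth)

# Third axiom: P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
assert abs(p_or - (p_cavity + p_toothache - p_and)) < 1e-9
```

The same pattern (a predicate over atomic events plus a sum) reappears below as inference by enumeration.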
• The joint probability distribution for a set of random variables gives the probability of every atomic event over those random variables. P(Weather, Cavity) is a 4 × 2 table of values:

                  Weather = sunny   rainy   cloudy   snow
  Cavity = true            0.144    0.02    0.016    0.02
  Cavity = false           0.576    0.08    0.064    0.08

• Every question about a domain can be answered from the joint distribution.

How could we estimate the full joint distribution?
Parameter estimates are provided by expert knowledge, statistics on data samples, or a combination of both.
Suppose you have 20 variables.
Expert knowledge: P(X1=0, X2=0, …, X13=1, …, X20=0) vs. P(X1=0, X2=0, …, X13=0, …, X20=0)?
Data samples: practically speaking, we typically don't have enough data.

Conditional probability
• Conditional or posterior probabilities, e.g., P(cavity | toothache) = 0.8, i.e., given that a toothache is all I know.
• (Notation for conditional distributions: P(Cavity | Toothache) is a 2-element vector of 2-element vectors.)
• If we know more, e.g., cavity is also given, then we have P(cavity | toothache, cavity) = 1.
• New evidence may be irrelevant, allowing simplification, e.g., P(cavity | toothache, sunny) = P(cavity | toothache) = 0.8.
• This kind of inference, sanctioned by domain knowledge, is crucial.

More on Conditional Probabilities
• P(CarWontStart | NoGas)
– This predicts a symptom based on an underlying cause.
– These can be generated empirically (drain N gas tanks, see how many cars start) or using expert knowledge.
• P(NoGas | CarWontStart)
– Diagnosis: we have a symptom and want to predict the cause. This is what the system wants to determine.
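Since every question about the domain can be answered from the joint, the P(Weather, Cavity) table above already fixes every marginal and conditional. A short sketch; the conditional step uses the definition P(a | b) = P(a ∧ b) / P(b), which is stated formally in the next section.

```python
# Marginals and a conditional read directly off the P(Weather, Cavity) joint table.
joint_wc = {  # entries copied from the 4 × 2 table above
    ("sunny", True): 0.144, ("rainy", True): 0.02,
    ("cloudy", True): 0.016, ("snow", True): 0.02,
    ("sunny", False): 0.576, ("rainy", False): 0.08,
    ("cloudy", False): 0.064, ("snow", False): 0.08,
}

def marginal_weather(w):
    """P(Weather = w): sum out Cavity."""
    return joint_wc[(w, True)] + joint_wc[(w, False)]

def p_cavity_given_weather(w):
    """P(Cavity = true | Weather = w) = P(Weather = w ∧ cavity) / P(Weather = w)."""
    return joint_wc[(w, True)] / marginal_weather(w)

print(round(marginal_weather("sunny"), 3))        # 0.72, matching the prior above
print(round(p_cavity_given_weather("sunny"), 3))  # 0.144 / 0.72 = 0.2
```

Note that the recovered marginal P(Weather = sunny) = 0.72 agrees with the prior stated earlier, as it must: the joint determines all of its marginals.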
Conditional probability
• Definition of conditional probability:
P(a | b) = P(a ∧ b) / P(b)   if P(b) > 0
• The product rule gives an alternative formulation:
P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
• A general version holds for whole distributions, e.g.,
P(Weather, Cavity) = P(Weather | Cavity) P(Cavity)
• (View this as a set of 4 × 2 equations, not matrix multiplication.)
• The chain rule is derived by successive application of the product rule:
P(X1, …, Xn) = P(X1, …, Xn−1) P(Xn | X1, …, Xn−1)
             = P(X1, …, Xn−2) P(Xn−1 | X1, …, Xn−2) P(Xn | X1, …, Xn−1)
             = …
             = Π_{i=1..n} P(Xi | X1, …, Xi−1)

Inference by enumeration
• Start with the joint probability distribution.
• For any proposition φ, sum the atomic events where it is true: P(φ) = Σ_{ω : ω ⊨ φ} P(ω)
• P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
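The joint table this computation reads from appears as a figure in the original slides and is not reproduced in this preview; the sketch below uses the standard three-Boolean-variable dentist example (Toothache, Cavity, Catch) with the usual textbook entries, which are consistent with the P(toothache) sum shown above.

```python
# Inference by enumeration over a full joint of three Boolean variables.
# Table entries are the usual textbook values for the dentist example; the
# figure itself is missing from this preview, so treat them as assumed.
full_joint = {
    # (toothache, catch, cavity): probability
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def prob(phi):
    """P(φ) = Σ_{ω : ω ⊨ φ} P(ω): sum the atomic events where φ holds."""
    return sum(p for omega, p in full_joint.items() if phi(*omega))

p_toothache = prob(lambda tooth, catch, cav: tooth)
print(round(p_toothache, 3))   # 0.108 + 0.012 + 0.016 + 0.064 = 0.2

# The same routine answers conditional queries via the definition above:
p_cavity_given_toothache = prob(lambda tooth, catch, cav: tooth and cav) / p_toothache
print(round(p_cavity_given_toothache, 3))  # 0.12 / 0.2 = 0.6
```

Enumeration is exact but scales as O(2^n) atomic events for n Boolean variables, which is why the later material (independence, Bayes' rule) matters.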

