
Outline
Uncertainty
Probability to the Rescue
Probability
Making decisions under uncertainty
Syntax
Axioms of probability
Prior probability
Conditional probability
Inference by enumeration
Normalization
Independence
Conditional independence
Conditional independence (cont.)
Bayes' Rule
Bayes' Rule and conditional independence
The Naive Bayes Classifier
Learning a Naive Bayes Classifier
Boosted Naive Bayes Classifier
Summary

Uncertainty (Chapter 13)

Let action At = leave for airport t minutes before the flight.
Will At get me there on time?
Problems:
1. partial observability (road state, other drivers' plans, etc.)
2. noisy sensors (traffic reports)
3. uncertainty in action outcomes (flat tire, etc.)
4. immense complexity of modeling and predicting traffic
Hence a purely logical approach either
1. risks falsehood: "A25 will get me there on time", or
2. leads to conclusions that are too weak for decision making: "A25 will get me there on time if there's no accident on the bridge and it doesn't rain and my tires remain intact, etc."
(A1440 might reasonably be said to get me there on time, but I'd have to stay overnight in the airport…)

Probability to the Rescue
•Probability
–Models the agent's degree of belief, given the available evidence.
–A25 will get me there on time with probability 0.04.
Probability in AI models our ignorance, not the true state of the world. The statement "With probability 0.7 I have a cavity" means: I either have a cavity or I don't, but I don't have all the necessary information to know this for sure.

Probability
Probabilistic assertions summarize the effects of
–laziness: failure to enumerate exceptions, qualifications, etc.
–ignorance: lack of relevant facts, initial conditions, etc.
Subjective probability:
•Probabilities relate propositions to the agent's own state of knowledge,
e.g., P(A25 | no reported accidents) = 0.06
•Probabilities of propositions change with new evidence:
e.g., P(A25 | no reported accidents, 5 a.m.)
= 0.15

Making decisions under uncertainty
Suppose I believe the following:
P(A25 gets me there on time | …) = 0.04
P(A90 gets me there on time | …) = 0.70
P(A120 gets me there on time | …) = 0.95
P(A1440 gets me there on time | …) = 0.9999
•Which action should I choose? That depends on my preferences for missing the flight vs. time spent waiting, etc.
–Utility theory is used to represent and infer preferences.
–Decision theory = probability theory + utility theory

Syntax
•Basic element: random variable
•Similar to propositional logic: possible worlds are defined by assignments of values to random variables.
•Boolean random variables, e.g., Cavity (do I have a cavity?)
•Discrete random variables, e.g., Weather is one of <sunny, rainy, cloudy, snow>
•Domain values must be exhaustive and mutually exclusive.
•Elementary propositions are constructed by assigning a value to a random variable, e.g., Weather = sunny, Cavity = false (abbreviated as ¬cavity)
•Complex propositions are formed from elementary propositions and standard logical connectives, e.g., Weather = sunny ∨ Cavity = false

Syntax
•Atomic event: a complete specification of the state of the world about which the agent is uncertain.
E.g., if the world consists of only two Boolean variables Cavity and Toothache, there are 4 distinct atomic events:
Cavity = false ∧ Toothache = false
Cavity = false ∧ Toothache = true
Cavity = true ∧ Toothache = false
Cavity = true ∧ Toothache = true
•Atomic events are mutually exclusive and exhaustive.

Axioms of probability
•For any propositions A, B:
–0 ≤ P(A) ≤ 1
–P(true) = 1 and P(false) = 0
–P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
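The "worlds" reading of these axioms can be checked directly in a few lines: assign equal weight to a finite set of possible worlds, define a proposition as a test on worlds, and the axioms (including inclusion–exclusion) fall out of counting. A minimal sketch, with illustrative worlds and propositions not taken from the slides:

```python
from fractions import Fraction

# Eight equally likely possible worlds over three Boolean variables.
worlds = [(a, b, c) for a in (True, False)
                    for b in (True, False)
                    for c in (True, False)]

def P(holds):
    """P(proposition) = (# worlds where it holds) / (total # worlds)."""
    return Fraction(sum(1 for w in worlds if holds(w)), len(worlds))

A = lambda w: w[0]   # proposition: the first variable is true
B = lambda w: w[1]   # proposition: the second variable is true

assert 0 <= P(A) <= 1
assert P(lambda w: True) == 1 and P(lambda w: False) == 0
# Inclusion-exclusion: P(A or B) = P(A) + P(B) - P(A and B)
assert P(lambda w: A(w) or B(w)) == P(A) + P(B) - P(lambda w: A(w) and B(w))
# A tautology holds in every world; a contradiction holds in none.
assert P(lambda w: A(w) or not A(w)) == 1
assert P(lambda w: A(w) and not A(w)) == 0
```

Using exact `Fraction` arithmetic makes the equalities hold exactly rather than up to floating-point error.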
A proposition true in all worlds has probability 1, e.g., P(a ∨ ¬a) = 1; a proposition false in all worlds has probability 0: P(a ∧ ¬a) = 0.
Think of P(a) as the number of worlds in which a is true divided by the total number of possible worlds.

Prior probability
•Prior or unconditional probabilities of propositions,
e.g., P(Cavity = true) = 0.1 and P(Weather = sunny) = 0.72,
correspond to belief prior to the arrival of any (new) evidence.
•A probability distribution gives values for all possible assignments:
P(Weather) = <0.72, 0.1, 0.08, 0.1> (normalized, i.e., sums to 1)
•The joint probability distribution for a set of random variables gives the probability of every atomic event on those random variables.
P(Weather, Cavity) = a 4 × 2 matrix of values:

                 Weather = sunny   rainy   cloudy   snow
  Cavity = true            0.144   0.02    0.016    0.02
  Cavity = false           0.576   0.08    0.064    0.08

•Every question about a domain can be answered from the joint distribution.

Conditional probability
•Conditional or posterior probabilities,
e.g., P(cavity | toothache) = 0.8,
i.e., given that toothache is all I know.
•(Notation for conditional distributions:
P(Cavity | Toothache) = a 2-element vector of 2-element vectors)
•If we know more, e.g., cavity is also given, then we have
P(cavity | toothache, cavity) = 1
•New evidence may be irrelevant, allowing simplification, e.g.,
P(cavity | toothache, sunny) = P(cavity | toothache) = 0.8
•This kind of inference, sanctioned by domain knowledge, is crucial.

Conditional probability
•Definition of conditional probability:
P(a | b) = P(a ∧ b) / P(b), if P(b) > 0
•The product rule gives an alternative formulation:
P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
•A general version holds for whole distributions, e.g.,
P(Weather, Cavity) = P(Weather | Cavity) P(Cavity)
(View this as a set of 4 × 2 equations, not a matrix multiplication.)
•The chain rule is derived by successive application of the product rule:
P(X1, …, Xn) = P(X1, …, Xn−1) P(Xn | X1, …, Xn−1)
  = P(X1, …, Xn−2) P(Xn−1 | X1, …, Xn−2) P(Xn | X1, …, Xn−1)
  = …
  = Πi=1..n P(Xi | X1, …, Xi−1)
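The definition of conditional probability and the product rule are easy to verify numerically on any small joint distribution. The sketch below uses a made-up joint over two Boolean variables (the numbers are illustrative, not the slides' weather/cavity table):

```python
# Joint distribution P(a, b) over two Boolean variables (illustrative numbers).
joint = {
    (True, True): 0.12, (True, False): 0.08,
    (False, True): 0.28, (False, False): 0.52,
}

def P(a=None, b=None):
    """Marginal or joint probability, summing out unspecified variables."""
    return sum(p for (av, bv), p in joint.items()
               if (a is None or av == a) and (b is None or bv == b))

def P_cond(a, b):
    """Conditional P(a | b) = P(a, b) / P(b), defined when P(b) > 0."""
    return P(a=a, b=b) / P(b=b)

# Product rule: P(a, b) = P(a | b) P(b)
assert abs(P(a=True, b=True) - P_cond(True, True) * P(b=True)) < 1e-12
# The two-variable chain rule IS the product rule; for n variables the
# chain rule just applies the product rule n-1 times.
```

Here the check is only up to floating-point tolerance, since the division and multiplication do not round-trip exactly.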
Inference by enumeration
•Start with the joint probability distribution.
•For any proposition φ, sum the probabilities of the atomic events where it is true:
P(φ) = Σω : ω ⊨ φ P(ω)
[Toy example from the slide's figure: seven equally likely worlds, with a true in three of them: P(a) = 1/7 + 1/7 + 1/7 = 3/7.]
•From the joint distribution over Toothache, Cavity, and Catch:
P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
•Conditional probabilities can be computed the same way:
P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
  = (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064) = 0.4

Normalization
•The denominator can be viewed as a normalization constant α:
P(Cavity | toothache) = α P(Cavity, toothache)
  = α [P(Cavity, toothache, catch) + P(Cavity, toothache, ¬catch)]
  = α [<0.108, 0.016> + <0.012, 0.064>]
  = α <0.12, 0.08> = <0.6, 0.4>
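Inference by enumeration and the normalization trick both fit in a few lines of code. The full joint below is the textbook's Toothache/Catch/Cavity example; the slides' text shows only the toothache = true entries, so the toothache = false entries here are filled in from AIMA Chapter 13 (Figure 13.3):

```python
# Full joint over (toothache, catch, cavity); AIMA Chapter 13 example values.
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def enumerate_prob(holds):
    """P(phi) = sum of P(omega) over atomic events omega where phi holds."""
    return sum(p for omega, p in joint.items() if holds(omega))

toothache = lambda w: w[0]
cavity    = lambda w: w[2]

p_toothache = enumerate_prob(toothache)    # 0.108+0.012+0.016+0.064 = 0.2

# Normalization: P(Cavity | toothache) = alpha * P(Cavity, toothache)
unnorm = [enumerate_prob(lambda w: toothache(w) and cavity(w) == v)
          for v in (True, False)]          # [0.12, 0.08]
alpha = 1.0 / sum(unnorm)                  # alpha = 1 / P(toothache) = 5
posterior = [alpha * u for u in unnorm]    # [0.6, 0.4], matching the slide
```

Note that α never has to be computed from P(toothache) separately: summing the unnormalized vector and dividing recovers it, which is exactly why the denominator can be treated as a constant.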


UCI ICS 171 - Uncertainty 171105
