UCI ICS 171 - Uncertainty

Uncertainty (Chapter 13)

Let action At = leave for airport t minutes before the flight.
Will At get me there on time?

Problems:
1. Partial observability (road state, other drivers' plans, noisy sensors)
2. Uncertainty in action outcomes (flat tire, etc.)
3. Immense complexity of modeling and predicting traffic

Hence a purely logical approach either
1. risks falsehood: "A25 will get me there on time", or
2. leads to conclusions that are too weak for decision making: "A25 will get me there on time if there's no accident on the bridge and it doesn't rain and my tires remain intact, etc."
(A1440 might reasonably be said to get me there on time, but I'd have to stay overnight in the airport.)

Probability to the Rescue
• Probability
  – Models the agent's degree of belief, given the available evidence.
  – "A25 will get me there on time with probability 0.04."
• Probability in AI models our ignorance, not the true state of the world. The statement "With probability 0.7 I have a cavity" means: I either have a cavity or I don't, but I don't have all the information necessary to know this for sure.

Probability
Subjective probability:
• Probabilities relate propositions to the agent's own state of knowledge,
  e.g., P(A25 | no reported accidents at 3 a.m.) = 0.06
• Probabilities of propositions change with new evidence,
  e.g., P(A25 | no reported accidents at 5 a.m.) = 0.15

Making decisions under uncertainty
Suppose I believe the following:
  P(A25 gets me there on time | …) = 0.04
  P(A90 gets me there on time | …) = 0.70
  P(A120 gets me there on time | …) = 0.95
  P(A1440 gets me there on time | …) = 0.9999
• Which action should I choose? It depends on my preferences for missing the flight vs. time spent waiting, etc.
  – Utility theory is used to represent and infer preferences.
  – Decision theory = probability theory + utility theory

Syntax
• Basic element: random variable
• Similar to propositional logic: possible worlds are defined by assignments of values to random variables.
• Boolean random variables,
  e.g., Cavity (do I have a cavity?)
• Discrete random variables,
  e.g., Weather is one of <sunny, rainy, cloudy, snow>
• An elementary proposition is constructed by assigning a value to a random variable,
  e.g., Weather = sunny, Cavity = false (abbreviated as ¬cavity)
• Complex propositions are formed from elementary propositions with the standard logical connectives,
  e.g., Weather = sunny ∨ Cavity = false
• Convention: a capital letter denotes a random variable; a lower-case letter denotes a single value.

Syntax
• Atomic event: a complete specification of the state of the world about which the agent is uncertain, i.e., a full assignment of values to all variables in the universe (a unique single world). E.g., if the world consists of only two Boolean variables, Cavity and Toothache, then there are 4 distinct atomic events:
    Cavity = false ∧ Toothache = false
    Cavity = false ∧ Toothache = true
    Cavity = true ∧ Toothache = false
    Cavity = true ∧ Toothache = true
• Atomic events are mutually exclusive and exhaustive: if some atomic event is true, then all other atomic events are false, and there is always some atomic event that is true. Hence exactly one atomic event is true.

Axioms of probability
• For any propositions A, B:
  – 0 ≤ P(A) ≤ 1
  – P(true) = 1 and P(false) = 0; e.g., a ∨ ¬a is true in all worlds, so P(a ∨ ¬a) = 1, while a ∧ ¬a is false in all worlds, so P(a ∧ ¬a) = 0
  – P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
• Think of P(A) as the number of worlds in which A is true divided by the total number of possible worlds.

Prior probability
• Prior or unconditional probabilities of propositions,
  e.g., P(Cavity = true) = 0.1 and P(Weather = sunny) = 0.72,
  correspond to belief prior to the arrival of any (new) evidence.
• A probability distribution gives values for all possible assignments:
  P(Weather) = <0.72, 0.1, 0.08, 0.1> (normalized, i.e., sums to 1)
• The joint probability distribution for a set of random variables gives the probability of every atomic event over those random variables. P(Weather, Cavity) is a 4 × 2 matrix of values:

                     Weather = sunny   rainy   cloudy   snow
      Cavity = true       0.144        0.02    0.016    0.02
      Cavity = false      0.576        0.08    0.064    0.08

• Every question about a domain can be answered by the joint distribution.

Conditional probability
• Conditional or posterior probabilities,
  e.g., P(cavity | toothache) = 0.8,
  i.e., given that Toothache = true is all I know.
• Note that P(Cavity | Toothache) is a 2 × 2 array, normalized over columns.
• If we know more, e.g., cavity is also given, then we have
  P(cavity | toothache, cavity) = 1
• New evidence may be irrelevant, allowing simplification, e.g.,
  P(cavity | toothache, sunny) = P(cavity | toothache) = 0.8

Conditional probability
• Definition of conditional probability:
  P(a | b) = P(a ∧ b) / P(b), if P(b) > 0
• The product rule gives an alternative formulation:
  P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
• Bayes' rule:
  P(a | b) = P(b | a) P(a) / P(b)
• A general version holds for whole distributions, e.g.,
  P(Weather, Cavity) = P(Weather | Cavity) P(Cavity)
  (View this as a set of 4 × 2 equations, not matrix multiplication.)
• The chain rule is derived by successive application of the product rule:
  P(X1, …, Xn) = P(X1, …, Xn-1) P(Xn | X1, …, Xn-1)
               = P(X1, …, Xn-2) P(Xn-1 | X1, …, Xn-2) P(Xn | X1, …, Xn-1)
               = …
               = ∏i=1..n P(Xi | X1, …, Xi-1)

Inference by enumeration
• Start with the joint probability distribution.
• For any proposition a, sum the atomic events (worlds) ω in which a is true:
  P(a) = Σ{ω : a is true in ω} P(ω)
  E.g., in a toy example with 7 equally likely worlds, 3 of which satisfy a:
  P(a) = 1/7 + 1/7 + 1/7 = 3/7
• With the toothache/cavity/catch joint distribution:
  P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
• We can also compute conditional probabilities:
  P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
                         = (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064)
                         = 0.4

Normalization
• The denominator can be viewed as a normalization constant α:
  P(Cavity | toothache) = α P(Cavity, toothache)
    = α [P(Cavity, toothache, catch) + P(Cavity, toothache, ¬catch)]
    = α [<0.108, 0.016> + <0.012, 0.064>]
    = α <0.12, 0.08>
    = <0.6, 0.4>
• General idea: compute …
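The enumeration and normalization steps above can be sketched in a few lines of Python. This is a minimal illustration, not code from the course: the joint table below contains only the Toothache = true entries quoted in the slides (the Toothache = false entries are not shown in this preview, and are not needed for queries that condition on toothache).

```python
# Inference by enumeration: P(a) is the sum of the probabilities of the
# atomic events (worlds) in which proposition a holds.
joint = {
    # (cavity, toothache, catch): probability, from the slides
    (True,  True, True):  0.108,
    (True,  True, False): 0.012,
    (False, True, True):  0.016,
    (False, True, False): 0.064,
}

def prob(pred):
    """Sum the atomic events in which the proposition pred holds."""
    return sum(p for world, p in joint.items() if pred(world))

# P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
p_toothache = prob(lambda w: w[1])

# P(cavity | toothache) via the definition P(a | b) = P(a ∧ b) / P(b)
p_cavity_given_tooth = prob(lambda w: w[0] and w[1]) / p_toothache

# Normalization: build the unnormalized vector
# <P(cavity ∧ toothache), P(¬cavity ∧ toothache)> and divide by its
# sum, i.e., multiply by the constant α = 1 / P(toothache).
unnorm = [prob(lambda w: w[0] and w[1]),
          prob(lambda w: (not w[0]) and w[1])]
alpha = 1 / sum(unnorm)
dist = [alpha * x for x in unnorm]   # <0.6, 0.4> as in the slide

print(round(p_toothache, 3), [round(x, 3) for x in dist])
```

Note that α never has to be computed from P(toothache) directly: summing the unnormalized entries and rescaling gives the same result, which is the point of the normalization trick.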
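The product rule and Bayes' rule can likewise be checked numerically against the P(Weather, Cavity) table given earlier. This is an illustrative sketch, not part of the original slides; it treats each of the 4 × 2 cells as one equation, as the slide suggests.

```python
# Verify the product rule and Bayes' rule on the 4 x 2 joint
# P(Weather, Cavity) from the slides.
weathers = ["sunny", "rainy", "cloudy", "snow"]
joint = {  # joint[(weather, cavity)]
    ("sunny", True): 0.144, ("rainy", True): 0.02,
    ("cloudy", True): 0.016, ("snow", True): 0.02,
    ("sunny", False): 0.576, ("rainy", False): 0.08,
    ("cloudy", False): 0.064, ("snow", False): 0.08,
}

# Marginals by enumeration (summing out the other variable).
def p_weather(w):
    return sum(joint[(w, c)] for c in (True, False))

def p_cavity(c):
    return sum(joint[(w, c)] for w in weathers)

# Product rule, one equation per cell:
# P(w ∧ c) = P(w | c) P(c), where P(w | c) = P(w ∧ c) / P(c).
for w in weathers:
    for c in (True, False):
        p_w_given_c = joint[(w, c)] / p_cavity(c)
        assert abs(p_w_given_c * p_cavity(c) - joint[(w, c)]) < 1e-12

# Bayes' rule for one cell: P(c | w) = P(w | c) P(c) / P(w).
c, w = True, "sunny"
lhs = joint[(w, c)] / p_weather(w)                          # P(c | w)
rhs = (joint[(w, c)] / p_cavity(c)) * p_cavity(c) / p_weather(w)
print(round(p_weather("sunny"), 3), round(lhs, 3))
```

In this particular table the posterior P(cavity | sunny) comes out equal to the marginal P(cavity), an instance of evidence being irrelevant, just as in the slide's P(cavity | toothache, sunny) = P(cavity | toothache).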