Berkeley COMPSCI 188 - Lecture 13: Probability

CS 188: Artificial Intelligence, Fall 2008
Lecture 13: Probability
10/9/2008
Dan Klein – UC Berkeley

Today
- Probability
- Random variables
- Joint and conditional distributions
- Inference, Bayes' rule
- Independence
- You'll need all of this for the next few weeks, so make sure you go over it!

Uncertainty
- General situation:
  - Evidence: the agent knows certain things about the state of the world (e.g., sensor readings or symptoms)
  - Hidden variables: the agent needs to reason about other aspects of the world (e.g., where an object is or what disease is present)
  - Model: the agent knows something about how the known variables relate to the unknown variables
- Probabilistic reasoning gives us a framework for managing our beliefs and knowledge

Random Variables
- A random variable is some aspect of the world about which we (may) have uncertainty
  - R = Is it raining?
  - D = How long will it take to drive to work?
  - L = Where am I?
- We denote random variables with capital letters
- As in a CSP, each random variable has a domain
  - R in {true, false} (often written {r, ¬r})
  - D in [0, ∞)
  - L in the set of possible locations

Probabilities
- We generally compute conditional probabilities
  - P(on time | no reported accidents) = 0.90
- These represent the agent's beliefs given the evidence
- Probabilities change with new evidence:
  - P(on time | no reported accidents, 5 a.m.) = 0.95
  - P(on time | no reported accidents, 5 a.m., raining) = 0.80
- Observing new evidence causes beliefs to be updated

Probabilistic Models
- CSPs:
  - Variables with domains
  - Constraints state whether assignments are possible
  - Ideally, only certain variables directly interact

    T     W     possible?
    hot   sun   T
    hot   rain  F
    cold  sun   F
    cold  rain  T

- Probabilistic models:
  - (Random) variables with domains
  - Assignments are called outcomes
  - A joint distribution says how likely each assignment (outcome) is
  - Normalized: the entries sum to 1.0
  - Ideally, only certain variables directly interact

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3

Joint Distributions
- A joint distribution over a set of random variables X1, ..., Xn specifies a real number for each assignment (outcome): P(X1 = x1, ..., Xn = xn)
- Size of the distribution with n variables of domain size d: d^n entries
- Must obey: every entry is in [0, 1], and the entries sum to 1
- For all but the smallest distributions, impractical to write out

Events
- An event is a set E of outcomes
- From a joint distribution, we can calculate the probability of any event: P(E) = sum of P(x1, ..., xn) over the outcomes in E
  - Probability that it's hot AND sunny?
  - Probability that it's hot?
  - Probability that it's hot OR sunny?
- Typically, the events we care about are partial assignments, like P(T = hot)

Marginal Distributions
- Marginal distributions are sub-tables which eliminate variables
- Marginalization (summing out): combine collapsed rows by adding, e.g. P(t) = Σ_w P(t, w)
- From the joint P(T, W) above:

    T     P          W     P
    hot   0.5        sun   0.6
    cold  0.5        rain  0.4

Conditional Distributions
- Conditional distributions are probability distributions over some variables, given fixed values of others
- From the joint P(T, W) above:

    P(W | T = hot):   sun 0.8, rain 0.2
    P(W | T = cold):  sun 0.4, rain 0.6

- A simple relation connects joint and conditional probabilities; in fact, it is taken as the definition of conditional probability:

    P(x | y) = P(x, y) / P(y)
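The weather example above is small enough to manipulate directly in code. Below is a minimal Python sketch (my own illustration, not code from the lecture) that stores the joint P(T, W) as a dict and computes event probabilities, marginals, and a conditional distribution as the slides describe; the function names event_prob, marginal, and conditional are assumptions of this sketch.

```python
from collections import defaultdict

# Joint distribution P(T, W) from the slides, keyed by (t, w) outcomes.
joint = {
    ('hot',  'sun'):  0.4, ('hot',  'rain'): 0.1,
    ('cold', 'sun'):  0.2, ('cold', 'rain'): 0.3,
}

def event_prob(event):
    """P(E): sum the joint over the outcomes in the event (a set of (t, w) pairs)."""
    return sum(p for outcome, p in joint.items() if outcome in event)

def marginal(index):
    """Marginalize (sum out): keep the variable at `index` (0 = T, 1 = W), add up the rest."""
    table = defaultdict(float)
    for outcome, p in joint.items():
        table[outcome[index]] += p
    return dict(table)

def conditional(query_index, evidence_index, evidence_value):
    """P(query | evidence): select rows matching the evidence, then renormalize
    (the definition P(x | y) = P(x, y) / P(y))."""
    selected = defaultdict(float)
    for outcome, p in joint.items():
        if outcome[evidence_index] == evidence_value:
            selected[outcome[query_index]] += p
    z = sum(selected.values())                 # the sum of the selection is P(evidence)
    return {value: p / z for value, p in selected.items()}

print(event_prob({('hot', 'sun')}))                    # P(hot AND sunny) -> 0.4
print(event_prob({('hot', 'sun'), ('hot', 'rain')}))   # P(hot) -> 0.5
print(marginal(0))                                     # P(T): {'hot': 0.5, 'cold': 0.5}
print(marginal(1))                                     # P(W): sun 0.6, rain 0.4 (up to float rounding)
print(conditional(1, 0, 'cold'))                       # P(W | T=cold): {'sun': 0.4, 'rain': 0.6}
```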
Normalization Trick
- A trick to get a whole conditional distribution at once:
  - Select the joint probabilities matching the evidence
  - Normalize the selection (make it sum to one)
- Example, computing P(T | W = rain) from the joint P(T, W):

    Select (W = rain):     Normalize:
    T     P                T     P
    hot   0.1              hot   0.25
    cold  0.3              cold  0.75

- Why does this work? Because the sum of the selection is P(evidence)!

The Product Rule
- Sometimes we have a conditional distribution and want the joint, and sometimes the reverse: P(x, y) = P(x | y) P(y)
- Example (see the Bayes' rule sketch below):

    P(W):       sun 0.8, rain 0.2
    P(D | W):   wet|sun 0.1, dry|sun 0.9, wet|rain 0.7, dry|rain 0.3
    P(D, W):    wet sun 0.08, dry sun 0.72, wet rain 0.14, dry rain 0.06

Bayes' Rule
- Two ways to factor a joint distribution over two variables:

    P(x, y) = P(x | y) P(y) = P(y | x) P(x)

- Dividing, we get ("That's my rule!"):

    P(x | y) = P(y | x) P(x) / P(y)

- Why is this at all helpful?
  - It lets us build one conditional from its reverse
  - Often one conditional is tricky but the other one is simple
  - It is the foundation of many systems we'll see later (e.g., ASR, MT)
- In the running for most important AI equation!

Inference with Bayes' Rule
- Example: diagnostic probability from causal probability:

    P(cause | effect) = P(effect | cause) P(cause) / P(effect)

- Example: m is meningitis, s is stiff neck; compute P(m | s) from the givens P(s | m), P(m), and P(s)
- Note: the posterior probability of meningitis is still very small
- Note: you should still get stiff necks checked out! Why?

Battleship
- Let's say we have two distributions:
  - A prior distribution over ship locations, P(L); say it is uniform (for now)
  - A sensor reading model, P(R | L), given by some known black-box process, e.g. P(R = yellow | L = (1,1)) = 0.1
  - For now, assume the reading is always for the lower-left corner
- We can calculate the posterior distribution over ship locations using Bayes' rule: P(L | R = r) is proportional to P(R = r | L) P(L)

Inference by Enumeration
- Queries against the joint P(S, T, W): P(sun)? P(sun | winter)? P(sun | winter, hot)? (see the enumeration sketch after these notes)

    S       T     W     P
    summer  hot   sun   0.30
    summer  hot   rain  0.05
    summer  cold  sun   0.10
    summer  cold  rain  0.05
    winter  hot   sun   0.10
    winter  hot   rain  0.05
    winter  cold  sun   0.15
    winter  cold  rain  0.20

Independence
- Two variables are independent in a joint distribution if P(x, y) = P(x) P(y) for all x, y
- This says that their joint distribution factors into a product of two simpler distributions
- Usually variables aren't independent!
- But we can use independence as a modeling assumption
  - Independence can be a simplifying assumption
  - Empirical joint distributions are at best "close" to independent
  - What could we assume for {Weather, Traffic, Cavity}?
- Independence is like something from CSPs: what?

Example: Independence
- N fair, independent coin flips: each flip has P(H) = 0.5 and P(T) = 0.5, and the joint distribution is the product of the individual distributions

Example: Independence?
- Arbitrary joint distributions can be poorly modeled by independent factors
- Actual joint P(T, W): hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3
- Product of the marginals, P(T) P(W), with P(T): hot 0.5, cold 0.5 and P(W): sun 0.6, rain 0.4: hot sun 0.3, hot rain 0.2, cold sun 0.3, cold rain 0.2
- The two tables differ, so T and W are not independent here (a check of this is sketched below)

Conditional Independence
- Warning: we're going to use domain knowledge, not laws of …
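The queries on the Inference by Enumeration slide can be answered mechanically. Here is a small Python sketch (my own illustration, not code from the lecture) that stores the joint P(S, T, W) from that slide and answers P(query | evidence) by selecting the rows that match the evidence, summing out the other variables, and normalizing, i.e. the normalization trick; the function name enumerate_query and the VARS ordering are assumptions of this sketch.

```python
# Joint distribution P(S, T, W) from the Inference by Enumeration slide.
joint = {
    ('summer', 'hot',  'sun'):  0.30,
    ('summer', 'hot',  'rain'): 0.05,
    ('summer', 'cold', 'sun'):  0.10,
    ('summer', 'cold', 'rain'): 0.05,
    ('winter', 'hot',  'sun'):  0.10,
    ('winter', 'hot',  'rain'): 0.05,
    ('winter', 'cold', 'sun'):  0.15,
    ('winter', 'cold', 'rain'): 0.20,
}
VARS = ('S', 'T', 'W')   # variable order used in the outcome tuples above

def enumerate_query(query_var, evidence):
    """Return P(query_var | evidence) by enumeration.

    `evidence` is a dict such as {'S': 'winter'}; select the joint rows that
    agree with it, sum out everything except query_var, then normalize.
    """
    idx = VARS.index(query_var)
    selected = {}
    for outcome, p in joint.items():
        if all(outcome[VARS.index(var)] == val for var, val in evidence.items()):
            selected[outcome[idx]] = selected.get(outcome[idx], 0.0) + p
    z = sum(selected.values())              # = P(evidence)
    return {value: p / z for value, p in selected.items()}

print(enumerate_query('W', {}))                           # P(W): sun 0.65, rain 0.35
print(enumerate_query('W', {'S': 'winter'}))              # P(W | winter): sun 0.5, rain 0.5
print(enumerate_query('W', {'S': 'winter', 'T': 'hot'}))  # P(W | winter, hot): sun ~0.67, rain ~0.33
```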

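For the Product Rule and Bayes' Rule slides, here is a small illustrative sketch (again mine, not from the lecture): it rebuilds the joint P(D, W) from P(W) and P(D | W) via the product rule, then uses Bayes' rule to invert the conditional and obtain P(W | D = wet); the helper name bayes is an assumption of this sketch.

```python
# P(W) and P(D | W) from the Product Rule slide.
p_w = {'sun': 0.8, 'rain': 0.2}
p_d_given_w = {
    ('wet', 'sun'): 0.1, ('dry', 'sun'): 0.9,
    ('wet', 'rain'): 0.7, ('dry', 'rain'): 0.3,
}

# Product rule: P(d, w) = P(d | w) * P(w)
p_dw = {(d, w): p_d_given_w[(d, w)] * p_w[w] for (d, w) in p_d_given_w}
print(p_dw)  # {('wet','sun'): 0.08, ('dry','sun'): 0.72, ('wet','rain'): 0.14, ('dry','rain'): 0.06}

# Bayes' rule: P(w | d) = P(d | w) * P(w) / P(d), with P(d) obtained by summing out w.
def bayes(d):
    p_d = sum(p_d_given_w[(d, w)] * p_w[w] for w in p_w)   # P(d) = sum_w P(d | w) P(w)
    return {w: p_d_given_w[(d, w)] * p_w[w] / p_d for w in p_w}

print(bayes('wet'))  # P(W | D=wet): sun ~0.36, rain ~0.64 -- the reverse conditional, built from P(D | W)
```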

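Finally, a sketch of the check behind the two independence example slides: it computes the marginals of a joint, forms their product, and compares it entry by entry to the joint itself. The helper name is_independent and the floating-point tolerance are assumptions added for this illustration.

```python
import math

# True joint P(T, W) from the "Example: Independence?" slide.
joint = {
    ('hot',  'sun'):  0.4, ('hot',  'rain'): 0.1,
    ('cold', 'sun'):  0.2, ('cold', 'rain'): 0.3,
}

def is_independent(joint, tol=1e-9):
    """Check whether P(t, w) == P(t) * P(w) for every entry of the joint."""
    p_t, p_w = {}, {}
    for (t, w), p in joint.items():
        p_t[t] = p_t.get(t, 0.0) + p
        p_w[w] = p_w.get(w, 0.0) + p
    return all(math.isclose(p, p_t[t] * p_w[w], abs_tol=tol)
               for (t, w), p in joint.items())

print(is_independent(joint))  # False: e.g. P(hot, sun) = 0.4 but P(hot) * P(sun) = 0.5 * 0.6 = 0.3

# N fair, independent coin flips (the other example slide): here the joint IS the product.
coin_joint = {('H', 'H'): 0.25, ('H', 'T'): 0.25, ('T', 'H'): 0.25, ('T', 'T'): 0.25}
print(is_independent(coin_joint))  # True: each flip has P(H) = P(T) = 0.5 and the joint factors
```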