Berkeley COMPSCI 188 - Lecture 13: Probability

CS 188: Artificial Intelligence, Fall 2009
Lecture 13: Probability
10/8/2009
Dan Klein – UC Berkeley

Contents: Announcements / Today / Inference in Ghostbusters / Uncertainty / Random Variables / Probability Distributions / Joint Distributions / Probabilistic Models / Events / Marginal Distributions / Conditional Probabilities / Conditional Distributions / Normalization Trick / Probabilistic Inference / Inference by Enumeration / The Product Rule / The Chain Rule / Bayes' Rule / Inference with Bayes' Rule / Ghostbusters, Revisited / Independence / Example: Independence? / Example: Independence

Announcements

- Upcoming:
  - P3 due 10/12
  - W2 due 10/15
  - Midterm in the evening of 10/22
- Review sessions:
  - Probability review: Friday 12-2pm in 306 Soda
  - Midterm review: on the web page when confirmed

Today

- Probability:
  - Random variables
  - Joint and marginal distributions
  - Conditional distributions
  - Product rule, chain rule, Bayes' rule
  - Inference
  - Independence
- You'll need all of this A LOT for the next few weeks, so make sure you go over it now!

Inference in Ghostbusters

- A ghost is in the grid somewhere
- Sensor readings tell how close a square is to the ghost:
  - On the ghost: red
  - 1 or 2 away: orange
  - 3 or 4 away: yellow
  - 5+ away: green
- Sensors are noisy, but we know P(Color | Distance). For example:

    P(red | 3) = 0.05   P(orange | 3) = 0.15   P(yellow | 3) = 0.5   P(green | 3) = 0.3

  [Demo]

Uncertainty

- General situation:
  - Evidence: the agent knows certain things about the state of the world (e.g., sensor readings or symptoms)
  - Hidden variables: the agent needs to reason about other aspects (e.g., where an object is or what disease is present)
  - Model: the agent knows something about how the known variables relate to the unknown variables
- Probabilistic reasoning gives us a framework for managing our beliefs and knowledge

Random Variables

- A random variable is some aspect of the world about which we (may) have uncertainty:
  - R = Is it raining?
  - D = How long will it take to drive to work?
  - L = Where am I?
- We denote random variables with capital letters
- Like variables in a CSP, random variables have domains:
  - R in {true, false} (sometimes written as {+r, ¬r})
  - D in [0, ∞)
  - L in possible locations, maybe {(0,0), (0,1), ...}

Probability Distributions

- Unobserved random variables have distributions
- A distribution is a TABLE of probabilities of values; a probability (a lower-case value) is a single number

    P(T):           P(W):
    T     P         W       P
    warm  0.5       sun     0.6
    cold  0.5       rain    0.1
                    fog     0.3
                    meteor  0.0

- Must have: P(x) ≥ 0 for every value x, and Σ_x P(x) = 1

Joint Distributions

- A joint distribution over a set of random variables X1, ..., Xn specifies a real number for each assignment (or outcome): P(X1 = x1, ..., Xn = xn), written P(x1, ..., xn) for short
- Size of the distribution if there are n variables with domain sizes d? d^n entries
- Must obey: P(x1, ..., xn) ≥ 0 and Σ_(x1, ..., xn) P(x1, ..., xn) = 1
- For all but the smallest distributions, impractical to write out

    P(T, W):
    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3

Probabilistic Models

- A probabilistic model is a joint distribution over a set of random variables
- Probabilistic models:
  - (Random) variables with domains; assignments are called outcomes
  - Joint distributions: say whether assignments (outcomes) are likely
  - Normalized: sum to 1.0
  - Ideally: only certain variables directly interact
- Compare constraint satisfaction problems:
  - Variables with domains
  - Constraints: state whether assignments are possible
  - Ideally: only certain variables directly interact

    Distribution over T, W:      Constraint over T, W:
    T     W     P                T     W     P
    hot   sun   0.4              hot   sun   T
    hot   rain  0.1              hot   rain  F
    cold  sun   0.2              cold  sun   F
    cold  rain  0.3              cold  rain  T

Events

- An event is a set E of outcomes; its probability is the sum over the outcomes in it: P(E) = Σ_(x1, ..., xn) in E P(x1, ..., xn)
- From a joint distribution, we can calculate the probability of any event:
  - Probability that it's hot AND sunny? (0.4)
  - Probability that it's hot? (0.4 + 0.1 = 0.5)
  - Probability that it's hot OR sunny? (0.4 + 0.1 + 0.2 = 0.7)
- Typically, the events we care about are partial assignments, like P(T = hot)

Marginal Distributions

- Marginal distributions are sub-tables which eliminate variables
- Marginalization (summing out): combine collapsed rows by adding, e.g. P(t) = Σ_w P(t, w)

    P(T, W):             P(T):          P(W):
    T     W     P        T     P        W     P
    hot   sun   0.4      hot   0.5      sun   0.6
    hot   rain  0.1      cold  0.5      rain  0.4
    cold  sun   0.2
    cold  rain  0.3
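To make the summing-out step concrete, here is a minimal Python sketch (an illustration added to these notes, not code from the lecture). It assumes one particular representation, a dict keyed by outcome tuples, and the function name `marginal` is ours:

```python
# Summing out on the joint P(T, W) from the slide above.
joint = {
    ('hot', 'sun'): 0.4,
    ('hot', 'rain'): 0.1,
    ('cold', 'sun'): 0.2,
    ('cold', 'rain'): 0.3,
}

def marginal(joint, keep):
    """Sum out every variable except the one at tuple index `keep`."""
    dist = {}
    for outcome, p in joint.items():
        value = outcome[keep]
        dist[value] = dist.get(value, 0.0) + p  # combine collapsed rows by adding
    return dist

print(marginal(joint, 0))  # P(T): hot 0.5, cold 0.5
print(marginal(joint, 1))  # P(W): sun 0.6, rain 0.4 (up to float rounding)
```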
Conditional Probabilities

- A simple relation between joint and conditional probabilities; in fact, this is taken as the definition of a conditional probability:

    P(a | b) = P(a, b) / P(b)

- For example, from the joint P(T, W) above: P(W = sun | T = cold) = P(sun, cold) / P(cold) = 0.2 / 0.5 = 0.4

Conditional Distributions

- Conditional distributions are probability distributions over some variables, given fixed values of others

    Joint distribution P(T, W):    Conditional distributions:
    T     W     P                  P(W | T = hot):    P(W | T = cold):
    hot   sun   0.4                sun   0.8          sun   0.4
    hot   rain  0.1                rain  0.2          rain  0.6
    cold  sun   0.2
    cold  rain  0.3

Normalization Trick

- A trick to get a whole conditional distribution at once:
  - Select the joint probabilities matching the evidence
  - Normalize the selection (make it sum to one)
- Why does this work? The sum of the selection is P(evidence)! (P(r), here)

    P(T, W):             Select W = rain:       Normalize → P(T | W = rain):
    T     W     P        T     W     P          T     P
    hot   sun   0.4      hot   rain  0.1        hot   0.25
    hot   rain  0.1      cold  rain  0.3        cold  0.75
    cold  sun   0.2
    cold  rain  0.3
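The same trick in code, continuing the dict-of-tuples sketch from above (again an added illustration; `condition` is our name, not the course's):

```python
# Select-and-normalize on the joint P(T, W) above, conditioning on W = rain.
joint = {
    ('hot', 'sun'): 0.4,
    ('hot', 'rain'): 0.1,
    ('cold', 'sun'): 0.2,
    ('cold', 'rain'): 0.3,
}

def condition(joint, index, value):
    """Condition on the variable at tuple `index` taking `value`: select, then normalize."""
    selected = {t: p for t, p in joint.items() if t[index] == value}
    z = sum(selected.values())  # the sum of the selection IS P(evidence)
    return {t: p / z for t, p in selected.items()}

print(condition(joint, 1, 'rain'))
# hot ≈ 0.25, cold ≈ 0.75, matching the slide
```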


Probabilistic Inference

- Probabilistic inference: compute a desired probability from other known probabilities (e.g., a conditional from the joint)
- We generally compute conditional probabilities:
  - P(on time | no reported accidents) = 0.90
  - These represent the agent's beliefs given the evidence
- Probabilities change with new evidence:
  - P(on time | no accidents, 5 a.m.) = 0.95
  - P(on time | no accidents, 5 a.m., raining) = 0.80
  - Observing new evidence causes beliefs to be updated

Inference by Enumeration

- Example queries against the joint P(S, T, W) below:
  - P(sun)?
  - P(sun | winter)?
  - P(sun | winter, hot)?

    S       T     W     P
    summer  hot   sun   0.30
    summer  hot   rain  0.05
    summer  cold  sun   0.10
    summer  cold  rain  0.05
    winter  hot   sun   0.10
    winter  hot   rain  0.05
    winter  cold  sun   0.15
    winter  cold  rain  0.20

- General case:
  - Evidence variables: E1, ..., Ek = e1, ..., ek
  - Query variable: Q (this works fine with multiple query variables, too)
  - Hidden variables: H1, ..., Hr
  - Together, these are all of the variables X1, ..., Xn
- We want P(Q | e1, ..., ek):
  - First, select the entries consistent with the evidence
  - Second, sum out H to get the joint of the query and the evidence: P(Q, e1, ..., ek) = Σ_(h1, ..., hr) P(Q, h1, ..., hr, e1, ..., ek)
  - Finally, normalize the remaining entries to conditionalize: P(Q | e1, ..., ek) = P(Q, e1, ..., ek) / P(e1, ..., ek)
- Obvious problems:
  - Worst-case time complexity O(d^n)
  - Space complexity O(d^n) to store the joint distribution
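Putting the three steps together, here is a sketch of inference by enumeration over the table above (an added illustration; the representation and the name `enumerate_query` are our choices):

```python
# Inference by enumeration: select, sum out, normalize.
VARIABLES = ('S', 'T', 'W')
joint = {
    ('summer', 'hot',  'sun'): 0.30, ('summer', 'hot',  'rain'): 0.05,
    ('summer', 'cold', 'sun'): 0.10, ('summer', 'cold', 'rain'): 0.05,
    ('winter', 'hot',  'sun'): 0.10, ('winter', 'hot',  'rain'): 0.05,
    ('winter', 'cold', 'sun'): 0.15, ('winter', 'cold', 'rain'): 0.20,
}

def enumerate_query(query, evidence, joint, variables=VARIABLES):
    """Compute P(query | evidence) from a full joint distribution."""
    qi = variables.index(query)
    dist = {}
    for outcome, p in joint.items():
        # First, select the entries consistent with the evidence.
        if all(outcome[variables.index(v)] == val for v, val in evidence.items()):
            # Second, sum out the hidden variables.
            dist[outcome[qi]] = dist.get(outcome[qi], 0.0) + p
    # Finally, normalize to conditionalize.
    z = sum(dist.values())
    return {value: p / z for value, p in dist.items()}

print(enumerate_query('W', {}, joint))                           # P(sun) = 0.65
print(enumerate_query('W', {'S': 'winter'}, joint))              # P(sun | winter) = 0.5
print(enumerate_query('W', {'S': 'winter', 'T': 'hot'}, joint))  # P(sun | winter, hot) ≈ 0.67
```

Note that both time and space here are O(d^n), the slide's "obvious problems": the whole joint table is stored and scanned.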

The Product Rule

- Sometimes we have conditional distributions but want the joint: P(x, y) = P(x | y) P(y)
- Example, over dampness D and weather W:

    P(W):           P(D | W):           P(D, W) = P(D | W) P(W):
    W     P         D    W     P        D    W     P
    sun   0.8       wet  sun   0.1      wet  sun   0.08
    rain  0.2       dry  sun   0.9      dry  sun   0.72
                    wet  rain  0.7      wet  rain  0.14
                    dry  rain  0.3      dry  rain  0.06

The Chain Rule

- More generally, we can always write any joint distribution as an incremental product of conditional distributions:

    P(x1, x2, ..., xn) = Π_i P(xi | x1, ..., xi-1)

  e.g., P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x1, x2)
- Why is this always true? Apply the product rule repeatedly: each factor peels one variable off the joint

Bayes' Rule

- Two ways to factor a joint distribution over two variables:

    P(x, y) = P(x | y) P(y) = P(y | x) P(x)

- Dividing, we get Bayes' rule:

    P(x | y) = P(y | x) P(x) / P(y)
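Finally, a sketch that chains the two rules on the weather tables above, building the joint with the product rule and then inverting the conditional with Bayes' rule (an added illustration; the variable names are ours):

```python
# Product rule, then Bayes' rule, on the P(W) and P(D | W) tables above.
p_w = {'sun': 0.8, 'rain': 0.2}                     # P(W)
p_d_given_w = {'sun':  {'wet': 0.1, 'dry': 0.9},    # P(D | W)
               'rain': {'wet': 0.7, 'dry': 0.3}}

# Product rule: P(d, w) = P(d | w) P(w)
joint = {(d, w): p_d_given_w[w][d] * p_w[w]
         for w in p_w for d in ('wet', 'dry')}
print(joint)  # wet/sun 0.08, dry/sun 0.72, wet/rain 0.14, dry/rain 0.06

# Bayes' rule: P(w | wet) = P(wet | w) P(w) / P(wet)
unnormalized = {w: p_d_given_w[w]['wet'] * p_w[w] for w in p_w}
z = sum(unnormalized.values())                      # P(wet) = 0.22
posterior = {w: p / z for w, p in unnormalized.items()}
print(posterior)  # P(sun | wet) ≈ 0.36, P(rain | wet) ≈ 0.64
```

The Ghostbusters update works the same way: a prior over ghost positions times the sensor model P(color | distance), then normalize.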
