Berkeley COMPSCI 188 - Lecture 13 Bayes’ Nets

CS 188: Artificial Intelligence, Fall 2011
Lecture 13: Bayes' Nets
10/6/2011
Dan Klein – UC Berkeley

Probabilistic Models

- Models describe how (a portion of) the world works.
- Models are always simplifications:
  - May not account for every variable
  - May not account for all interactions between variables
  - "All models are wrong; but some are useful." – George E. P. Box
- What do we do with probabilistic models? We (or our agents) need to reason about unknown variables, given evidence.
  - Example: explanation (diagnostic reasoning)
  - Example: prediction (causal reasoning)
  - Example: value of information

Model for Ghostbusters

Reminder: the ghost is hidden, and the sensors are noisy.

- T: Top sensor is red
- B: Bottom sensor is red
- G: Ghost is in the top

Joint distribution:

  T    B    G    P(T,B,G)
  +t   +b   +g   0.16
  +t   +b   ¬g   0.16
  +t   ¬b   +g   0.24
  +t   ¬b   ¬g   0.04
  ¬t   +b   +g   0.04
  ¬t   +b   ¬g   0.24
  ¬t   ¬b   +g   0.06
  ¬t   ¬b   ¬g   0.06

Queries:
  P(+g) = ??
  P(+g | +t) = ??
  P(+g | +t, ¬b) = ??

Problem: the full joint distribution is too large / complex.

Independence

- Two variables are independent if, for all x, y:
  P(x, y) = P(x) P(y)
  This says that their joint distribution factors into a product of two simpler distributions.
- Another form: P(x | y) = P(x)
- We write: X ⊥ Y
- Independence is a simplifying modeling assumption:
  - Empirical joint distributions are at best "close" to independent.
  - What could we assume for {Weather, Traffic, Cavity, Toothache}?

Example: Independence

N fair, independent coin flips X_1, ..., X_n, each with the same distribution:

  h  0.5
  t  0.5

Example: Independence?

Are T (temperature) and W (weather) independent? Compare the joint distribution with the product of its marginals:

  Joint P(T, W):             Product P(T) P(W):
  T     W     P              T     W     P
  warm  sun   0.4            warm  sun   0.3
  warm  rain  0.1            warm  rain  0.2
  cold  sun   0.2            cold  sun   0.3
  cold  rain  0.3            cold  rain  0.2

  Marginals:
  T     P        W     P
  warm  0.5      sun   0.6
  cold  0.5      rain  0.4

Since the joint does not equal the product of the marginals (e.g., 0.4 ≠ 0.3 for warm/sun), T and W are not independent.

Conditional Independence

- P(Toothache, Cavity, Catch)
- If I have a cavity, the probability that the probe catches in it doesn't depend on whether I have a toothache:
  P(+catch | +toothache, +cavity) = P(+catch | +cavity)
- The same independence holds if I don't have a cavity:
  P(+catch | +toothache, ¬cavity) = P(+catch | ¬cavity)
- Catch is conditionally independent of Toothache given Cavity:
  P(Catch | Toothache, Cavity) = P(Catch | Cavity)
- Equivalent statements:
  P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
  P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
  Each one can be derived from the others easily.

Conditional Independence (continued)

- Unconditional (absolute) independence is very rare. (Why?)
- Conditional independence is our most basic and robust form of knowledge about uncertain environments.
- What about this domain: Traffic, Umbrella, Raining?
- What about: Fire, Smoke, Alarm?

The Chain Rule

- Trivial decomposition (always valid):
  P(x, y, z) = P(x) P(y | x) P(z | x, y)
- With an assumption of conditional independence (here, Z independent of X given Y):
  P(x, y, z) = P(x) P(y | x) P(z | y)
- Bayes' nets / graphical models help us express conditional independence assumptions.

Ghostbusters Chain Rule

- Each sensor depends only on where the ghost is.
- That means the two sensors are conditionally independent, given the ghost position.
- T: Top square is red; B: Bottom square is red; G: Ghost is in the top
- Givens:
  P(+g) = 0.5
  P(+t | +g) = 0.8
  P(+t | ¬g) = 0.4
  P(+b | +g) = 0.4
  P(+b | ¬g) = 0.8
- Chain rule with conditional independence:
  P(T, B, G) = P(G) P(T | G) P(B | G)
  This factorization reproduces the joint table above, entry by entry (e.g., P(+t, +b, +g) = 0.5 × 0.8 × 0.4 = 0.16); see the sketch below.
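To make this concrete, here is a minimal Python sketch (my own illustration, not from the lecture; names such as query_g are invented) that builds the joint from the three given distributions and then answers the queries posed on the Model for Ghostbusters slide by summing and normalizing joint entries:

```python
import itertools

# Givens from the Ghostbusters Chain Rule slide (True = +, False = ¬)
P_G = {True: 0.5, False: 0.5}            # P(G)
P_T_pos = {True: 0.8, False: 0.4}        # P(+t | G = g)
P_B_pos = {True: 0.4, False: 0.8}        # P(+b | G = g)

def p_t(t, g):
    """P(T = t | G = g)."""
    return P_T_pos[g] if t else 1 - P_T_pos[g]

def p_b(b, g):
    """P(B = b | G = g)."""
    return P_B_pos[g] if b else 1 - P_B_pos[g]

# Chain rule with conditional independence: P(T, B, G) = P(G) P(T|G) P(B|G)
joint = {(t, b, g): P_G[g] * p_t(t, g) * p_b(b, g)
         for t, b, g in itertools.product([True, False], repeat=3)}
assert abs(joint[(True, True, True)] - 0.16) < 1e-9  # matches the table

def query_g(evidence):
    """P(+g | evidence), where evidence maps 'T'/'B' to True/False."""
    def ok(t, b):
        return all(val == {'T': t, 'B': b}[var] for var, val in evidence.items())
    num = sum(p for (t, b, g), p in joint.items() if ok(t, b) and g)
    den = sum(p for (t, b, g), p in joint.items() if ok(t, b))
    return num / den

print(query_g({}))                                   # P(+g)         = 0.5
print(query_g({'T': True}))                          # P(+g | +t)    ≈ 0.667
print(query_g({'T': True, 'B': False}))              # P(+g | +t,¬b) ≈ 0.857
```

Note how the evidence +t raises the probability of the ghost being in the top, and the contradictory evidence ¬b raises it further, exactly the kind of reasoning the full joint supports but at a cost that grows exponentially in the number of variables.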
Bayes' Nets: Big Picture

- Two problems with using full joint distribution tables as our probabilistic models:
  - Unless there are only a few variables, the joint is WAY too big to represent explicitly.
  - It is hard to learn (estimate) anything empirically about more than a few variables at a time.
- Bayes' nets: a technique for describing complex joint distributions (models) using simple, local distributions (conditional probabilities):
  - More properly called graphical models
  - We describe how variables locally interact.
  - Local interactions chain together to give global, indirect interactions.
  - For about 10 minutes, we'll be vague about how these interactions are specified.

Example Bayes' Net: Insurance
(network diagram in the original slides)

Example Bayes' Net: Car
(network diagram in the original slides)

Graphical Model Notation

- Nodes: variables (with domains)
  - Can be assigned (observed) or unassigned (unobserved)
- Arcs: interactions
  - Similar to CSP constraints
  - Indicate "direct influence" between variables
  - Formally: encode conditional independence (more later)
  - For now: imagine that arrows mean direct causation (in general, they don't!)

Example: Coin Flips

- Nodes X_1, X_2, ..., X_n for N independent coin flips, with no arcs.
- No interactions between variables: absolute independence.

Example: Traffic

- Variables:
  - R: It rains
  - T: There is traffic
- Model 1: independence (no arc between R and T)
- Model 2: rain causes traffic (arc R → T)
- Why is an agent using Model 2 better?

Example: Traffic II

- Let's build a causal graphical model. Variables:
  - T: Traffic
  - R: It rains
  - L: Low pressure
  - D: Roof drips
  - B: Ballgame
  - C: Cavity

Example: Alarm Network

- Variables:
  - B: Burglary
  - A: Alarm goes off
  - M: Mary calls
  - J: John calls
  - E: Earthquake!

Bayes' Net Semantics

- Let's formalize the semantics of a Bayes' net:
  - A set of nodes, one per variable X
  - A directed, acyclic graph
  - A conditional distribution for each node: a collection of distributions over X, one for each combination of its parents' values, P(X | A_1, ..., A_n)
    - CPT: conditional probability table
    - Description of a noisy "causal" process
- A Bayes net = Topology (graph) + Local Conditional Probabilities

Probabilities in BNs

- Bayes' nets implicitly encode joint distributions as a product of local conditional distributions.
- To see what probability a BN gives to a full assignment, multiply all the relevant conditionals together:
  P(x_1, x_2, ..., x_n) = Π_i P(x_i | parents(X_i))
  (A worked alarm-network example appears after this section.)
- This lets us reconstruct any entry of the full joint.
- Not every BN can represent every joint distribution: the topology enforces certain conditional independencies.

Example: Coin Flips

X_1, X_2, ..., X_n, no arcs, each with the table:

  h  0.5
  t  0.5

Only distributions whose variables are absolutely independent can be represented by a Bayes' net with no arcs.

Example: Traffic

R → T, with local tables:

  P(R):           P(T | R):
  +r  1/4         +r: +t 3/4, ¬t 1/4
  ¬r  3/4         ¬r: +t 1/2, ¬t 1/2

Example: Alarm Network

Topology: Burglary → Alarm ← Earthquake, Alarm → John calls, Alarm → Mary calls.

  P(B):              P(E):
  +b  0.001          +e  0.002
  ¬b  0.999          ¬e  0.998

  P(A | B, E):
  B    E    A    P(A | B, E)
  +b   +e   +a   0.95
  +b   +e   ¬a   0.05
  +b   ¬e   +a   0.94
  +b   ¬e   ¬a   0.06
  ¬b   +e   +a   0.29
  ¬b   +e   ¬a   0.71
  ¬b   ¬e   +a   0.001
  ¬b   ¬e   ¬a   0.999

  P(J | A):              P(M | A):
  +a  +j  0.9            +a  +m  0.7
  +a  ¬j  0.1            +a  ¬m  0.3
  ¬a  +j  0.05           ¬a  +m  0.01
  ¬a  ¬j  0.95           ¬a  ¬m  0.99

Example: Traffic (causal direction)

With R → T and the local tables above, the product rule yields the full joint (verified in the sketch below):

  P(R):        P(T | R):               Joint P(R, T):
  +r  1/4      +r: +t 3/4, ¬t 1/4      +r +t  3/16     ¬r +t  6/16
  ¬r  3/4      ¬r: +t 1/2, ¬t 1/2      +r ¬t  1/16     ¬r ¬t  6/16

Example: Reverse Traffic

(The preview ends here; the remaining pages are not included.)
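As a quick check of the causal-direction slide, this small sketch (again my own, using exact fractions rather than floats) rebuilds the joint P(R, T) from the two local tables:

```python
from fractions import Fraction

# Local tables from the Traffic slide above: R -> T
P_R = {'+r': Fraction(1, 4), '-r': Fraction(3, 4)}        # P(R)
P_T_given_R = {('+r', '+t'): Fraction(3, 4),              # P(T | R)
               ('+r', '-t'): Fraction(1, 4),
               ('-r', '+t'): Fraction(1, 2),
               ('-r', '-t'): Fraction(1, 2)}

# BN product rule on a two-node net: P(r, t) = P(r) P(t | r)
joint = {(r, t): P_R[r] * P_T_given_R[(r, t)]
         for r in ('+r', '-r') for t in ('+t', '-t')}

for (r, t), p in joint.items():
    print(r, t, p)
# +r +t 3/16, +r -t 1/16, -r +t 3/8 (= 6/16), -r -t 3/8 (= 6/16),
# matching the joint table on the slide.
```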

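Finally, the "Probabilities in BNs" product rule applied to the alarm network above. This sketch (my own; p_joint is an invented name, and the numbers are the CPT entries from the slide) computes one full-assignment probability and checks that the implied joint is a proper distribution:

```python
import itertools

# CPT entries from the alarm network slide (True = +, False = ¬);
# each table stores only the probability of the positive outcome.
P_B = {True: 0.001, False: 0.999}                       # P(B)
P_E = {True: 0.002, False: 0.998}                       # P(E)
P_A_pos = {(True, True): 0.95, (True, False): 0.94,     # P(+a | B, E)
           (False, True): 0.29, (False, False): 0.001}
P_J_pos = {True: 0.9, False: 0.05}                      # P(+j | A)
P_M_pos = {True: 0.7, False: 0.01}                      # P(+m | A)

def p_joint(b, e, a, j, m):
    """P(b, e, a, j, m) = P(b) P(e) P(a | b, e) P(j | a) P(m | a)."""
    pa = P_A_pos[(b, e)] if a else 1 - P_A_pos[(b, e)]
    pj = P_J_pos[a] if j else 1 - P_J_pos[a]
    pm = P_M_pos[a] if m else 1 - P_M_pos[a]
    return P_B[b] * P_E[e] * pa * pj * pm

# One entry of the full joint: P(+b, ¬e, +a, +j, +m)
print(p_joint(True, False, True, True, True))
# = 0.001 * 0.998 * 0.94 * 0.9 * 0.7 ≈ 5.91e-4

# Sanity check: the 32 entries of the implied joint sum to 1.
total = sum(p_joint(*v) for v in itertools.product([True, False], repeat=5))
assert abs(total - 1.0) < 1e-9
```

Five CPTs with 1 + 1 + 4 + 2 + 2 = 10 free numbers stand in for a full joint table with 2^5 - 1 = 31 free entries, which is exactly the compactness argument from the Big Picture slide.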
