22c:145 Artificial Intelligence
Bayesian Networks
Reading: Ch. 14, Russell & Norvig

Review of Probability Theory
• Random variables
  - The probability that a random variable X has value val is written as P(X = val).
  - P: domain → [0, 1], and the probabilities sum to 1 over the domain:
      P(Raining = true)  = P(Raining)  = 0.2
      P(Raining = false) = P(¬Raining) = 0.8
• Joint distribution
  - P(X1, X2, ..., Xn) assigns a probability to every combination of values of the random variables, and so provides complete information about them.
  - A JPD table for n random variables, each ranging over k distinct values, has k^n entries!

                Toothache   ¬Toothache
    Cavity         0.04        0.06
    ¬Cavity        0.01        0.89

• Conditioning
  - P(A) = P(A | B) P(B) + P(A | ¬B) P(¬B)
         = P(A ∧ B) + P(A ∧ ¬B)
• A and B are independent iff
  - P(A ∧ B) = P(A) · P(B)
  - P(A | B) = P(A)
  - P(B | A) = P(B)
• A and B are conditionally independent given C iff
  - P(A | B, C) = P(A | C)
  - P(B | A, C) = P(B | C)
  - P(A ∧ B | C) = P(A | C) · P(B | C)
• Bayes' Rule
  - P(A | B) = P(B | A) P(A) / P(B)
  - P(A | B, C) = P(B | A, C) P(A | C) / P(B | C)

Bayesian Networks
• To do probabilistic reasoning, you need to know the joint probability distribution.
• But in a domain with N propositional variables, one needs 2^N numbers to specify the joint probability distribution.
• We want to exploit independences in the domain.
• Two components: structure and numerical parameters.

Bayesian (Belief) Networks
• A set of random variables, each with a finite set of values.
• A set of directed arcs between them forming an acyclic graph, representing causal relations.
• Every node A, with parents B1, ..., Bn, has P(A | B1, ..., Bn) specified.

Key Advantage
• The conditional independencies (the missing arrows) mean that we can store and compute the joint probability distribution more efficiently.

How to design a Belief Network?
• Explore the causal relations.

Icy Roads ("Causal" Component)
Network: Icy → Holmes Crash, Icy → Watson Crash

Inspector Smith is waiting for Holmes and Watson, who are driving (separately) to meet him.
It is winter. His secretary tells him that Watson has had an accident. He says, “It must be that the roads are icy. I bet that Holmes will have an accident too. I should go to lunch.” But his secretary says, “No, the roads are not icy; look at the window.” So he says, “I guess I had better wait for Holmes.”

H and W are dependent, but conditionally independent given I.

Holmes and Watson in IA
Network: Rain → Holmes Lawn Wet, Rain → Watson Lawn Wet, Sprinkler → Holmes Lawn Wet

Holmes and Watson have moved to IA. Holmes wakes up to find his lawn wet. He wonders if it has rained or if he left his sprinkler on. He looks at his neighbor Watson’s lawn and sees it is wet too.
So, he concludes it must have rained.

Given W, P(R) goes up and P(S) goes down: “explaining away.”

Inference in Bayesian Networks: Query Types
Given a Bayesian network, what questions might we want to ask?
• Conditional probability query: P(x | e)
• Maximum a posteriori (MAP) query
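The identities from the probability-theory review can be checked numerically against the toothache/cavity JPD table. A minimal sketch in Python (variable names are illustrative; the table entries are the ones given above):

```python
# Joint distribution P(Cavity, Toothache), keyed by (cavity, toothache).
joint = {
    (True,  True):  0.04,   # Cavity  ∧ Toothache
    (True,  False): 0.06,   # Cavity  ∧ ¬Toothache
    (False, True):  0.01,   # ¬Cavity ∧ Toothache
    (False, False): 0.89,   # ¬Cavity ∧ ¬Toothache
}

# Conditioning rule: P(A) = P(A ∧ B) + P(A ∧ ¬B)
p_cavity = joint[(True, True)] + joint[(True, False)]      # 0.10
p_toothache = joint[(True, True)] + joint[(False, True)]   # 0.05

# Conditional probability straight from the joint:
p_cav_given_tooth = joint[(True, True)] / p_toothache      # 0.8

# Bayes' rule gives the same answer:
p_tooth_given_cav = joint[(True, True)] / p_cavity
bayes = p_tooth_given_cav * p_cavity / p_toothache
assert abs(bayes - p_cav_given_tooth) < 1e-12
```

Note that the full joint lets us answer any query over these two variables, which is exactly why the k^n size of the JPD table matters.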

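The Icy Roads network can be worked through numerically to see both the storage savings and the conditional independence claim. A sketch under assumed CPT values (the numbers below are illustrative, not from the slides):

```python
from itertools import product

# Icy Roads network: Icy -> Holmes Crash, Icy -> Watson Crash.
# CPT numbers are illustrative assumptions.
p_icy = 0.7
p_crash_given = {True: 0.8, False: 0.1}   # P(crash | Icy) for either driver

# The network stores 1 + 2 + 2 = 5 numbers instead of the 2^3 - 1 = 7
# needed for the full joint: P(I, H, W) = P(I) P(H | I) P(W | I).
def joint(i, h, w):
    pi = p_icy if i else 1 - p_icy
    ph = p_crash_given[i] if h else 1 - p_crash_given[i]
    pw = p_crash_given[i] if w else 1 - p_crash_given[i]
    return pi * ph * pw

def prob(**evidence):
    """Sum the joint over all worlds consistent with the evidence."""
    return sum(joint(i, h, w)
               for i, h, w in product([True, False], repeat=3)
               if all({'i': i, 'h': h, 'w': w}[k] == v
                      for k, v in evidence.items()))

p_h = prob(h=True)                                      # prior P(H) = 0.59
p_h_given_w = prob(h=True, w=True) / prob(w=True)       # higher: H, W dependent
p_h_given_w_i = prob(h=True, w=True, i=True) / prob(w=True, i=True)
# Given I, learning W changes nothing: P(H | W, I) = P(H | I) = 0.8
```

Hearing about Watson's crash raises P(H), but once Icy is observed directly (the secretary looks out the window), Watson's crash carries no further information about Holmes.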

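The "explaining away" effect is itself a pair of conditional probability queries P(x | e) against the wet-lawn network, answerable by brute-force enumeration of the joint. A sketch with illustrative CPT values (all numbers and helper names below are assumptions, not from the slides):

```python
from itertools import product

# Wet-lawn network: Rain -> Watson's lawn; Rain, Sprinkler -> Holmes's lawn.
# All CPT numbers are illustrative assumptions.
P_R, P_S = 0.2, 0.1                               # priors on Rain, Sprinkler
P_W = {True: 0.9, False: 0.2}                     # P(Watson wet | Rain)
P_H = {(True, True): 1.0, (True, False): 0.9,     # P(Holmes wet | Rain, Sprinkler)
       (False, True): 0.9, (False, False): 0.1}

def joint(r, s, h, w):
    p = (P_R if r else 1 - P_R) * (P_S if s else 1 - P_S)
    p *= P_H[(r, s)] if h else 1 - P_H[(r, s)]
    p *= P_W[r] if w else 1 - P_W[r]
    return p

def query(var, **evidence):
    """P(var = true | evidence), by enumerating the joint."""
    num = den = 0.0
    for r, s, h, w in product([True, False], repeat=4):
        world = {'r': r, 's': s, 'h': h, 'w': w}
        if all(world[k] == v for k, v in evidence.items()):
            den += joint(r, s, h, w)
            if world[var]:
                num += joint(r, s, h, w)
    return num / den

# Holmes's wet lawn raises both explanations; Watson's wet lawn then
# raises Rain further and pushes Sprinkler back down.
p_r_h  = query('r', h=True)
p_r_hw = query('r', h=True, w=True)   # > p_r_h
p_s_h  = query('s', h=True)
p_s_hw = query('s', h=True, w=True)   # < p_s_h
```

Enumeration like this is exponential in the number of variables; the point of network structure is that smarter inference algorithms can exploit it.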