Pitt CS 2710 - Inference in Bayesian belief networks

CS 2710 Foundations of AI
Lecture 17: Inference in Bayesian belief networks
Milos Hauskrecht
[email protected]
5329 Sennott Square

Modeling uncertainty with probabilities
• Knowledge-based system era (70s – early 80s)
  – Extensional, non-probabilistic models were adopted to get around the space, time, and acquisition bottlenecks of probability-based models
  – These bottlenecks froze the development and advancement of probabilistic KB systems and contributed to the general slow-down of AI in the 80s
• Breakthrough (late 80s, beginning of 90s): the Bayesian belief network
  – Gives solutions to the space and acquisition bottlenecks
  – Gives partial solutions for the time complexity

Bayesian belief networks (BBNs)
• Represent the full joint distribution over the variables more compactly, with a smaller number of parameters.
• Take advantage of conditional and marginal independences among random variables:
  – A and B are independent:
      P(A, B) = P(A) P(B)
  – A and B are conditionally independent given C:
      P(A, B | C) = P(A | C) P(B | C)
      P(A | C, B) = P(A | C)

Bayesian belief networks (general)
A BBN B = (S, Θ_S) has two components:
• A directed acyclic graph S
  – Nodes correspond to random variables (in the alarm example: B, E, A, J, M)
  – (Missing) links encode independences
• Parameters Θ_S
  – A local conditional probability distribution P(X_i | pa(X_i)) for every variable-parent configuration, where pa(X_i) stands for the parents of X_i
  – Example, P(A | B, E):

      B  E | A=T    A=F
      T  T | 0.95   0.05
      T  F | 0.94   0.06
      F  T | 0.29   0.71
      F  F | 0.001  0.999

Bayesian belief network: the alarm example
Graph: Burglary → Alarm ← Earthquake, Alarm → JohnCalls, Alarm → MaryCalls

  P(B):  T = 0.001, F = 0.999
  P(E):  T = 0.002, F = 0.998

  P(A | B, E):
      B  E | A=T    A=F
      T  T | 0.95   0.05
      T  F | 0.94   0.06
      F  T | 0.29   0.71
      F  F | 0.001  0.999

  P(J | A):
      A | J=T    J=F
      T | 0.90   0.10
      F | 0.05   0.95

  P(M | A):
      A | M=T    M=F
      T | 0.70   0.30
      F | 0.01   0.99

Full joint distribution in BBNs
The full joint distribution is defined in terms of the local conditional distributions (obtained via the chain rule):

  P(X_1, X_2, ..., X_n) = ∏_{i=1..n} P(X_i | pa(X_i))

Example: assume the following assignment of values to the random variables:
  B = T, E = T, A = T, J = T, M = F
Then its probability is:
  P(B=T, E=T, A=T, J=T, M=F)
      = P(B=T) P(E=T) P(A=T | B=T, E=T) P(J=T | A=T) P(M=F | A=T)
(A runnable sketch of this computation appears after the list of independence structures below.)

Bayesian belief networks (BBNs)
• Represent the full joint distribution over the variables more compactly, using the product of local conditionals.
• But how did we get to the local parameterizations?
Answer:
• The graphical structure encodes conditional and marginal independences among random variables:
  – A and B are independent: P(A, B) = P(A) P(B)
  – A and B are conditionally independent given C: P(A, B | C) = P(A | C) P(B | C), or equivalently P(A | C, B) = P(A | C)
• The graph structure implies the decomposition!

Independences in BBNs
3 basic independence structures:
1. Serial connection: Burglary → Alarm → JohnCalls
2. Common effect: Burglary → Alarm ← Earthquake
3. Common cause: JohnCalls ← Alarm → MaryCalls
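Before turning to d-separation, here is a minimal Python sketch of the factorization from the "Full joint distribution" slide above. The dictionary encoding and helper names are my own illustration, not from the lecture; the CPT numbers are the alarm network's.

```python
# Minimal sketch: the alarm network's local CPTs and the BBN factorization
#   P(B, E, A, J, M) = P(B) P(E) P(A|B,E) P(J|A) P(M|A)
# (encoding is illustrative, not from the lecture)

P_B_true = 0.001                       # P(Burglary = T)
P_E_true = 0.002                       # P(Earthquake = T)
P_A_true = {(True, True): 0.95,        # P(Alarm = T | B, E)
            (True, False): 0.94,
            (False, True): 0.29,
            (False, False): 0.001}
P_J_true = {True: 0.90, False: 0.05}   # P(JohnCalls = T | A)
P_M_true = {True: 0.70, False: 0.01}   # P(MaryCalls = T | A)

def prob(p_true, value):
    """P(var = value), given P(var = True) = p_true."""
    return p_true if value else 1.0 - p_true

def joint(b, e, a, j, m):
    """Full joint probability as the product of the local conditionals."""
    return (prob(P_B_true, b) * prob(P_E_true, e)
            * prob(P_A_true[(b, e)], a)
            * prob(P_J_true[a], j) * prob(P_M_true[a], m))

# The lecture's example assignment: B=T, E=T, A=T, J=T, M=F
print(joint(True, True, True, True, False))
# 0.001 * 0.002 * 0.95 * 0.90 * 0.30 ≈ 5.13e-07
```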
Independences in BBNs: d-separation
• A BBN distribution models many conditional independence relations among distant variables and sets of variables
• These are defined in terms of a graphical criterion called d-separation
• D-separation and independence:
  – Let X, Y, and Z be three sets of nodes
  – If X and Y are d-separated by Z, then X and Y are conditionally independent given Z
• D-separation: A is d-separated from B given C if every undirected path between them is blocked by C
• Path blocking: 3 cases that expand on the three basic independence structures above

Independences in BBNs: examples
Graph: the alarm network extended with Earthquake → RadioReport
• Earthquake and Burglary are independent given MaryCalls: False
• Burglary and MaryCalls are independent (not knowing Alarm): False
• Burglary and RadioReport are independent given Earthquake: True
• Burglary and RadioReport are independent given MaryCalls: False

Bayesian belief networks (BBNs)
• Represent the full joint distribution over the variables more compactly, using the product of local conditionals:
    P(X_1, X_2, ..., X_n) = ∏_{i=1..n} P(X_i | pa(X_i))
• So how did we get to the local parameterizations?
• The decomposition is implied by the set of independences encoded in the belief network.

Full joint distribution in BBNs
Rewrite the full joint probability using the product rule, dropping at each step the variables that the graph makes conditionally independent:
  P(B=T, E=T, A=T, J=T, M=F)
    = P(J=T | B=T, E=T, A=T, M=F) P(B=T, E=T, A=T, M=F)
    = P(J=T | A=T) P(B=T, E=T, A=T, M=F)
    = P(J=T | A=T) P(M=F | B=T, E=T, A=T) P(B=T, E=T, A=T)
    = P(J=T | A=T) P(M=F | A=T) P(B=T, E=T, A=T)
    = P(J=T | A=T) P(M=F | A=T) P(A=T | B=T, E=T) P(B=T, E=T)
    = P(J=T | A=T) P(M=F | A=T) P(A=T | B=T, E=T) P(B=T) P(E=T)

Bayesian belief network: the alarm example (revisited)
• In the BBN the full joint distribution is expressed using the set of local conditional distributions given above: P(B), P(E), P(A | B, E), P(J | A), and P(M | A)

Parameter complexity problem
• In the BBN the full joint distribution is defined as:
    P(X_1, X_2, ..., X_n) = ∏_{i=1..n} P(X_i | pa(X_i))
• What did we save? Alarm example: 5 binary (True, False) variables
• Number of parameters of the full joint:
    2^5 = 32, or 2^5 − 1 = 31 since one parameter comes for free (the entries must sum to 1)
• Number of parameters of the BBN:
    2^3 + 2(2^2) + 2(2) = 20, or 2^2 + 2(2) + 2(1) = 10 since one parameter in every conditional is for free
  (a counting sketch follows at the end of this section)

Model acquisition problem
The structure of the BBN:
• typically reflects causal relations (BBNs are also sometimes referred to as causal networks)
• causal structure is intuitive in many application domains, so it is relatively easy for a domain expert to define
The probability parameters of the BBN:
• are conditional distributions relating random variables to their parents
• have much smaller complexity than the full joint
• are much easier to obtain from an expert, or to learn automatically from data

BBNs built in practice
In various areas:
• Intelligent user interfaces (Microsoft)
• Troubleshooting, diagnosis of a technical device
• Medical diagnosis: Pathfinder (Intellipath), CPSC, Munin, QMR-DT
• Collaborative filtering
• Military applications
• Business and finance: insurance, credit applications

Inference
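As a sanity check on the parameter counts above, here is a small Python sketch (my own, not from the slides) that tallies CPT entries for the alarm network, assuming all variables are binary:

```python
# Counting sketch (assumption: my own code, not from the slides).
# A CPT for a binary variable with k binary parents has 2^k rows of
# 2 entries; one entry per row is free because each row sums to 1.

num_parents = {"B": 0, "E": 0, "A": 2, "J": 1, "M": 1}

n = len(num_parents)
full_joint = 2 ** n            # one parameter per full assignment
full_joint_free = 2 ** n - 1   # all entries sum to 1

bbn = sum(2 * 2 ** k for k in num_parents.values())
bbn_free = sum(2 ** k for k in num_parents.values())

print(full_joint, full_joint_free, bbn, bbn_free)  # 32 31 20 10
```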

