Pitt CS 2710 - Inference in Bayesian belief networks



CS 2710 Foundations of AI
Lecture 18: Inference in Bayesian belief networks
Milos Hauskrecht, [email protected], Sennott Square

Tic-Tac-Toe competition
• Three winners:
– Bryan Mills
– Yaw Gyamfi
– Lei Jin

Inference in Bayesian networks
• A BBN compactly models the full joint distribution by taking advantage of the existing independences between variables
• This simplifies the acquisition of a probabilistic model
• But we are interested in solving various inference tasks:
– Diagnostic task (from effect to cause), e.g. P(Burglary | JohnCalls = T)
– Prediction task (from cause to effect), e.g. P(JohnCalls | Burglary = T)
– Other probabilistic queries (queries on joint distributions), e.g. P(Alarm)
• Main issue: can we take advantage of the independences to construct special algorithms and speed up inference?

• Bad news:
– The exact inference problem in BBNs is NP-hard (Cooper)
– Approximate inference is NP-hard as well (Dagum, Luby)
• But very often we can achieve significant improvements
• Assume the Alarm network (Burglary, Earthquake, Alarm, JohnCalls, MaryCalls) and assume we want to compute P(J = T)

Approach 1. Blind approach
• Sum out all un-instantiated variables from the full joint
• Express the joint distribution as a product of conditionals:

P(J = T) = Σ_m Σ_a Σ_e Σ_b P(B = b, E = e, A = a, J = T, M = m)
         = Σ_m Σ_a Σ_e Σ_b P(J = T | A = a) P(M = m | A = a) P(A = a | B = b, E = e) P(B = b) P(E = e)

where each sum ranges over {T, F}.

Computational cost:
• Number of additions: 15
• Number of products: 16 * 4 = 64

Approach 2. Interleave sums and products
• Combine sums and products in a smart way (multiplications by constants can be taken out of the sum):

P(J = T) = Σ_a P(J = T | A = a) [Σ_m P(M = m | A = a)] [Σ_b P(B = b) [Σ_e P(A = a | B = b, E = e) P(E = e)]]

Computational cost:
• Number of additions: 1 + 2*[1 + 1 + 2*1] = 9
• Number of products: 2*[2 + 2*(1 + 2*1)] = 16

• The smart interleaving of sums and products can speed up the computation of joint probability queries
• What if we also want to compute P(B = T, J = T)?

P(B = T, J = T) = Σ_a P(J = T | A = a) [Σ_m P(M = m | A = a)] P(B = T) [Σ_e P(A = a | B = T, E = e) P(E = e)]

• There is a lot of shared computation with P(J = T)
– Smart caching of results can save time when there are more queries

When does caching of results become handy?
• Suppose we want to compute a diagnostic query:

P(B = T | J = T) = P(B = T, J = T) / P(J = T)
P(B | J = T) = P(B, J = T) / P(J = T) = α P(B, J = T)

• These are exactly the probabilities we have just computed!
• There are other queries for which caching and the ordering of sums and products can be shared to save computation
• General technique: recursive decomposition

General idea:

Σ_j Σ_a P(J = j | A = a) [Σ_m P(M = m | A = a)] [Σ_b P(B = b) [Σ_e P(A = a | B = b, E = e) P(E = e)]] = P(True) = 1

• Each bracketed sum defines a reusable factor: f_J(a), f_M(a), f_E(a, b), f_B(a)
• Recursive decomposition: the results are cached in the tree structure (A at the root, with J, M, B, E below it)

Variable elimination
• Recursive decomposition:
– Interleave sums and products before inference
• Variable elimination:
– Similar idea, but interleave sums and products one variable at a time during inference
– E.g. the query P(J = T) requires eliminating A, B, E, M, and this can be done in different orders
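The cost comparison between the two approaches can be made concrete with a short sketch. Assumptions: the CPT numbers below are illustrative only (this preview does not include the network's probability tables), and the helper names (`cond`, `p_j_blind`, `p_j_interleaved`, `p_b_and_j`) are made up for this example. Both functions compute the same P(J = T); Approach 2 merely reorders the arithmetic, and `p_b_and_j` shows how the nested sums over E and M recur unchanged in the joint query, which is what makes caching pay off.

```python
from itertools import product

# Illustrative CPTs for the Burglary/Earthquake alarm network
# (assumed numbers for demonstration; the slides do not include them).
P_B = {True: 0.001, False: 0.999}                    # P(B = T/F)
P_E = {True: 0.002, False: 0.998}                    # P(E = T/F)
P_A = {(True, True): 0.95, (True, False): 0.94,      # P(A = T | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                      # P(J = T | A)
P_M = {True: 0.70, False: 0.01}                      # P(M = T | A)

def cond(table, value, *parents):
    """P(X = value | parents), where table stores P(X = T | parents)."""
    p = table[parents if len(parents) > 1 else parents[0]]
    return p if value else 1 - p

def joint(b, e, a, j, m):
    """Full joint expressed as the product of the network's conditionals."""
    return (P_B[b] * P_E[e] * cond(P_A, a, b, e)
            * cond(P_J, j, a) * cond(P_M, m, a))

def p_j_blind():
    """Approach 1: sum the full joint over every un-instantiated variable."""
    return sum(joint(b, e, a, True, m)
               for b, e, a, m in product([True, False], repeat=4))

def p_j_interleaved():
    """Approach 2: push each sum next to the factors that depend on it."""
    total = 0.0
    for a in (True, False):
        sum_m = sum(cond(P_M, m, a) for m in (True, False))   # always 1
        sum_b = sum(P_B[b] * sum(cond(P_A, a, b, e) * P_E[e]
                                 for e in (True, False))
                    for b in (True, False))
        total += cond(P_J, True, a) * sum_m * sum_b
    return total

def p_b_and_j(b):
    """P(B = b, J = T): same nesting, but B is fixed instead of summed out,
    so the inner sums over E and M are exactly the ones used for P(J = T)."""
    return sum(cond(P_J, True, a)
               * sum(cond(P_M, m, a) for m in (True, False))
               * P_B[b]
               * sum(cond(P_A, a, b, e) * P_E[e] for e in (True, False))
               for a in (True, False))

# Diagnostic query from the slides: P(B = T | J = T) = P(B = T, J = T) / P(J = T)
posterior = p_b_and_j(True) / p_j_interleaved()
```

Both routines return the same number; only the count of additions and multiplications differs, which is the whole point of Approach 2.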
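The variable-elimination idea from the last slide can likewise be sketched as explicit factor operations: repeatedly multiply the factors that mention a variable, then sum that variable out, one variable at a time. This is a minimal sketch using the same assumed CPT numbers as above; `multiply` and `sum_out` are hypothetical helpers, not code from the lecture.

```python
from itertools import product

def multiply(f1, f2):
    """Pointwise product of two factors over the union of their variables.
    A factor is (vars, table), where table maps value tuples to floats."""
    v1, t1 = f1
    v2, t2 = f2
    vs = v1 + tuple(v for v in v2 if v not in v1)
    table = {}
    for assign in product((True, False), repeat=len(vs)):
        env = dict(zip(vs, assign))
        table[assign] = (t1[tuple(env[v] for v in v1)]
                         * t2[tuple(env[v] for v in v2)])
    return (vs, table)

def sum_out(f, var):
    """Eliminate one variable from a factor by summing over its values."""
    vs, t = f
    i = vs.index(var)
    out = {}
    for assign, val in t.items():
        key = assign[:i] + assign[i + 1:]
        out[key] = out.get(key, 0.0) + val
    return (vs[:i] + vs[i + 1:], out)

# Factors for the alarm network (CPT numbers are illustrative assumptions).
fB = (('B',), {(True,): 0.001, (False,): 0.999})
fE = (('E',), {(True,): 0.002, (False,): 0.998})
fA = (('A', 'B', 'E'),
      {(a, b, e): (p if a else 1 - p)
       for (b, e), p in {(True, True): 0.95, (True, False): 0.94,
                         (False, True): 0.29, (False, False): 0.001}.items()
       for a in (True, False)})
fJ = (('J', 'A'), {(j, a): (p if j else 1 - p)
                   for a, p in {True: 0.90, False: 0.05}.items()
                   for j in (True, False)})
fM = (('M', 'A'), {(m, a): (p if m else 1 - p)
                   for a, p in {True: 0.70, False: 0.01}.items()
                   for m in (True, False)})

# Query P(J): eliminate M, E, B, A one at a time (one of several valid orders).
g_m  = sum_out(fM, 'M')                       # Σ_m P(m|a), trivially 1
g_be = sum_out(multiply(fA, fE), 'E')         # Σ_e P(a|b,e) P(e), over (A, B)
g_b  = sum_out(multiply(g_be, fB), 'B')       # Σ_b P(b) Σ_e ..., over (A,)
f_j  = sum_out(multiply(multiply(fJ, g_m), g_b), 'A')   # factor over (J,)
p_j_true = f_j[1][(True,)]
```

Each `sum_out` removes one variable, so the intermediate factors stay small; choosing a good elimination order is what keeps them small in larger networks.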

