CMU CS 10708 - Structure Learning in BNs 3

Structure Learning in BNs 3: (the good, the bad,) and, finally, the ugly: Inference (we now get to use the BNs!!)
Graphical Models – 10-708
Carlos Guestrin, Carnegie Mellon University
October 6th, 2006
Readings: K&F 15.1, 15.2, 15.3, 15.4, 15.5; K&F 7 (overview of inference); K&F 8.1, 8.2 (variable elimination)

Outline
- Bayesian, a decomposable score
- Structure learning for general graphs
- Understanding score decomposition
- Fixed variable order 1, 2
- Learn BN structure using local search
- Exploit score decomposition in local search
- Some experiments
- Order search versus graph search
- Bayesian model averaging
- What you need to know about learning BN structures
- Announcements
- Inference in graphical models: typical queries 1, 2 (maximization)
- Are MPE and MAP consistent?
- Complexity of conditional probability queries 1, 2
- Inference is #P-hard, hopeless?
- Complexity of other inference questions
- General probabilistic inference; marginalization
- Probabilistic inference example; fast probabilistic inference via variable elimination
- Understanding variable elimination: exploiting distributivity; order can make a HUGE difference; intermediate results; another example
- Pruning irrelevant variables
- Variable elimination algorithm; operations on factors
- Complexity of VE: first analysis; (poly)-tree graphs
- What you need to know about inference thus far

Bayesian, a decomposable score
As in the last lecture, assume:
- local and global parameter independence;
- the prior satisfies parameter modularity: if Xi has the same parents in G and G', then its parameters have the same prior;
- the structure prior P(G) satisfies structure modularity: it is a product of terms over families, e.g., P(G) ∝ c^|G|.
Under these assumptions, the Bayesian score decomposes along families!

Structure learning for general graphs
In a tree, a node has at most one parent.
Theorem: the problem of learning a BN structure with at most d parents is NP-hard for any (fixed) d ≥ 2.
Most structure-learning approaches therefore use heuristics that exploit score decomposition. The next slides (quickly) describe two heuristics that exploit decomposition in different ways.

Understanding score decomposition
[Figure: example BN over Difficulty, SAT, Grade, Happy, Job, Coherence, Letter, Intelligence.]

Fixed variable order 1
- Pick a variable order ≺, e.g., X1, …, Xn.
- Xi may only pick parents from {X1, …, Xi−1} (any subset); acyclicity guaranteed!
- Total score = sum of the scores of each node.

Fixed variable order 2
- Fix the maximum number of parents to k.
- For each i in order ≺, pick Pa(Xi) ⊆ {X1, …, Xi−1} by exhaustively searching through all possible subsets: Pa(Xi) is the U ⊆ {X1, …, Xi−1} maximizing FamScore(Xi | U : D).
- This gives the optimal BN for each order!
- Greedy search through the space of orders: e.g., try switching pairs of variables in the order. If neighboring variables in the order are switched, only the scores for that pair need to be recomputed, giving an O(n) speedup per iteration.

Learn BN structure using local search
- Start from the Chow-Liu tree.
- Local search; possible moves (only if the result is acyclic!): add edge, delete edge, reverse edge.
- Select moves using your favorite score.

Exploit score decomposition in local search
- Add edge and delete edge: only one family needs rescoring!
- Reverse edge: only two families need rescoring.
[Figure: same example BN as above.]

Some experiments
[Figure: results on the Alarm network.]

Order search versus graph search
Order search advantages:
- For a fixed order, we get the optimal BN, a more "global" optimization.
- The space of orders is much smaller than the space of graphs.
Graph search advantages:
- Not restricted to k parents, especially when exploiting CPD structure such as CSI.
- Cheaper per iteration.
- Finer moves within a graph.

Bayesian model averaging
So far, we have selected a single structure. But if you are really Bayesian, you must average over structures, similar to averaging over parameters. Inference for structure averaging is very hard! (Clever tricks in the reading.)

What you need to know about learning BN structures
- Decomposable scores: data likelihood (with an information-theoretic interpretation); Bayesian; the BIC approximation.
- Priors: structure and parameter assumptions; BDe if and only if score equivalence.
- Best tree (Chow-Liu); best TAN; nearly best k-treewidth (in O(N^(k+1))).
- Search techniques: search through orders; search through structures.
- Bayesian model averaging.

Announcements
- Don't forget: project proposals are due this Wednesday.
- Special recitation on an advanced topic: Ajit Singh on optimal structure learning, Monday Oct 9, 5:30–7:00pm, Wean Hall 4615A.

Inference in graphical models: typical queries 1
[Figure: example BN over Flu, Allergy, Sinus, Headache, Nose.]
Conditional probabilities: the distribution of some variable(s) given evidence.

Inference in graphical models: typical queries 2 – maximization
- Most probable explanation (MPE): the most likely assignment to all hidden variables, given the evidence.
- Maximum a posteriori (MAP): the most likely assignment to some variable(s), given the evidence.

Are MPE and MAP consistent?
[Figure: two-node network Sinus → Nose, with P(S=t) = 0.4, P(S=f) = 0.6, and a CPT P(N|S).]

Complexity of conditional probability queries 1
How hard is it to compute P(X | E=e)? Reduction from 3-SAT, over clauses such as (X1 ∨ X2 ∨ X3) ∧ (X2 ∨ X3 ∨ X4) ∧ …

Complexity of conditional probability queries 2
How hard is it to compute P(X | E=e)? At least NP-hard, but in fact even harder!

Inference is #P-hard, hopeless?
Exploit structure! Inference is hard in general, but easy for many (real-world relevant) BNs.
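The fixed-variable-order procedure described above (for each node, exhaustively score every parent subset drawn from its predecessors in the order) can be sketched as follows. This is a sketch, not the lecture's implementation: `fam_score` is a hypothetical stand-in for FamScore(Xi | U : D), to be supplied by the caller.

```python
from itertools import combinations

def best_parents_for_order(order, fam_score, max_parents):
    """For a fixed variable order, pick each node's parent set by
    exhaustively scoring all subsets of its predecessors (up to
    max_parents). Acyclicity is guaranteed because every parent
    precedes its child in the order."""
    parents = {}
    total = 0.0
    for i, x in enumerate(order):
        candidates = order[:i]  # only predecessors in the order
        best_u, best_s = (), float("-inf")
        for k in range(min(max_parents, len(candidates)) + 1):
            for u in combinations(candidates, k):
                s = fam_score(x, u)
                if s > best_s:
                    best_u, best_s = u, s
        parents[x] = best_u
        total += best_s  # decomposable: total score = sum over families
    return parents, total
```

With a decomposable score this returns the optimal network for the given order; the outer greedy search over orders from the slides would call this (or the O(n) incremental variant) once per candidate order.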
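The rescoring point made for local search (add/delete touches one family, reverse touches two) can be sketched as a delta-score computation; with a decomposable score, a move's effect is the score change of only the affected families. `score_family` here is a hypothetical stand-in for any decomposable family score, and the acyclicity check is left to the caller, as on the slides.

```python
def move_delta(score_family, parents, move, x, y):
    """Score change for a local-search move on edge x -> y.
    `parents` maps node -> frozenset of current parents.
    Caller must separately verify the move keeps the graph acyclic."""
    if move == "add":      # only y's family changes
        return score_family(y, parents[y] | {x}) - score_family(y, parents[y])
    if move == "delete":   # only y's family changes
        return score_family(y, parents[y] - {x}) - score_family(y, parents[y])
    if move == "reverse":  # both y's and x's families change
        return (score_family(y, parents[y] - {x}) - score_family(y, parents[y])
                + score_family(x, parents[x] | {y}) - score_family(x, parents[x]))
    raise ValueError(move)
```

A hill-climbing loop would evaluate `move_delta` for every legal move and apply the best positive one, never rescoring untouched families.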
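The preview omits the P(N|S) values on the MPE/MAP slide, but the point of that slide can be reproduced with assumed numbers (the CPT below is hypothetical, chosen only for illustration): the MAP assignment to N alone can disagree with N's value in the MPE over (S, N).

```python
# P(S) is from the slide; the P(N|S) values are made-up stand-ins.
p_s = {"t": 0.4, "f": 0.6}
p_n_given_s = {("t", "t"): 0.9, ("f", "t"): 0.1,   # P(N | S=t)
               ("t", "f"): 0.3, ("f", "f"): 0.7}   # P(N | S=f)

# MPE: most likely joint assignment to (S, N).
joint = {(s, n): p_s[s] * p_n_given_s[(n, s)] for s in "tf" for n in "tf"}
mpe = max(joint, key=joint.get)

# MAP for N alone: maximize the marginal P(N) = sum_S P(S) P(N | S).
p_n = {n: sum(joint[(s, n)] for s in "tf") for n in "tf"}
map_n = max(p_n, key=p_n.get)

print(mpe, map_n)  # MPE sets N="f", while MAP for N alone is "t"
```

With these numbers, the MPE is (S=f, N=f) with probability 0.42, yet P(N=t) = 0.36 + 0.18 = 0.54, so MPE and MAP are not consistent.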
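The variable-elimination slides listed in the outline are not included in this preview, but the "exploiting distributivity" idea they name can be illustrated on a toy chain A → B → C with binary variables (all CPT numbers below are made up): pushing the sum over A inward produces a small intermediate factor instead of enumerating the full joint.

```python
import itertools

# Toy chain A -> B -> C; CPTs are hypothetical numbers.
pA = {0: 0.6, 1: 0.4}
pB = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # P(B=b | A=a), keyed (b, a)
pC = {(0, 0): 0.5, (1, 0): 0.5, (0, 1): 0.1, (1, 1): 0.9}  # P(C=c | B=b), keyed (c, b)

# Naive marginalization: sum the full joint, O(2^n) terms in general.
pC_naive = {c: sum(pA[a] * pB[(b, a)] * pC[(c, b)]
                   for a, b in itertools.product((0, 1), repeat=2))
            for c in (0, 1)}

# Variable elimination: distributivity lets us sum out A first,
# yielding the intermediate factor tau(B) = sum_a P(a) P(B=b | a).
tau = {b: sum(pA[a] * pB[(b, a)] for a in (0, 1)) for b in (0, 1)}
pC_ve = {c: sum(tau[b] * pC[(c, b)] for b in (0, 1)) for c in (0, 1)}

assert all(abs(pC_naive[c] - pC_ve[c]) < 1e-12 for c in (0, 1))
```

On a chain of n binary variables the same idea replaces 2^n joint terms with n small pairwise sums, which is the structure-exploiting escape from #P-hardness that the final slide alludes to.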

