
Kansas State University
Department of Computing and Information Sciences
CIS 830: Advanced Topics in Artificial Intelligence
Monday, March 27, 2000
William H. Hsu
Department of Computing and Information Sciences, KSU
http://www.cis.ksu.edu/~bhsu

Readings:
"Symbolic Causal Networks for Reasoning about Actions and Plans", Darwiche and Pearl
(Reference) Chapter 15, Russell and Norvig

Lecture 27: Uncertain Reasoning Discussion (4 of 4); KDD and Data Mining Overview

Lecture Outline
• Readings
  – Chapter 15, Russell and Norvig
  – References
    • Chapters 14-17, Russell and Norvig
    • Chapter 6, Mitchell
    • Pearl and Verma paper
    • Tutorials (Heckerman; Friedman and Goldszmidt)
• Bayesian Belief Networks (BBNs) Concluded
  – Inference: applying CPTs
  – Learning: CPTs from data, elicitation
  – In-class demo: Hugin (CPT elicitation, application)
• Causal Discovery and BBN Structure Learning
• KDD and Machine Learning Resources
• Next Class: First KDD Presentation

Bayesian Networks: Quick Review
• Recall: Conditional Independence (CI) Assumptions
• Bayesian Network: Digraph Model
  – Vertices (nodes): denote events (each a random variable)
  – Edges (arcs, links): denote conditional dependencies
• Chain Rule for (Exact) Inference in BBNs
    P(X1, X2, …, Xn) = ∏ i=1..n P(Xi | parents(Xi))
  – Arbitrary Bayesian networks: NP-complete
  – Polytrees: linear time
• Example ("Sprinkler" BBN)
  – X1 = Season: Spring, Summer, Fall, Winter
  – X2 = Sprinkler: On, Off
  – X3 = Rain: None, Drizzle, Steady, Downpour
  – X4 = Ground-Moisture: Wet, Dry
  – X5 = Ground-Slipperiness: Slippery, Not-Slippery
  – P(Summer, Off, Drizzle, Wet, Not-Slippery)
      = P(S) · P(O | S) · P(D | S) · P(W | O, D) · P(N | W)
• MAP, ML Estimation over BBNs
    hML = argmax h∈H P(D | h)
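The chain-rule factorization on the "Sprinkler" network can be sketched in a few lines of Python. All CPT values below are illustrative placeholders (only the entries needed for this one query), not elicited numbers:

```python
# Chain rule on the "Sprinkler" BBN: the joint is the product of one CPT
# entry per variable, conditioned on that variable's parents.
# CPT values are made-up placeholders for this single query.

cpts = {
    "Season":    {("Summer",): 0.25},                # P(Season)
    "Sprinkler": {("Off", "Summer"): 0.4},           # P(Sprinkler | Season)
    "Rain":      {("Drizzle", "Summer"): 0.1},       # P(Rain | Season)
    "Moisture":  {("Wet", "Off", "Drizzle"): 0.8},   # P(Moisture | Sprinkler, Rain)
    "Slippery":  {("Not-Slippery", "Wet"): 0.3},     # P(Slipperiness | Moisture)
}

def joint(a):
    """P(x1..x5) = prod_i P(xi | parents(xi)), one CPT lookup per variable."""
    p = cpts["Season"][(a["Season"],)]
    p *= cpts["Sprinkler"][(a["Sprinkler"], a["Season"])]
    p *= cpts["Rain"][(a["Rain"], a["Season"])]
    p *= cpts["Moisture"][(a["Moisture"], a["Sprinkler"], a["Rain"])]
    p *= cpts["Slippery"][(a["Slippery"], a["Moisture"])]
    return p

x = {"Season": "Summer", "Sprinkler": "Off", "Rain": "Drizzle",
     "Moisture": "Wet", "Slippery": "Not-Slippery"}
print(joint(x))  # 0.25 * 0.4 * 0.1 * 0.8 * 0.3 = 0.0024
```

With real CPTs the same five lookups would compute any full joint assignment; marginal and conditional queries require summing such products, which is where the NP-completeness of general exact inference comes from.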
Learning Structure: State Space Search and Causal Discovery
• Learning Structure: Beyond Trees
  – The problem is not as easy for more complex networks
    • Example: allowing two parents (even the singly connected case, aka polytree)
    • Greedy algorithms are no longer guaranteed to find an optimal network
    • In fact, no efficient algorithm exists
  – Theorem: finding the network structure with maximal score, where H is restricted to BBNs with at most k parents for each variable, is NP-hard for k > 1
• Heuristic (Score-Based) Search of Hypothesis Space H
  – Define H: elements denote possible structures; the adjacency relation denotes transformations (e.g., arc addition, deletion, reversal)
  – Traverse this space looking for high-scoring structures
  – Algorithms: greedy hill-climbing, best-first search, simulated annealing
• Causal Discovery: Inferring Existence, Direction of Causal Relationships
  – Want: "no unexplained correlations; no accidental independencies" (cause ∧ CI)
  – Can we discover causality from observational data alone?
  – What is causality, anyway?

Hugin Demo
• Hugin
  – Commercial product for BBN inference: http://www.hugin.com
  – First developed at the University of Aalborg, Denmark
• Applications
  – Popular research tool for inference and learning
  – Used for real-world decision support applications
    • Safety and risk evaluation: http://www.hugin.com/serene/
    • Diagnosis and control in unmanned subs: http://advocate.e-motive.com
    • Customer support automation: http://www.cs.auc.dk/research/DSS/SACSO/
• Capabilities
  – Lauritzen-Spiegelhalter algorithm for inference (clustering, aka clique reduction)
  – Object-Oriented Bayesian Networks (OOBNs): structured learning and inference
  – Influence diagrams for decision-theoretic inference (utility + probability)
  – See: http://www.hugin.com/doc.html
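The search described above, with arc addition, deletion, and reversal as the moves and an acyclicity check on each candidate, can be sketched as greedy hill-climbing. The scoring function below is a toy stand-in (it rewards a hypothetical "true" arc set); a real learner would score structures against data with a metric such as BIC or a Bayesian (Dirichlet) score:

```python
# Sketch of greedy hill-climbing over BBN structures (sets of directed arcs).
# Moves: arc addition, deletion, reversal; cyclic candidates are rejected.
from itertools import permutations

def is_acyclic(nodes, arcs):
    """DFS cycle check on the digraph given by `arcs` (pairs (parent, child))."""
    children = {n: [c for (p, c) in arcs if p == n] for n in nodes}
    state = {n: 0 for n in nodes}  # 0 = unvisited, 1 = on stack, 2 = done
    def dfs(n):
        state[n] = 1
        for c in children[n]:
            if state[c] == 1 or (state[c] == 0 and not dfs(c)):
                return False  # back edge => cycle
        state[n] = 2
        return True
    return all(state[n] != 0 or dfs(n) for n in nodes)

def neighbors(nodes, arcs):
    """All structures one move away: add, delete, or reverse an arc."""
    for p, c in permutations(nodes, 2):
        if (p, c) not in arcs and is_acyclic(nodes, arcs | {(p, c)}):
            yield arcs | {(p, c)}
    for a in arcs:
        yield arcs - {a}                      # deletion never creates a cycle
        rev = (a[1], a[0])
        if is_acyclic(nodes, (arcs - {a}) | {rev}):
            yield (arcs - {a}) | {rev}        # reversal

def hill_climb(nodes, score, start=frozenset()):
    current, best = start, score(start)
    while True:
        improved = False
        for cand in neighbors(nodes, current):
            s = score(frozenset(cand))
            if s > best:
                current, best, improved = frozenset(cand), s, True
        if not improved:
            return current, best

# Toy score: reward (hypothetical) true arcs, penalize spurious ones.
TRUE = {("A", "B"), ("B", "C")}
score = lambda arcs: len(arcs & TRUE) - 0.5 * len(arcs - TRUE)
structure, s = hill_climb(["A", "B", "C"], score)
print(sorted(structure), s)
```

Because the score has no local maxima here, greedy search recovers the target; with realistic data-driven scores it can stall at local optima, which is why the slide also lists best-first search and simulated annealing.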
In-Class Exercise: Hugin and CPT Elicitation
• Hugin Tutorials
  – Introduction: causal reasoning for diagnosis in decision support (toy problem)
    • http://www.hugin.com/hugintro/bbn_pane.html
    • Example domain: explaining low yield (drought versus disease)
  – Tutorial 1: constructing a simple BBN in Hugin
    • http://www.hugin.com/hugintro/bbn_tu_pane.html
    • Eliciting CPTs (or collecting them from data) and entering them
  – Tutorial 2: constructing a simple influence diagram (decision network) in Hugin
    • http://www.hugin.com/hugintro/id_tu_pane.html
    • Eliciting utilities (or collecting them from data) and entering them
• Other Important BBN Resources
  – Microsoft Bayesian Networks: http://www.research.microsoft.com/dtas/msbn/
  – XML BN (Interchange Format): http://www.research.microsoft.com/dtas/bnformat/
  – BBN Repository (more data sets): http://www-nt.cs.berkeley.edu/home/nir/public_html/Repository/index.htm

Bayesian Knowledge Discoverer (BKD) Demo
• Bayesian Knowledge Discoverer (BKD)
  – Research product for BBN structure learning: http://kmi.open.ac.uk/projects/bkd/
  – Bayesian Knowledge Discovery Project [Ramoni and Sebastiani, 1997]
    • Knowledge Media Institute (KMI), Open University, United Kingdom
    • Closed source; beta freely available for educational use
  – Handles missing data
  – Uses Branch and Collapse, a Dirichlet score-based BOC approximation algorithm: http://kmi.open.ac.uk/techreports/papers/kmi-tr-41.ps.gz
• Sister Product: Robust Bayesian Classifier (RoC)
  – Research product for BBN-based classification with missing data: http://kmi.open.ac.uk/projects/bkd/pages/roc.html
  – Uses the Robust Bayesian Estimator, a deterministic approximation algorithm: http://kmi.open.ac.uk/techreports/papers/kmi-tr-79.ps.gz

Using ANN, BBN, GA, and ML Tools for KDD
• Learning
  – Bayesian belief networks (BBNs)
    • R. Neal's DELVE, MCMC library (University of Toronto)
    • Commercial tools:
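"Collecting CPTs from data," as in Tutorial 1, and the Dirichlet scores used by BKD both rest on counting: each CPT entry is estimated from how often a value co-occurs with each parent configuration, smoothed by a Dirichlet prior so unseen cases do not get probability zero. A minimal sketch of that counting step, with made-up observations for the sprinkler domain:

```python
# Sketch: estimating P(Moisture | Sprinkler, Rain) from (made-up) data
# with Dirichlet (Laplace) smoothing of strength alpha.
from collections import Counter

# Hypothetical (Sprinkler, Rain, Moisture) observations
data = [("Off", "Drizzle", "Wet")] * 6 + [("Off", "Drizzle", "Dry")] * 2 + \
       [("On", "None", "Wet")] * 3 + [("On", "None", "Dry")] * 1

def cpt_moisture(data, values=("Wet", "Dry"), alpha=1.0):
    """For each observed parent configuration (Sprinkler, Rain), return
    smoothed estimates P(Moisture = v | parents) = (count + alpha) / total."""
    counts = Counter(data)
    parent_configs = {(s, r) for (s, r, _) in data}
    cpt = {}
    for pa in parent_configs:
        total = sum(counts[pa + (v,)] for v in values) + alpha * len(values)
        for v in values:
            cpt[pa + (v,)] = (counts[pa + (v,)] + alpha) / total
    return cpt

cpt = cpt_moisture(data)
print(cpt[("Off", "Drizzle", "Wet")])  # (6 + 1) / (8 + 2) = 0.7
```

Tools like BKD extend this idea to missing data and to scoring whole structures, but the per-family counting with Dirichlet priors is the common core.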

