# K-State CIS 798 - Multimodal Information Access and Synthesis (12 pages)


## Multimodal Information Access and Synthesis


- Pages: 12
- School: Kansas State University
- Course: CIS 798 - Top/Computer Science - Top/Cyber Defense Basics


Data Sciences Summer Institute: Multimodal Information Access and Synthesis
**Learning and Reasoning with Graphical Models of Probability for the Identity Uncertainty Problem**
William H. Hsu, Tuesday 29 May 2007
Laboratory for Knowledge Discovery in Databases, Kansas State University, http://www.kddresearch.org (KSU-CIS-DSSI-MIAS-SRL-20070529.ppt)
DSSI-MIAS, University of Illinois at Urbana-Champaign; Computing & Information Sciences, Kansas State University

### Part 1 of 8: Graphical Models Intro and Overview

- Graphical models of probability
  - Markov graphs
  - Bayesian (belief) networks: causal semantics
  - Direction-dependent separation (d-separation) property
- Learning and reasoning: problems and algorithms
  - Inference, exact and approximate
    - Junction tree (Lauritzen and Spiegelhalter, 1988)
    - Bounded loop cutset conditioning (Horvitz and Cooper, 1989)
    - Variable elimination (Dechter, 1996)
  - Structure learning
    - K2 algorithm (Cooper and Herskovits, 1992)
    - Variable ordering problem (Larrañaga, 1996; Hsu et al., 2002)
- Probabilistic reasoning in machine learning and data mining
- Current research and open problems

### Stages of Data Mining

(Figure: stages of the data mining process, adapted from Fayyad, Piatetsky-Shapiro, and Smyth, 1996.)

### Graphical Models Defined (1): Independence and Bayes Nets

- Conditional independence: X is conditionally independent (CI) of Y given Z, sometimes written X ⫫ Y | Z, iff P(X | Y, Z) = P(X | Z) for all values of X, Y, and Z
  - Example: P(Thunder | Rain, Lightning) = P(Thunder | Lightning), i.e., T ⫫ R | L
- Bayesian belief network: an acyclic directed graph model B = (V, E, Θ) representing CI assertions over Θ
  - Vertices (nodes) V: denote events, each a random variable
  - Edges (arcs, links) E: denote conditional dependencies
- Markov condition for BBNs (chain rule): P(X1, X2, …, Xn) = Π (i = 1 to n) P(Xi | parents(Xi)); each node is independent of its non-descendants given its parents
- Example BBN (lung cancer network): Age (X1), Gender (X2), Exposure to Toxins (X3), Smoking (X4), Cancer (X5), Serum Calcium (X6), Lung Tumor (X7)
- Example joint query: P(20s, Female, Low, Non-Smoker, No Cancer, Negative, Negative) = P(T) P(F) P(L | T) P(N | T, F) P(N | L, N) P(N | N) P(N | N)
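The chain-rule factorization in the example query can be sketched in a few lines of code. The factorization pattern comes from the slide; every CPT entry below is a hypothetical number invented for illustration.

```python
# Chain-rule factorization for the example BBN:
# P(X1,...,X7) = P(X1) P(X2) P(X3|X1) P(X4|X1,X2) P(X5|X3,X4) P(X6|X5) P(X7|X5)
# All CPT values below are made-up illustrative numbers, not from the slides.

cpts = {
    "Age=20s": 0.2,                                # P(X1 = 20s)
    "Gender=F": 0.5,                               # P(X2 = Female)
    "Exposure=Low | Age=20s": 0.7,                 # P(X3 | X1)
    "Smoking=No | Age=20s, Gender=F": 0.8,         # P(X4 | X1, X2)
    "Cancer=No | Exposure=Low, Smoking=No": 0.95,  # P(X5 | X3, X4)
    "SerumCa=Negative | Cancer=No": 0.9,           # P(X6 | X5)
    "Tumor=Negative | Cancer=No": 0.98,            # P(X7 | X5)
}

def joint_probability(factors):
    """Chain rule: multiply one CPT entry per variable."""
    p = 1.0
    for value in factors.values():
        p *= value
    return p

print(round(joint_probability(cpts), 7))  # 0.0469224
```

Each variable contributes exactly one factor, conditioned only on its parents; this is what makes the joint distribution of a large BBN tractable to represent.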
### Graphical Models Defined (2): D-Separation and Markov Blankets

- Motivation: the conditional independence status of nodes within a BBN may change as the availability of evidence E changes
- Direction-dependent separation (d-separation) is a technique used to determine the conditional independence of nodes as evidence changes
- Definition: a set of evidence nodes E d-separates two sets of nodes X and Y iff every undirected path from a node in X to a node in Y is blocked given E
- A path is blocked if one of three conditions holds at some node Z on it: the path passes head-to-tail through Z and Z ∈ E; the path passes tail-to-tail through Z and Z ∈ E; or the path passes head-to-head through Z and neither Z nor any of its descendants is in E

(Figure: the three blocking cases, from S. Russell and P. Norvig, 1995; adapted from J. Schlabach, 1996.)

### Graphical Models Defined (3): Reasoning with Bayes Nets

- Multiply connected case: exact and approximate inference are #P-complete (adapted from slides by S. Russell, UC Berkeley, http://aima.cs.berkeley.edu)

### Bayesian Network Applications (1): Time Series Prediction

- Goal: estimate P(Xi(t) | y(1), …, y(r)) (adapted from Murphy, 2001, and Guo, 2002)
- Filtering (r = t): infer the current state from observations
  - Applications: signal identification; variation: the Viterbi algorithm
- Prediction (r < t): infer a future state
  - Applications: prognostics
- Smoothing (r > t): infer a past hidden state
  - Applications: signal enhancement
- CF tasks: plan recognition by smoothing; prediction (cf. WebCANVAS, Cadez et al., 2000)

### Bayesian Network Applications (2): Bayes Optimal Classification

- General case: BBN structure learning; use inference to compute scores
- Optimal strategy: Bayesian model averaging
  - Assumption: models h ∈ H are mutually exclusive and exhaustive
  - Combine the predictions of the models in proportion to the posterior probability of each hypothesis h given the observed data D, i.e., compute the expectation over the unknown h for unseen cases
  - Let h ≡ the structure and the parameters Θ ≡ the CPTs
  - P(x(m+1) | D) = P(x(m+1) | x1, x2, …, xm) = Σ (h ∈ H) P(x(m+1) | D, h) P(h | D)
- Posterior score = marginal likelihood × prior over structures
  - P(h | D) ∝ P(D | h) P(h), where the marginal likelihood is P(D | h) = ∫ P(D | Θ, h) P(Θ | h) dΘ
  - P(h): prior over structures; P(Θ | h): prior over parameters; P(D | Θ, h): likelihood

### Inference in Bayesian Networks: Loop Cutset Conditioning

- Split a vertex in an undirected cycle and condition upon each of its state values
  - Example: in the lung cancer network, split Age into Age 0-10 (X1,1), Age 10-20 (X1,2), …, Age 100 (X1,10)
- Number of network instantiations: the product of the arities of the nodes in the minimal loop cutset
- Posterior: each marginal is conditioned upon the cutset variable values
- Deciding the optimal cutset is NP-hard
- Current open problems
  - Bounded cutset conditioning: ordering heuristics
  - Finding randomized algorithms for loop cutset optimization

### Novel Contributions (3): Learning in Graphical Models

- Dynamic Bayes net for prediction
- Continuing work
  - Speeding up approximate inference using edge deletion (J. Thornton, 2005)
  - Bayesian Network tools in Java (BNJ) v4 (W. Hsu, J. M. Barber, J. Thornton, 2006)

### Bayesian Network tools in Java (BNJ) v4

- © 2005 KSU Bayesian Network tools in Java (BNJ) Development Team
- (Screenshot: the ALARM network in BNJ.)

### Questions and Discussion
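The Bayesian model averaging sum, P(x | D) = Σ P(x | D, h) P(h | D), can be sketched as follows. The two candidate structures and all probability values are hypothetical toy numbers, not from the slides; only the weighting scheme is from the source.

```python
# Bayesian model averaging: weight each model's prediction for the next
# case x by that model's posterior probability given the data D.
# h1, h2 and all values below are hypothetical toy numbers.

posteriors = {"h1": 0.7, "h2": 0.3}   # P(h | D); must sum to 1
predictions = {"h1": 0.9, "h2": 0.4}  # P(x | D, h) for each candidate model

def model_average(posteriors, predictions):
    """P(x | D) = sum over h of P(x | D, h) * P(h | D)."""
    return sum(predictions[h] * posteriors[h] for h in posteriors)

print(round(model_average(posteriors, predictions), 6))  # 0.75
```

Because the hypotheses are assumed mutually exclusive and exhaustive, the weighted sum is itself a valid probability, which is what makes this strategy Bayes-optimal.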

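Loop cutset conditioning can likewise be sketched. The cutset arities, the uniform instantiation weights, and the constant toy marginal below are all hypothetical; the per-instantiation polytree inference run is abstracted away.

```python
from itertools import product
from math import prod

# Loop cutset conditioning: instantiate the cutset variables, run exact
# (polytree) inference once per instantiation, then mix the conditional
# marginals by the weight of each instantiation.
# Arities, weights, and marginals are hypothetical illustrations.

cutset_arities = [10, 2]            # e.g. Age (10 ranges), Smoking (2 values)
n_instantiations = prod(cutset_arities)
print(n_instantiations)             # 10 * 2 = 20 inference runs needed

# P(X | E) = sum over cutset instantiations c of P(X | E, c) * P(c | E)
weights = {c: 1.0 / n_instantiations
           for c in product(*map(range, cutset_arities))}
marginals = {c: 0.5 for c in weights}   # toy stand-in for P(X | E, c)
p_x_given_e = sum(marginals[c] * weights[c] for c in weights)
print(round(p_x_given_e, 6))        # 0.5, since every toy marginal is 0.5
```

The first print illustrates why the method is exponential in the cutset size: the instantiation count is the product of the cutset arities, which is why finding a small (minimal) loop cutset matters and why deciding the optimal cutset being NP-hard is a practical obstacle.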