Pitt CS 2750 - Learning Bayesian belief networks


CS 2750 Machine Learning, Lecture 14
Learning Bayesian belief networks
Milos Hauskrecht, [email protected], 5329 Sennott Square

Administration
• Midterm: Monday, March 17, 2003
  – In class
  – Closed book
  – Material covered by Wednesday, March 12, including learning the parameters of BBNs but not structure learning
  – Last year's midterm is posted on the web
• No new homework

Learning probability distributions
Basic settings:
• A set of random variables X = {X_1, X_2, ..., X_n}
• A model of the distribution over the variables in X with parameters Θ
• Data D = {D_1, D_2, ..., D_N}
Objective: find the parameters Θ̂ that describe the data best.
Learning Bayesian belief networks:
– parameterizations as defined by the structure of the network

Learning of BBN
Learning:
• Learning of the parameters of the conditional probabilities
• Learning of the network structure
Variables:
• Observable – values present in every data sample
• Hidden – their values are never observed in the data
• Missing values – values sometimes present, sometimes not
Next: all variables are observable.
1. Learning of the parameters of a BBN
2. Learning of the model (BBN structure)

Learning of parameters of BBN
• Idea: decompose the estimation problem for the full joint over a large number of variables into a set of smaller estimation problems corresponding to parent-variable conditionals.
• Example: assume A, E, B are binary with values True and False, and B and E are the parents of A. Learning P(A | B, E) then splits into 4 estimation problems:
  P(A | B=T, E=T), P(A | B=T, E=F), P(A | B=F, E=T), P(A | B=F, E=F)
• Assumption that enables the decomposition: the parameters of the conditional distributions are independent.

Estimates of parameters of BBN
• Two assumptions permit the decomposition:
  – Sample independence: P(D | Θ, ξ) = \prod_{u=1}^{N} P(D_u | Θ, ξ)
  – Parameter independence: p(D | Θ, ξ) = \prod_{i=1}^{n} \prod_{j=1}^{q_i} p(D | θ_{ij}, ξ)
    (j ranges over the q_i assignments of values to the parents of X_i)
• The parameters of each conditional (one for every assignment of values to the parent variables) can be learned independently.

Learning of BBN parameters. Example.
Network: Pneumonia is the parent of Fever, Cough, Paleness, and High WBC.
Parameters to learn (all CPT entries are initially unknown, marked '?'):
P(Pneumonia), P(Fever | Pneum), P(Cough | Pneum), P(Palen | Pneum), P(HWBC | Pneum)

Data D (different patient cases):
Pal  Fev  Cou  HWB  Pneu
T    T    T    T    F
T    F    F    F    F
F    F    T    T    T
F    F    T    F    T
F    T    T    T    T
T    F    T    F    F
F    F    F    F    F
T    T    F    F    F
T    T    T    T    T
F    T    F    T    T
T    F    F    T    F
F    T    F    F    F

Estimates of parameters of BBN
• Much like multiple coin-toss or dice-roll problems.
• A "smaller" learning problem corresponds to learning exactly one conditional distribution, e.g. P(Fever | Pneumonia = T).
• Problem: how do we pick the data to learn from?
• Answer:
  1. Select the data points with Pneumonia = T (ignore the rest).
  2. Focus on (select) only the values of the random variable defining the distribution (Fever).
  3. Learn the parameters of the conditional the same way as we learned the parameters of the biased coin or dice.
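The three-step procedure above is essentially filter-and-count. The following is a minimal Python sketch of it, not from the lecture; the variable names and code structure are illustrative, and the data are the 12 patient cases from the table above.

# Minimal sketch (not from the lecture) of the filter-and-count procedure
# for the ML estimate of P(Fever | Pneumonia = T).

# Each case: (Paleness, Fever, Cough, HighWBC, Pneumonia) as True/False
data = [
    (True,  True,  True,  True,  False),
    (True,  False, False, False, False),
    (False, False, True,  True,  True),
    (False, False, True,  False, True),
    (False, True,  True,  True,  True),
    (True,  False, True,  False, False),
    (False, False, False, False, False),
    (True,  True,  False, False, False),
    (True,  True,  True,  True,  True),
    (False, True,  False, True,  True),
    (True,  False, False, True,  False),
    (False, True,  False, False, False),
]

FEV, PNEU = 1, 4  # column indices of Fever and Pneumonia

# Step 1: select the data points with Pneumonia = T, ignore the rest
selected = [case for case in data if case[PNEU]]

# Step 2: keep only the values of the variable defining the distribution (Fever)
fever_values = [case[FEV] for case in selected]

# Step 3a: ML estimate, exactly as for a biased coin -- relative counts
p_fever_true = sum(fever_values) / len(fever_values)

print("P(Fever=T | Pneumonia=T) =", p_fever_true)        # 0.6
print("P(Fever=F | Pneumonia=T) =", 1 - p_fever_true)    # 0.4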
Learning of BBN parameters. Example.
Learn: P(Fever | Pneumonia = T)

Step 1: Select the data points with Pneumonia = T and ignore the rest. The remaining cases are:
Pal  Fev  Cou  HWB  Pneu
F    F    T    T    T
F    F    T    F    T
F    T    T    T    T
T    T    T    T    T
F    T    F    T    T

Step 2: Select only the values of the random variable defining the distribution (Fever) and ignore the rest:
Fev: F, F, T, T, T

Step 3a: Learn the ML estimate.
P(Fever = T | Pneumonia = T) = 3/5 = 0.6
P(Fever = F | Pneumonia = T) = 2/5 = 0.4

Learning of BBN parameters. Bayesian learning.
Step 3b: Learn the Bayesian estimate.
Assume the prior θ_{Fever | Pneumonia=T} ~ Beta(3, 4).
With the selected data Fev: F, F, T, T, T (3 Trues, 2 Falses), the posterior is
θ_{Fever | Pneumonia=T} ~ Beta(3 + 3, 4 + 2) = Beta(6, 6).

Naïve Bayes model
A special (simple) Bayesian belief network, used as a generative classifier model:
– Class variable Y, with attributes X_1, X_2, ..., X_n as its children
– Attributes are independent given Y:
  p(x | Y=i, Θ) = \prod_{j=1}^{n} p(x_j | Y=i, Θ_{ij})
Learning: ML or Bayesian estimates of the parameters.
Classification: given x we need to determine the class; choose the class with the maximum posterior:
  p(Y=i | x, Θ) = p(Y=i | Θ) p(x | Y=i, Θ) / \sum_{j=1}^{k} p(Y=j | Θ) p(x | Y=j, Θ)

Naïve Bayes with Gaussian distributions
The generative model is p(X, Y) = p(Y) p(X | Y).
Priors on classes: p(Y=1), p(Y=2), p(Y=3), ...
Before (joint class-conditional density for x, model Y → X):
  p(x | μ_j, Σ_j) = \frac{1}{(2π)^{d/2} |Σ_j|^{1/2}} \exp\left[-\frac{1}{2}(x − μ_j)^T Σ_j^{-1} (x − μ_j)\right]
Now (naïve Bayes – independent class-conditional densities, model Y → X_1, ..., X_n):
  p(x_i | μ_{ij}, σ_{ij}) = \frac{1}{\sqrt{2π}\, σ_{ij}} \exp\left[-\frac{(x_i − μ_{ij})^2}{2σ_{ij}^2}\right]
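The Step 3b update is the standard conjugate Beta–Bernoulli update. A minimal sketch, assuming the Beta(3, 4) prior from the slide and the selected Fever values; the variable names are illustrative, not from the lecture.

# Minimal sketch (not from the lecture) of the Bayesian estimate in Step 3b:
# a Beta prior over theta = P(Fever=T | Pneumonia=T), updated with the
# selected Fever values via the conjugate Beta-Bernoulli rule.

fever_values = [False, False, True, True, True]    # Fever for the Pneumonia=T cases

alpha, beta = 3, 4                                  # prior: theta ~ Beta(3, 4)

n_true = sum(fever_values)                          # 3 observed Trues
n_false = len(fever_values) - n_true                # 2 observed Falses

alpha_post = alpha + n_true                         # 6
beta_post = beta + n_false                          # 6

posterior_mean = alpha_post / (alpha_post + beta_post)   # 0.5

print(f"posterior: Beta({alpha_post}, {beta_post}), mean = {posterior_mean}")

Note that the posterior mean (0.5) differs from the ML estimate (0.6): the prior pulls the estimate toward its own mean, 3/7 ≈ 0.43.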
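To make the classification rule concrete, here is a minimal sketch of naïve Bayes classification with Gaussian class-conditional densities. The priors, means, and standard deviations are made-up placeholders (in practice they would come from ML or Bayesian estimation), and all names are illustrative rather than taken from the lecture.

# Minimal sketch (not from the lecture) of naive Bayes classification with
# univariate Gaussian class-conditional densities p(x_i | Y=c).
import numpy as np

priors = np.array([0.6, 0.4])            # p(Y=c) for classes c = 0, 1 (placeholders)
means  = np.array([[0.0, 1.0, -1.0],     # means[c, i] = mean of attribute i under class c
                   [2.0, 0.5,  1.0]])
stds   = np.array([[1.0, 0.5,  2.0],     # stds[c, i] = std. dev. of attribute i under class c
                   [1.5, 1.0,  1.0]])

def log_gaussian(x, mu, sigma):
    """log N(x; mu, sigma^2) for the univariate class-conditional densities."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def posterior(x):
    """p(Y=c | x) for every class c, via Bayes' rule in log space."""
    # log p(Y=c) + sum_i log p(x_i | Y=c): attributes are independent given Y
    log_joint = np.log(priors) + log_gaussian(x, means, stds).sum(axis=1)
    log_joint -= log_joint.max()                 # subtract max for numerical stability
    unnormalized = np.exp(log_joint)
    return unnormalized / unnormalized.sum()     # normalize over the classes

x = np.array([1.0, 0.8, 0.0])                    # an example attribute vector
probs = posterior(x)
print("class posteriors:", probs, "-> predict class", int(np.argmax(probs)))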

