Pitt CS 2750 - Bayesian belief networks



CS 2750 Machine Learning
Lecture 15: Bayesian belief networks. Inference.
Milos Hauskrecht
[email protected]
Sennott Square

Midterm exam
Wednesday, March 17, 2004
• In class
• Closed book
• Covers the material presented before Spring break
• Last year's midterm will be posted on the web

Project proposals
Due: Wednesday, March 24, 2004
• Written proposal, 1-2 pages long:
1. Outline of the learning problem and the type of data you have available. Why is the problem important?
2. Learning methods you plan to try and implement for the problem. References to previous work.
3. How you plan to test and compare the learning approaches.
4. Schedule of work (an approximate timeline).
• A 3-slide PPT presentation summarizing points 1-4

Modeling uncertainty with probabilities
• Full joint distribution: the joint distribution over all random variables defining the domain
– It is sufficient to represent the complete domain and to perform any type of probabilistic inference.
• Problems:
– Space complexity. Storing the full joint distribution requires remembering O(d^n) numbers, where n is the number of random variables and d is the number of values per variable.
– Inference complexity. Computing some queries requires O(d^n) steps.
– Acquisition problem. Who is going to define all of the probability entries?

Pneumonia example. Complexities.
• Space complexity.
– Pneumonia (2 values: T, F), Fever (2: T, F), Cough (2: T, F), WBCcount (3: high, normal, low), Paleness (2: T, F)
– Number of assignments: 2*2*2*3*2 = 48
– We need to define at least 47 probabilities.
• Time complexity.
– Assume we need to compute the probability of Pneumonia = T from the full joint.
– We must sum over the 2*2*3*2 = 24 combinations of the remaining variables:

P(Pneumonia = T) = Σ_{i∈{T,F}} Σ_{j∈{T,F}} Σ_{k∈{high,normal,low}} Σ_{u∈{T,F}} P(Fever = i, Cough = j, WBCcount = k, Pale = u, Pneumonia = T)
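To make the O(d^n) cost concrete, here is a minimal sketch (not part of the lecture) of inference by brute-force summation over an explicit full joint table; the uniform entries are placeholders, since a real model would require all 48 probabilities to be specified.

```python
# Minimal sketch: O(d^n) marginalization from an explicit full joint table.
# The uniform entries (1/48 each) are placeholders for real probabilities.
from itertools import product

VALUES = {
    "Pneumonia": (True, False),
    "Fever":     (True, False),
    "Cough":     (True, False),
    "WBCcount":  ("high", "normal", "low"),
    "Pale":      (True, False),
}
NAMES = list(VALUES)

# Full joint table: one probability per complete assignment (48 entries).
JOINT = {combo: 1.0 / 48 for combo in product(*VALUES.values())}

def marginal(var, val):
    """P(var = val): sum the joint over every assignment consistent with it."""
    idx = NAMES.index(var)
    return sum(p for combo, p in JOINT.items() if combo[idx] == val)

print(marginal("Pneumonia", True))  # 0.5 under the uniform placeholder table
```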
Bayesian belief networks (BBNs)
• Represent the full joint distribution over the variables more compactly, with a smaller number of parameters.
• Take advantage of conditional and marginal independences among random variables:
– A and B are independent: P(A, B) = P(A) P(B)
– A and B are conditionally independent given C: P(A, B | C) = P(A | C) P(B | C), or equivalently P(A | C, B) = P(A | C)

Bayesian belief network.
1. Directed acyclic graph
• Nodes = random variables
• Links = direct dependencies; missing links encode independences
2. Local conditional distributions
• Relate variables and their parents

[Figure: the burglary network. Edges: Burglary → Alarm, Earthquake → Alarm, Alarm → JohnCalls, Alarm → MaryCalls. Local distributions: P(B), P(E), P(A|B,E), P(J|A), P(M|A).]

The local conditional probability tables:

P(B):                P(E):
  B=T    B=F           E=T    E=F
  0.001  0.999         0.002  0.998

P(A|B,E):
  B  E  |  A=T    A=F
  T  T  |  0.95   0.05
  T  F  |  0.94   0.06
  F  T  |  0.29   0.71
  F  F  |  0.001  0.999

P(J|A):                P(M|A):
  A  |  J=T   J=F        A  |  M=T   M=F
  T  |  0.90  0.10       T  |  0.70  0.30
  F  |  0.05  0.95       F  |  0.01  0.99

Bayesian belief networks (general)
Two components: B = (S, Θ_S)
• Directed acyclic graph S
– Nodes correspond to random variables
– (Missing) links encode independences
• Parameters Θ_S
– Local conditional probability distributions, one for every variable-parent configuration: P(X_i | pa(X_i)), where pa(X_i) stands for the parents of X_i (e.g. the table P(A|B,E) above).

Full joint distribution in BBNs
The full joint distribution is defined in terms of the local conditional distributions (obtained via the chain rule):

P(X_1, X_2, ..., X_n) = Π_{i=1..n} P(X_i | pa(X_i))

Example: assume the following assignment of values to the random variables: B=T, E=T, A=T, J=T, M=F. Then its probability is:

P(B=T, E=T, A=T, J=T, M=F) = P(B=T) P(E=T) P(A=T | B=T, E=T) P(J=T | A=T) P(M=F | A=T)
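Plugging in the tables above, a minimal sketch (not part of the lecture) that evaluates this factorization:

```python
# Minimal sketch: evaluating the BBN factorization
#   P(B,E,A,J,M) = P(B) P(E) P(A|B,E) P(J|A) P(M|A)
# with the conditional probability tables from the slides.
P_B = {True: 0.001, False: 0.999}                  # P(B)
P_E = {True: 0.002, False: 0.998}                  # P(E)
P_A = {(True, True): 0.95, (True, False): 0.94,    # P(A=T | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                    # P(J=T | A)
P_M = {True: 0.70, False: 0.01}                    # P(M=T | A)

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) as a product of local conditionals."""
    p_a = P_A[(b, e)] if a else 1.0 - P_A[(b, e)]
    p_j = P_J[a] if j else 1.0 - P_J[a]
    p_m = P_M[a] if m else 1.0 - P_M[a]
    return P_B[b] * P_E[e] * p_a * p_j * p_m

# The example assignment B=T, E=T, A=T, J=T, M=F:
print(joint(True, True, True, True, False))
# 0.001 * 0.002 * 0.95 * 0.90 * 0.30 ≈ 5.13e-07
```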
Bayesian belief networks (BBNs)
• Represent the full joint distribution over the variables more compactly, using the product of local conditionals.
• But how did we get to the local parameterizations?
Answer:
• The graphical structure encodes the conditional and marginal independences among the random variables (in the sense defined above).
• The graph structure implies the decomposition!

Independences in BBNs
3 basic independence structures:
1. Burglary → Alarm → JohnCalls (serial connection)
2. Burglary → Alarm ← Earthquake (converging connection, common effect)
3. JohnCalls ← Alarm → MaryCalls (diverging connection, common cause)

Full joint distribution in BBNs
Rewrite the full joint probability using the product rule, applying the independences encoded in the graph at each step:

P(B=T, E=T, A=T, J=T, M=F)
= P(J=T | B=T, E=T, A=T, M=F) P(B=T, E=T, A=T, M=F)
= P(J=T | A=T) P(B=T, E=T, A=T, M=F)                           (J depends only on its parent A)
= P(J=T | A=T) P(M=F | B=T, E=T, A=T) P(B=T, E=T, A=T)
= P(J=T | A=T) P(M=F | A=T) P(B=T, E=T, A=T)                   (M depends only on its parent A)
= P(J=T | A=T) P(M=F | A=T) P(A=T | B=T, E=T) P(B=T, E=T)
= P(J=T | A=T) P(M=F | A=T) P(A=T | B=T, E=T) P(B=T) P(E=T)    (B and E are marginally independent)

Parameter complexity problem
• In the BBN the full joint distribution is expressed as a product of conditionals of (much) smaller complexity.
• Parameters:
– full joint: ?
– BBN: ?
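Working out the counts the slide asks for (all five variables in the burglary network are binary):
• Full joint: 2^5 = 32 entries, i.e. 2^5 - 1 = 31 free parameters (the entries must sum to 1).
• BBN: 1 for P(B) + 1 for P(E) + 4 for P(A|B,E) (one per parent configuration) + 2 for P(J|A) + 2 for P(M|A) = 10 free parameters.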

