Probabilistic Graphical Models, Spring 2007
Homework 1
Due at the beginning of class on 10/08/07

Instructions

There are six questions in this homework. The last question involves programming, which should be done in MATLAB. Do not attach your code to the writeup. Make a tarball of your code and output, name it <userid>.tgz where your userid is your CS or Andrew id, and copy it to /afs/cs/academic/class/10708-f07/hw1/<userid>. If you are not submitting this from an SCS machine, you might need to authenticate yourself first. See http://www.cs.cmu.edu/~help/afs/cross realm.html for instructions. If you are not a CMU student and don't have a CS or Andrew id, email your submission to [email protected].

You are allowed to use any of (and only) the material distributed in class for the homeworks. This includes the slides and the handouts given in class [1]. Refer to the web page for policies regarding collaboration, due dates, and extensions.

1 [6 pts] Representation

Consider N+1 binary random variables X1, ..., XN, Y that have the following conditional independencies:

    ∀i, j:  Xi ⊥ Xj | Y

1. Suppose you wish to store the joint probability distribution of these N+1 variables as a single table. How many parameters will you need to represent this table?

2. Draw a graphical model over these N+1 variables that encodes the conditional independencies above. How many parameters will you need to completely describe the distribution using this representation?

2 [10 pts] Gaussians

1. Show that for any joint density function P(X, Y), if we have X ⊥ Y, then Cov(X, Y) = 0.

2. Show that if P(X, Y) is a Gaussian distribution and Cov(X, Y) = 0, then X ⊥ Y.

3 [15 pts] Undirected Graphical Models

A BN G is said to preserve the dependencies of a Markov network H if whenever G implies X ⊥ Y | Z, H also implies it.
Provide an example of a Markov network which, when transformed to a dependency-preserving Bayesian network, grows exponentially, i.e., the size of the largest clique in H is constant, yet any BN I-map is exponentially large [2] in the number of nodes n.

[1] Please contact Monica Hopes (meh@cs) if you need a copy of a handout.
[2] It will therefore suffice to show an example H where any BN that is an I-map of H has at least c·n^k parameters for any real constants c > 1, k > 0.

Figure 1: Bayesian network for Problem 4

4 [15 pts] Elimination

Consider the Bayesian network given in Fig 1.

1. For each of the following assertions of (conditional) independence, state if they are True or False, with justification.

   (a) X7 ⊥ X9
   (b) X10 ⊥ X7 | X4
   (c) X4 ⊥ X8 | X1
   (d) X4 ⊥ X3 | X1
   (e) X1 ⊥ X5 | X3
   (f) X1 ⊥ X5 | X6
   (g) X9 ⊥ X7 | X6

2. For the elimination order X7, X10, X4, X9, X5, X8, X1, X2, X3, draw the moralized graph and the fill-in edges created in the process of elimination.

3. For the elimination order X6, X9, X4, X8, X7, X10, X1, X2, X5, draw the resulting junction tree.

Figure 2: Bayesian network for Problem 5

5 [20 pts] Arc Reversal

1. Consider the Bayesian network in Fig 2. Suppose we wish to reverse arc B → A. What additional modifications to the structure of the network are needed to ensure that the new network is an I-map of the original distribution? Your network should not reverse any additional edges, and should only differ minimally from the original in terms of the number of edge additions or deletions. Justify your response.

2. Now, consider a Bayesian network G. Assume that X → Y is the only directed trail from X to Y. Define a general procedure to reverse the arc X → Y, i.e., for constructing a graph G' that is an I-map for the original distribution but that contains an arc Y → X and otherwise differs minimally from G in the same sense as above. Once again, you must provide a justification for your response.

3.
Suppose that we use the above procedure to transform G into G' and then reverse the arc back to its original direction by repeating the above transformation, obtaining graph G''. Is G = G'' always?

6 [30 pts] Belief Propagation

Cecilia and Slackernerney, two graduate students in Markovford, have gotten into an argument over the weather. Cecilia thinks summer has ended and that we are into Autumn (Fall), while Slackernerney (who has been spending all his time in a room without windows) thinks that this is summer. Since they weren't able to resolve it themselves, they have decided to ask the fine students at CMU to help them decide.

A bit about Markovford: there are three seasons in Markovford: Summer (S), Autumn (A) and Winter (W). Given the season the previous day, the season on a day is conditionally independent of the season on all previous days. The weather is either Hot (H), Rainy (R) or Freezing (F). Given the season on any given day, the weather that day is independent of all other variables.

More formally, if we let Ci denote the season on the i-th day (taking values S, A, W) and Oi denote the observed weather pattern (one of H, R, F), we have ∀j < i − 1, Ci ⊥ Cj | Ci−1 and ∀X, Oi ⊥ X | Ci, where X is any random variable other than Ci, Oi.

1. Draw a graphical model over C1, ..., CN, O1, ..., ON that satisfies the conditional independencies listed above.

2. Implement the sum-product and max-product algorithms in MATLAB for this graphical model. In order to fully specify the graphical model, you will need to fill in the values for the CPTs. Some of these are listed in params.txt. Assume a uniform distribution for any additional CPTs that you think you need to fully specify your graphical model (Hint: You will need one additional CPT).

3. Cecilia has made 20 observations of the weather over the last few months (i.e., O1, ..., ON).
This data is in weather.txt.

   (a) Apply inference to compute the most likely season for each of these days, and submit the output in a text file called hw1-a.txt.
   (b) Apply inference to determine the most likely sequence C1, ..., CN that generated this observed sequence. Submit your output in a text file called hw1-b.txt.
   (c) Cecilia made the last observation today. Who do you think was correct? What is the probability that it is currently winter in Markovford?

Submit your code and a readme describing briefly what each file does, along with the output listed above.
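For intuition on the two inference tasks in part 3: part (a) asks for per-day marginals (sum-product / forward-backward), while part (b) asks for the single best joint assignment (max-product / Viterbi). The sketch below illustrates both on this season-weather chain. It is in Python rather than the required MATLAB, and the transition and emission CPTs are made up for illustration (the real values are in params.txt, which is not reproduced here); only the uniform prior on C1 follows the hint in part 2. It is an illustration of the algorithms, not the homework solution.

```python
import numpy as np

SEASONS = ["S", "A", "W"]   # Summer, Autumn, Winter
WEATHER = ["H", "R", "F"]   # Hot, Rainy, Freezing

# Hypothetical CPTs -- the real values live in params.txt.
# T[i, j] = P(C_t = j | C_{t-1} = i);  E[i, k] = P(O_t = k | C_t = i)
T = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.70, 0.20],
              [0.10, 0.20, 0.70]])
E = np.array([[0.70, 0.25, 0.05],
              [0.20, 0.50, 0.30],
              [0.05, 0.25, 0.70]])
prior = np.full(3, 1.0 / 3.0)   # uniform P(C_1), per the hint

def sum_product(obs):
    """Forward-backward: return gamma[t, i] = P(C_t = i | O_1..N)."""
    n = len(obs)
    alpha = np.zeros((n, 3))
    beta = np.ones((n, 3))
    alpha[0] = prior * E[:, obs[0]]
    alpha[0] /= alpha[0].sum()               # normalize each step to avoid underflow
    for t in range(1, n):
        alpha[t] = (alpha[t - 1] @ T) * E[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    for t in range(n - 2, -1, -1):
        beta[t] = T @ (E[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

def max_product(obs):
    """Viterbi in log space: return the most likely state sequence."""
    n = len(obs)
    logT, logE = np.log(T), np.log(E)
    delta = np.zeros((n, 3))
    back = np.zeros((n, 3), dtype=int)
    delta[0] = np.log(prior) + logE[:, obs[0]]
    for t in range(1, n):
        scores = delta[t - 1][:, None] + logT   # scores[i, j]: come from i, land in j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logE[:, obs[t]]
    path = [int(delta[-1].argmax())]            # best final state, then trace back
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

obs = [WEATHER.index(c) for c in "HHRRFF"]
gamma = sum_product(obs)
print("per-day best:", [SEASONS[i] for i in gamma.argmax(axis=1)])
print("best sequence:", [SEASONS[i] for i in max_product(obs)])
```

Note that the two answers need not coincide: the sequence of per-day marginal maximizers from sum-product can differ from the single most probable sequence found by max-product, which is why parts (a) and (b) are separate questions.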