CS188 Fall 2011 Section 8: Inference and Sampling

1 Sampling and DBNs

The diagram to the right describes a person's ice-cream eating habits based on the weather. The nodes W_i stand for the weather on day i, which can be either rainy (R) or sunny (S). The nodes I_i represent whether or not the person ate ice cream on day i, taking values T (for truly eating ice cream) or F. The conditional probability distributions relevant to the graphical model are also given to you.

Suppose we sample from the prior to produce the following samples of (W_1, I_1, W_2, I_2) from the ice-cream model:

(R, F, R, F)  (R, F, R, F)  (S, F, S, T)  (S, T, S, T)  (S, T, R, F)
(R, F, R, T)  (S, T, S, T)  (S, T, S, T)  (S, T, R, F)  (R, F, S, T)

(a) What is P̂(W_2 = R), the probability sampling assigns to the event W_2 = R?

(b) Cross off the samples rejected by rejection sampling if we're computing P(W_2 | I_1 = T, I_2 = F).

Rejection sampling seems to be wasting a lot of effort, so we decide to switch to likelihood weighting. Assume we generate the following six samples given the evidence I_1 = T and I_2 = F:

(W_1, I_1, W_2, I_2) = { ⟨S, T, R, F⟩, ⟨R, T, R, F⟩, ⟨S, T, R, F⟩, ⟨S, T, S, F⟩, ⟨S, T, S, F⟩, ⟨R, T, S, F⟩ }

(c) What is the weight of the first sample ⟨S, T, R, F⟩ above?

(d) Use likelihood weighting to estimate P̂(W_2 | I_1 = T, I_2 = F).

2 Basic Inference

The next parts involve computing various quantities in the networks below. These questions are designed so that they can be answered with a minimum of computation. If you find yourself doing a copious amount of computation for each part, step back and consider whether there is a simpler way to deduce the answer.

Consider the simple Bayes Net below.

(a) P(+b)

(b) P(+a | +b)

Now consider the more complicated Bayes Net shown below.

(c) P(+a, ¬b, +c, ¬d)

(d) P(+d | +a)

(e) P(+d | +a,
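The sampling computations in Section 1 can be sketched in code. The following is a minimal sketch, not the handout's solution: parts (a) and (b) follow directly from the ten prior samples given above, but the conditional probability tables are not reproduced in this excerpt, so the P(I = T | W) values used for the likelihood-weighting step are hypothetical placeholders chosen only to illustrate the mechanics.

```python
# Ten prior samples of (W1, I1, W2, I2) from the handout.
prior_samples = [
    ("R", "F", "R", "F"), ("R", "F", "R", "F"), ("S", "F", "S", "T"),
    ("S", "T", "S", "T"), ("S", "T", "R", "F"), ("R", "F", "R", "T"),
    ("S", "T", "S", "T"), ("S", "T", "S", "T"), ("S", "T", "R", "F"),
    ("R", "F", "S", "T"),
]

# (a) Prior-sampling estimate of P(W2 = R): the fraction of samples with W2 = R.
p_w2_rainy = sum(1 for s in prior_samples if s[2] == "R") / len(prior_samples)

# (b) Rejection sampling for P(W2 | I1 = T, I2 = F):
# keep only the samples consistent with the evidence, discard the rest.
kept = [s for s in prior_samples if s[1] == "T" and s[3] == "F"]

# (c)/(d) Likelihood weighting. The CPT P(I | W) is NOT given in this excerpt;
# these numbers are hypothetical placeholders, not the handout's values.
p_ice_given_w = {"S": 0.9, "R": 0.2}  # assumed P(I = T | W)

# The six likelihood-weighting samples drawn with the evidence fixed.
lw_samples = [
    ("S", "T", "R", "F"), ("R", "T", "R", "F"), ("S", "T", "R", "F"),
    ("S", "T", "S", "F"), ("S", "T", "S", "F"), ("R", "T", "S", "F"),
]

def weight(sample):
    """Weight = product of evidence likelihoods P(I1=T | W1) * P(I2=F | W2)."""
    w1, _, w2, _ = sample
    return p_ice_given_w[w1] * (1.0 - p_ice_given_w[w2])

weights = [weight(s) for s in lw_samples]

# (d) Weighted estimate of P(W2 = R | I1 = T, I2 = F):
# sum of weights of samples with W2 = R, normalized by the total weight.
p_w2_rainy_lw = sum(w for s, w in zip(lw_samples, weights)
                    if s[2] == "R") / sum(weights)

print(p_w2_rainy)   # 0.5 with these ten samples
print(len(kept))    # 2 samples survive rejection
print(p_w2_rainy_lw)
```

Note how likelihood weighting keeps all six samples and compensates by down-weighting those whose weather values make the fixed evidence unlikely, whereas rejection sampling above discards eight of the ten prior samples outright.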