
# Berkeley CS 188 - Exam Prep 10 Solutions


*CS 188, Fall 2019*

## Q1. Planning ahead with HMMs

Pacman is tired of using HMMs to estimate the location of ghosts. He wants to use HMMs to plan what actions to take in order to maximize his utility. Pacman uses the HMM of length $T$ (pictured below) to model the planning problem. In the HMM, $X_{1:T}$ is the sequence of hidden states of Pacman's world, $A_{1:T}$ are actions Pacman can take, and $U_t$ is the utility Pacman receives at the particular hidden state $X_t$. Notice that there are no evidence variables, and utilities are not discounted.

*(Figure: a chain $\cdots \to X_{t-1} \to X_t \to X_{t+1} \to \cdots$, where each state $X_t$ has an incoming action $A_t$ and an attached utility $U_t$.)*

**(a)** The belief at time $t$ is defined as $B_t(X_t) = p(X_t \mid a_{1:t})$. The forward algorithm update has the following form:

$$B_t(X_t) = \text{(i)}\ \text{(ii)}\ B_{t-1}(x_{t-1}).$$

Complete the expression by choosing the option that fills in each blank.

- (i): ○ $\max_{x_{t-1}}$ ● $\sum_{x_{t-1}}$ ○ $\max_{x_t}$ ○ $\sum_{x_t}$ ○ $1$
- (ii): ○ $p(X_t \mid x_{t-1})$ ○ $p(X_t \mid x_{t-1})\,p(X_t \mid a_t)$ ○ $p(X_t)$ ● $p(X_t \mid x_{t-1}, a_t)$ ○ $1$
- ○ None of the above combinations is correct

$$B_t(X_t) = p(X_t \mid a_{1:t}) = \sum_{x_{t-1}} p(X_t \mid x_{t-1}, a_t)\, p(x_{t-1} \mid a_{1:t-1}) = \sum_{x_{t-1}} p(X_t \mid x_{t-1}, a_t)\, B_{t-1}(x_{t-1})$$

**(b)** Pacman would like to take actions $A_{1:T}$ that maximize the expected sum of utilities, which has the following form:

$$\mathrm{MEU}_{1:T} = \text{(i)}\ \text{(ii)}\ \text{(iii)}\ \text{(iv)}\ \text{(v)}$$

Complete the expression by choosing the option that fills in each blank.

- (i): ● $\max_{a_{1:T}}$ ○ $\max_{a_T}$ ○ $\sum_{a_{1:T}}$ ○ $\sum_{a_T}$ ○ $1$
- (ii): ○ $\max_t$ ○ $\prod_{t=1}^T$ ● $\sum_{t=1}^T$ ○ $\min_t$ ○ $1$
- (iii): ○ $\sum_{x_t, a_t}$ ● $\sum_{x_t}$ ○ $\sum_{a_t}$ ○ $\sum_{x_T}$ ○ $1$
- (iv): ○ $p(x_t \mid x_{t-1}, a_t)$ ○ $p(x_t)$ ● $B_t(x_t)$ ○ $B_T(x_T)$ ○ $1$
- (v): ○ $U_T$ ● $U_t$ ○ $1$
- ○ None of the above combinations is correct

$$\mathrm{MEU}_{1:T} = \max_{a_{1:T}} \sum_{t=1}^{T} \sum_{x_t} B_t(x_t)\, U_t(x_t)$$

**(c)** A greedy ghost now offers to tell Pacman the values of some of the hidden states. Pacman needs your help to figure out if the ghost's information is useful. Assume that the transition function $p(x_t \mid x_{t-1}, a_t)$ is not deterministic. With respect to the utility $U_t$, mark all that can be true:

☑ $\mathrm{VPI}(X_{t-1} \mid X_{t-2}) > 0$ ☐ $\mathrm{VPI}(X_{t-2} \mid X_{t-1}) > 0$ ☑ $\mathrm{VPI}(X_{t-1} \mid X_{t-2}) = 0$ ☑ $\mathrm{VPI}(X_{t-2} \mid X_{t-1}) = 0$ ☐ None of the above

It is always possible that $\mathrm{VPI} = 0$.
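To make the VPI claims concrete, here is a minimal numeric sketch: one observed state, one free action, and invented transition probabilities (none of these numbers come from the problem). It compares the MEU when the intermediate state is revealed before acting against the MEU when it is not.

```python
# Hypothetical two-state sketch: prior over X1, one action a, successor X2,
# and utility U(X2). All numbers are made up for illustration.
prior = [0.5, 0.5]
U = [0.0, 1.0]
# trans[a][x1][x2] = p(X2 = x2 | X1 = x1, action a)
trans = [
    [[0.1, 0.9], [0.9, 0.1]],   # action 0
    [[0.9, 0.1], [0.1, 0.9]],   # action 1
]

def eu(a, x1):
    """Expected utility of taking action a from known state x1."""
    return sum(trans[a][x1][x2] * U[x2] for x2 in range(2))

# MEU when X1 is observed before acting: pick the best action for each x1.
meu_with_info = sum(prior[x1] * max(eu(a, x1) for a in range(2)) for x1 in range(2))
# MEU without the observation: one action must serve the whole prior.
meu_without = max(sum(prior[x1] * eu(a, x1) for x1 in range(2)) for a in range(2))

vpi = meu_with_info - meu_without
print(round(vpi, 6))   # → 0.4  (knowing X1 changes the best action, so VPI > 0)
```

Replacing `U` with a constant utility such as `[1.0, 1.0]` makes the two MEUs equal, giving VPI = 0; this matches the observation that VPI = 0 is always possible.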
We can guarantee that $\mathrm{VPI}(E \mid e)$ is not greater than zero (hence, since VPI is never negative, exactly zero) if $E$ is independent of $\mathrm{parents}(U)$ given $e$. In particular, given $X_{t-1}$, the earlier state $X_{t-2}$ is independent of $X_t$, so $\mathrm{VPI}(X_{t-2} \mid X_{t-1}) = 0$ always holds.

**(d)** Pacman notices that calculating the beliefs under this model is very slow using exact inference. He therefore decides to try out various particle filter methods to speed up inference. Order the following methods by how accurate their estimate of $B_T(X_T)$ is, from 1 (most accurate) to 4 (least accurate); if different methods give an equivalently accurate estimate, mark them with the same number.

| Method | Rank |
| --- | --- |
| Exact inference | 1 |
| Particle filtering with no resampling | 2 |
| Particle filtering with resampling before every time elapse | 4 |
| Particle filtering with resampling before every other time elapse | 3 |

Exact inference will always be more accurate than using a particle filter. When comparing the particle-filter resampling approaches, notice that because there are no observations, each particle has weight 1. Resampling particles that all have weight 1 can only lose particle diversity (distinct states drop out of the population), so the more often resampling happens, the less accurate the estimate.

## Q2. Naive Bayes: Pacman or Ghost?

You are standing by an exit as either Pacmen or ghosts come out of it. Every time someone comes out, you get two observations: a visual one and an auditory one, denoted by the random variables $X_v$ and $X_a$, respectively. The visual observation informs you that the individual is either a Pacman ($X_v = 1$) or a ghost ($X_v = 0$). The auditory observation $X_a$ is defined analogously. Your observations are a noisy measurement of the individual's true type, which is denoted by $Y$. After the individual comes out, you find out what they really are: either a Pacman ($Y = 1$) or a ghost ($Y = 0$). You have logged your observations and the true types of the first 20 individuals:

| individual $i$ | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| first observation $X_v^{(i)}$ | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 |
| second observation $X_a^{(i)}$ | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| individual's type $Y^{(i)}$ | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 |

The superscript $(i)$ denotes that the datum is the $i$th one.
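As a cross-check, the logged data can be transcribed into Python lists and the counts used by the maximum-likelihood estimates in part (a) computed directly (a direct transcription of the table above):

```python
# Direct transcription of the table of 20 logged individuals.
Xv = [0, 0, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 0, 0]
Xa = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0]
Y  = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0]

n_pacman = sum(Y)                                  # individuals with Y = 1
agree_v = sum(xv == y for xv, y in zip(Xv, Y))     # visual obs. matches the type
agree_a = sum(xa == y for xa, y in zip(Xa, Y))     # auditory obs. matches the type
print(n_pacman, agree_v, agree_a)   # → 10 16 12
```

These counts are exactly what the estimates in part (a)(i) use: $q = 10/20 = 1/2$, $p_v = 16/20 = 4/5$, and $p_a = 12/20 = 3/5$.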
Now, the individual with $i = 20$ comes out, and you want to predict the individual's type $Y^{(20)}$ given that you observed $X_v^{(20)} = 1$ and $X_a^{(20)} = 1$.

**(a)** Assume that the types are independent, and that the observations are independent conditioned on the type. You can model this using naïve Bayes, with $X_v^{(i)}$ and $X_a^{(i)}$ as the features and $Y^{(i)}$ as the labels. Assume the probability distributions take on the following form:

$$P(X_v^{(i)} = x_v \mid Y^{(i)} = y) = \begin{cases} p_v & \text{if } x_v = y \\ 1 - p_v & \text{if } x_v \neq y \end{cases}$$

$$P(X_a^{(i)} = x_a \mid Y^{(i)} = y) = \begin{cases} p_a & \text{if } x_a = y \\ 1 - p_a & \text{if } x_a \neq y \end{cases}$$

$$P(Y^{(i)} = 1) = q$$

for $p_v, p_a, q \in [0, 1]$ and $i \in \mathbb{N}$.

*(Model: $Y^{(i)}$ is the parent of the two features $X_v^{(i)}$ and $X_a^{(i)}$.)*

**(i)** What is the maximum likelihood estimate of $p_v$, $p_a$ and $q$?

$$p_v = \frac{4}{5}, \qquad p_a = \frac{3}{5}, \qquad q = \frac{1}{2}$$

To estimate $q$, we count 10 individuals with $Y = 1$ and 10 with $Y = 0$ in the data, so $q = 1/2$. For $p_v$, we have $X_v = 1$ in $8/10$ of the cases where $Y = 1$, and $X_v = 1$ in $2/10$ of the cases where $Y = 0$ (so $1 - p_v = 2/10$); both agree on $p_v = 4/5$. For $p_a$, we have $X_a = 1$ in $2/10$ of the cases where $Y = 1$, and $X_a = 1$ in $0/10$ of the cases where $Y = 0$ (so $1 - p_a = 0/10$, which alone would suggest $p_a = 1$). Pooling the two equally sized groups, $X_a$ agrees with $Y$ in $2 + 10 = 12$ of the 20 cases, so $p_a = 12/20 = 3/5$, the average of $2/10$ and $1$.

**(ii)** What is the probability that the next individual is a Pacman given your observations? Express your answer in terms of the parameters $p_v$, $p_a$ and $q$ (you might not need all of them).

$$P(Y^{(20)} = 1 \mid X_v^{(20)} = 1, X_a^{(20)} = 1) = \frac{p_v\, p_a\, q}{p_v\, p_a\, q + (1 - p_v)(1 - p_a)(1 - q)}$$

The joint distribution is $P(Y = 1, X_v = 1, X_a = 1) = p_v\, p_a\, q$. For the denominator, we need to sum out over $Y$, that is, we need $P(Y = 1, X_v = 1, X_a = 1) + P(Y = 0, X_v = 1, X_a = 1)$.

Now, assume that you are given additional information: you are told that the individuals are actually coming out of a bus that just arrived, and each bus carries exactly 9 individuals. Unlike before, the types of every 9 consecutive individuals are conditionally independent given the bus type, which is denoted by $Z$. Only after all 9 individuals have walked out do you find out the bus type: one that carries mostly Pacmen ($Z = 1$) or one that carries mostly ghosts ($Z = 0$).
Thus, you only know the bus type in which the first 18 individuals came in. The data are as in the table above, together with the bus assignments: bus $j = 0$ (individuals $i = 0, \dots, 8$) had type $Z^{(0)} = 0$, and bus $j = 1$ (individuals $i = 9, \dots, 17$) had type $Z^{(1)} = 1$.

**(b)** You can model this using a variant of naïve Bayes, where now 9 consecutive labels $Y^{(i)}, \dots, Y^{(i+8)}$ are conditionally independent given the bus type $Z^{(j)}$, for bus $j$ and individual $i = 9j$. Assume the probability distributions take on the following form:

$P(X_v^{(i)} =$
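Putting the maximum-likelihood estimates from (a)(i) into the posterior from (a)(ii) gives a concrete prediction for individual 20; a quick check using exact rational arithmetic:

```python
from fractions import Fraction as F

pv, pa, q = F(4, 5), F(3, 5), F(1, 2)        # MLE parameters from part (a)(i)

# Posterior from part (a)(ii), evaluated at Xv = Xa = 1
num = pv * pa * q                            # P(Y = 1, Xv = 1, Xa = 1)
den = num + (1 - pv) * (1 - pa) * (1 - q)    # + P(Y = 0, Xv = 1, Xa = 1)
posterior = num / den
print(posterior)   # → 6/7
```

So under the part (a) model, individual 20 is a Pacman with probability $6/7 \approx 0.86$.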
