15-381 Artificial Intelligence: Representation and Problem Solving
Homework 2
Out: 09/18/08
Due: 10/02/08

Instructions

This assignment is due on Thursday, October 2, 2008. The written portion and societal question must be turned in at the beginning of class at noon on October 2. Type or write legibly; illegible submissions will not receive credit. Write your name and Andrew ID clearly at the top of the assignment. The programming portion of this assignment must be electronically submitted by noon on October 2. The submission instructions are described in the Programming section. You are to complete this assignment individually. However, you are encouraged to discuss the general algorithms and ideas in the class in order to help each other answer homework questions. You are also welcome to give each other examples that are not in the assignment in order to demonstrate how to solve problems. But we require you to:

• not explicitly tell each other answers;
• not copy answers;
• not allow your answers to be copied.

In those cases where you work with one or more other people on the general discussion of the assignment and surrounding topics, we ask that you specifically record on the assignment the names of the people you were in discussion with (or "none" if you did not talk to anyone else). This will help resolve the situation where a mistake in general discussion led to a replicated error among multiple solutions. This policy has been established in order to be fair to everyone in the class. We have a grading policy of watching for cheating, and we will follow up if it is detected.

Instructions for turning in the programming component: The last question involves programming, which should be done in MATLAB. Do not attach your code to the writeup. Make a tarball of your code and output, name it <userid>.tgz where your userid is your Andrew ID, and copy it to /afs/andrew/course/15/381-f08/submit/hw2/<userid>/.
Refer to the web page for policies regarding collaboration, due dates, and extensions.

1 [10 pts] Probability miscellany

Calvin wants to choose between his two pet activities: playing with his pet tiger Hobbes in the garden, and tormenting his mom in the kitchen. He wants to choose uniformly at random between the two activities and decides to toss a coin to decide. Since he is a kid and is broke, he gets a coin from his mom to do this. However, he isn't sure if the coin is unbiased. Can you help him "simulate" a fair coin toss by using this coin? Explain how you would do this, and show that your procedure is indeed unbiased. While you don't know the bias of the coin, you may assume that the bias remains the same at all times. Some clarifications on terminology: a coin is considered "fair" or unbiased if p, the probability of heads, equals 0.5. The question asks you to describe a procedure toss_unbiased_coin() that returns "Heads" or "Tails" with probability 0.5. The only source of randomness that the procedure has access to is a procedure toss_biased_coin() that returns "Heads" with some unknown probability p. Note: You aren't allowed to use any source of randomness other than the coin itself, so "call rand in MATLAB" is not a valid answer. :)

[Figure 1: Bayesian network for Problem 3]

2 [10 pts] Representation

Suppose we have a boolean variable X. To completely describe the distribution P(X), we need to specify one value: P(X=0) (since P(X=1) is simply 1 - P(X=0)). Thus, we say this distribution can be characterized with one parameter. Now, consider N+1 binary random variables X1, ..., XN, Y that factorize according to Fig. 1.

1. Suppose you wish to store the joint probability distribution of these N+1 variables as a single table. How many parameters will you need to represent this table?

2. Now, suppose you were to utilize the fact that the joint distribution factorizes according to the Bayes Network.
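As a concrete sanity check on the table-counting question above, one can enumerate a full joint table for a small case. The sketch below is purely illustrative (Python rather than the course's MATLAB, with invented probability values): it builds a joint over three binary variables and counts its free parameters, using the fact that the entries must sum to 1, so one entry is determined by the rest.

```python
from itertools import product
import random

random.seed(1)

# Toy full joint over 3 binary variables (values invented for illustration).
n_vars = 3
weights = {assignment: random.random()
           for assignment in product((0, 1), repeat=n_vars)}
z = sum(weights.values())
joint = {a: w / z for a, w in weights.items()}  # normalize so entries sum to 1

table_entries = len(joint)         # 2^3 = 8 rows in the full table
free_params = table_entries - 1    # sum-to-one constraint fixes the last entry
print(table_entries, free_params)  # 8 7
```

The same enumeration works for any number of variables; only the exponent changes.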
How many parameters will you need to completely describe the distribution if you use the Bayesian Network representation? In other words, how many parameters will you need to fully specify the values of all the conditional probability tables in this Bayesian Network?

3 [15 pts] Number of BNs

What is the maximum number of edges in a Bayesian network (BN) with n nodes? Prove that a valid BN containing this number of edges can be constructed (remember that the structure of a BN has to be a Directed Acyclic Graph).

4 [15 pts] Conditional Independencies in Bayes Nets

[Figure 2: Parts of the alarm network; panels (i), (ii), (iii)]

The Bayesian networks in Fig. 2 are all part of the alarm network introduced in class and in Russell and Norvig. We use the notation X⊥Y to denote the variable X being independent of Y, and X⊥Y|Z to denote X being independent of Y given Z.

1. For each of these three networks, write the factored joint distribution implied, in the form of p(X, Y) = p(X)p(Y|X).

2. Using the joint distribution you wrote down for Fig. 2(i), write down a formula for P(B, E).

3. Now prove that B⊥E.

4. Similarly, prove that B⊥M|A in the Bayesian network of Fig. 2(ii), and M⊥J|A in the Bayesian network of Fig. 2(iii).

5 [15 pts] Elimination

[Figure 3: Bayesian networks for Problem 6; panels (i), (ii)]

Consider the Bayesian network given in Fig. 3(i). Assume that each of the variables is boolean valued. For each of the following, state the total number of operations (multiplications and additions) the variable elimination algorithm will take to compute the answer. For example, if A and B are binary variables, Σ_a P(B)P(a|B) will take two multiplications (P(B)·P(A|B) and P(B)·P(¬A|B)) and one addition (adding those two terms). Assume that the algorithm avoids unnecessary computation, so any summations that are irrelevant to the query will be avoided (see the textbook for an example of an irrelevant summation).

1. P(X3|X4), i.e., the probability that X3 is true given that X4 is true.
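The operation-counting convention in the worked example above (two multiplications and one addition for Σ_a P(B)P(a|B)) can be checked mechanically. The Python sketch below uses made-up values for P(B) and P(A|B), purely for illustration; note that after A is summed out, the result collapses back to P(B).

```python
# Worked version of the example in Problem 5: summing binary A out of
# P(B) * P(A|B) costs 2 multiplications and 1 addition.
# The probability values below are invented purely for illustration.
p_b = 0.6                                # P(B = true)
p_a_given_b = {True: 0.9, False: 0.1}    # P(A = a | B = true)

mults = adds = 0
terms = []
for a in (True, False):
    terms.append(p_b * p_a_given_b[a])   # one multiplication per value of A
    mults += 1
total = terms[0] + terms[1]              # one addition to combine the terms
adds += 1

print(mults, adds)      # 2 1
print(total)            # equals P(B), since the P(a|B) terms sum to 1
```

Counting in this style per eliminated variable, and summing over the elimination order, yields the totals the problem asks for.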
Assume that the variables are eliminated in the order: X5, X1, X2.

2. P(X3|X4), but this time with the elimination ordering X2, X1, X5.

3. Now suppose that you had to answer the last two questions, this time with the Bayes Net given in Fig. 3(ii). Would your answers change? (You don't have to state the number of operations; just whether they will be different from previously, with a brief explanation.)

6 [35 pts] Inference by enumeration

1. Implement exact inference by enumeration. Write a function that calculates the conditional probability distribution of one query variable given a set of evidence variables in a Bayesian network. See Algorithm 1 for the
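For orientation only (the graded implementation must be in MATLAB and follow the assignment's Algorithm 1, which is not reproduced here), this is a minimal Python sketch of inference by enumeration on a hypothetical two-node network B → A. The structure, variable names, and CPT values are all invented for illustration: sum the factored joint over the hidden variables, then normalize.

```python
# Toy network B -> A with made-up CPTs (not from the assignment).
p_b = {True: 0.1, False: 0.9}                 # P(B)
p_a_given_b = {True: {True: 0.8, False: 0.2},
               False: {True: 0.3, False: 0.7}}  # P(A | B)

def joint(b, a):
    """P(B=b, A=a) from the factored form P(B) * P(A|B)."""
    return p_b[b] * p_a_given_b[b][a]

def query_b_given_a(a_obs):
    """P(B | A=a_obs) by enumeration: sum the joint, then normalize."""
    unnorm = {b: joint(b, a_obs) for b in (True, False)}
    z = sum(unnorm.values())
    return {b: v / z for b, v in unnorm.items()}

dist = query_b_given_a(True)
print(round(dist[True], 4))   # P(B=true | A=true) -> 0.2286
```

A full implementation generalizes this in two ways: the query and evidence variables are parameters rather than fixed, and the sum runs over all assignments to every hidden (non-query, non-evidence) variable.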

