UW-Madison COMPSCI 540 - Final Exam

CS 540: Introduction to Artificial Intelligence
Final Exam: 2:45-4:45pm, May 10, 1999
Room 168 Noland
CLOSED BOOK (two sheets of notes and a calculator allowed)

Write your answers on these pages and show your work. If you feel that a question is not fully specified, state any assumptions you need to make in order to solve the problem. You may use the backs of these sheets for scratch work.

Write your name on this and all other pages of this exam. Make sure your exam contains eight problems on nine pages.

Name ________________________________________________________________
Student ID ________________________________________________________________

Problem   Score    Max Score
1         ______   15
2         ______   13
3         ______   17
4         ______   16
5         ______   10
6         ______   12
7         ______   10
8         ______   7
TOTAL     ______   100

Problem 1 – First-Order Predicate Calculus (15 points)

Convert each of the following English sentences into First-Order Predicate Calculus, using reasonably named predicates, functions, and constants. If you feel a sentence is ambiguous, clarify which meaning you're representing in logic.

Someone is taller than Mary.

Red blocks weigh more than blue ones.

Except for BillG, for every person there exists someone richer.

If you give something of yours to someone else, they own it. [You must use situation calculus here.]

Borrowing a book doesn't change who owns it. [You must use situation calculus here.]

Problem 2 – Production Systems (13 points)

Consider the production-system rules listed below. Variables are indicated by a leading '?' and all other terms are constants. The conflict-resolution strategy is to choose the rule with the most preconditions, with the extra constraint that no rule can be used more than once with the same variable bindings. The predicate '≤' is procedurally defined, and this production system uses negation-by-failure.
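The conflict-resolution strategy just described can be sketched in a few lines of code. The match representation below (rule name, precondition count, bindings) is a simplification invented for illustration, not the exam's notation:

```python
# Sketch of the conflict-resolution strategy described above:
# prefer the match with the most preconditions, and never re-fire
# a rule with the same variable bindings ("refraction").

def select_rule(matches, fired):
    """matches: list of (rule_name, num_preconditions, bindings dict).
    fired: set of (rule_name, frozen bindings) already used."""
    candidates = [m for m in matches
                  if (m[0], frozenset(m[2].items())) not in fired]
    if not candidates:
        return None                       # quiescence: nothing can fire
    best = max(candidates, key=lambda m: m[1])
    fired.add((best[0], frozenset(best[2].items())))
    return best[0]

fired = set()
matches = [("R1", 3, {"?x": "a"}), ("R2", 2, {"?x": "a"})]
print(select_rule(matches, fired))   # R1 (most preconditions wins)
print(select_rule(matches, fired))   # R2 (R1's bindings are used up)
```

The `fired` set implements the no-repeat constraint; once no unfired match remains, the system halts.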
(1) IF   P(?x, ?y) ∧ Q(?x, a, ?z) ∧ ¬Q(?z, a, ?z)
    THEN assert R(?x)
         retract Q(?x, a, ?z)

(2) IF   R(?x) ∧ Q(?x, ?y, ?z)
    THEN retract R(?x)

(3) IF   P(f(?x), ?y) ∧ ?x ≤ ?y ∧ Q(f(?w), ?w, f(?z)) ∧ ?y ≤ ?z
    THEN print SUCCESS!

(4) IF   R(?x)
    THEN assert Q(f(?x), ?x, f(?x))

Let the initial contents of working memory be:

P(f(1), 2)         P(3, f(4))   P(4, 4)
Q(f(1), a, f(1))   Q(4, a, 3)
R(f(1))

Fill in the table below, showing the first three (3) cycles of the production system's operation.

Cycle   Matching Rules   Chosen Rule
i       ______           ______
ii      ______           ______
iii     ______           ______

List the net changes to working memory that result from the above three cycles:

Problem 3 – Bayesian and Case-Based Reasoning (17 points)

Imagine you wish to recognize bad "widgets" produced by your factory. You're able to measure two numeric properties of each widget: P1 and P2. The value of each property is discretized to be one of {low (L), normal (N), high (H)}. You randomly grab 5 widgets off of your assembly line and extensively test whether or not they are good, obtaining the following results:

P1   P2   Result
L    N    good
H    L    bad
N    H    good
L    H    bad
N    N    good

Explain how you could use this data and Bayes' Rule to determine whether the following new widget is more likely to be a good or a bad one (be sure to show your work and explain any assumptions/simplifications you make):

P1   P2   Result
L    L    ?

How might a case-based reasoning system address this task? How would it handle the test example above?

Problem 4 – Neural Networks (16 points)

Using a 1-of-N encoding, show how a perceptron could be configured to address the "widget" task of Problem 3. Initialize all the weights to 0 and the threshold to 0.1.

Show how one would apply the delta rule to each of the first two training examples in Problem 3 (show the state of the perceptron after each example). Use the step function as the activation function and let good = 1 and bad = 0. Set η (the learning rate) to 0.25.
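A single delta-rule update of the kind asked for above can be sketched as follows. The 1-of-N input ordering and the decision to adjust the threshold like a (negated) bias weight are assumptions for illustration, not necessarily the exam's intended setup:

```python
# Illustrative delta-rule step for a step-activation perceptron.
# Input encoding and threshold handling are assumptions, not
# necessarily the setup the exam intends.

def predict(w, theta, x):
    # step activation: fire iff the weighted sum reaches the threshold
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

def delta_step(w, theta, x, target, eta=0.25):
    err = target - predict(w, theta, x)
    w = [wi + eta * err * xi for wi, xi in zip(w, x)]
    theta = theta - eta * err          # threshold moves opposite the weights
    return w, theta

# First training example of Problem 3: P1=L, P2=N, good (=1),
# with 1-of-N inputs ordered (P1L, P1N, P1H, P2L, P2N, P2H).
w, theta = [0.0] * 6, 0.1
x = [1, 0, 0, 0, 1, 0]
w, theta = delta_step(w, theta, x, target=1)
print(w, theta)   # the two active inputs get weight 0.25; theta drops below 0
```

Repeating `delta_step` on the second example (H, L, bad) would show the next state asked for in the problem.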
Be sure to adjust the threshold during training.

Assuming that all the other weights and the threshold have value 0 and using the training set from Problem 3, draw the weight space for the weight between the P1=L input node and the output node.

Problem 5 – Genetic Algorithms (10 points)

Consider the following fitness function:

fitness = 5a + 3bc – d + 2e

where a–e are all Boolean-valued parameters. Compute the fitness of each of the members of the initial population below. Also compute the probability that each member of the population will be selected during the fitness-proportional reproduction process.

a  b  c  d  e   fitness   reproduction probability
1  1  0  1  1   ______    ______
0  1  1  0  1   ______    ______
1  1  0  0  0   ______    ______
1  0  1  1  1   ______    ______
1  0  0  0  0   ______    ______

Assuming the first two members of the population are selected for reproduction, and the cross-over point is between the b and the c, show the resulting children:

Problem 6 – Miscellaneous Short Answers (12 points)

Briefly describe each of the following AI concepts:

Heuristic Functions

Hidden Markov Models (HMMs)

Linear Separability

A*

Problem 7 – Markov Models (10 points)

Assume that User A and User B equally share a computer, and that you wish to write a program that determines which person is currently using the computer. You choose to create a (first-order) Markov Model that characterizes each user's typing behavior. You decide to group their keystrokes into three classes and have estimated the transition probabilities, producing the two graphs below. Both users always start in the Other state upon logging in.

Now imagine that the current user logs on and immediately types the following:

IOU $15

Who is more likely to be the current user?
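The comparison asked for above boils down to multiplying first-order transition probabilities along the class sequence of the typed text. The transition matrices below are made-up placeholders (the exam's actual edge values are in graphs that do not survive this text preview), so only the mechanics are meaningful:

```python
# Likelihood of a keystroke sequence under a first-order Markov model.
# The transition matrices are MADE-UP placeholders, not the exam's
# actual values; only the computation itself is the point.

def key_class(ch):
    if ch.isalpha():
        return "Letter"
    if ch.isdigit():
        return "Digit"
    return "Other"          # spaces, '$', punctuation, etc.

def sequence_likelihood(text, trans, start="Other"):
    p, state = 1.0, start   # both users always start in Other
    for ch in text:
        nxt = key_class(ch)
        p *= trans[state][nxt]
        state = nxt
    return p

user_a = {"Letter": {"Letter": 0.7, "Digit": 0.1, "Other": 0.2},
          "Digit":  {"Letter": 0.2, "Digit": 0.5, "Other": 0.3},
          "Other":  {"Letter": 0.4, "Digit": 0.5, "Other": 0.1}}
user_b = {"Letter": {"Letter": 0.8, "Digit": 0.1, "Other": 0.1},
          "Digit":  {"Letter": 0.1, "Digit": 0.6, "Other": 0.3},
          "Other":  {"Letter": 0.3, "Digit": 0.6, "Other": 0.1}}

pa = sequence_likelihood("IOU $15", user_a)
pb = sequence_likelihood("IOU $15", user_b)
print("A more likely" if pa > pb else "B more likely")
```

With equal priors on the two users, comparing these two products directly is equivalent to comparing the posteriors.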
Show and explain your calculations.

[Figure: transition-probability graphs over the states Letter, Digit, and Other for User A (edge values 0.2, 0.1, 0.4, 0.2, 0.5, 0.1, 0.7, 0.5, 0.3) and User B (edge values 0.1, 0.1, 0.3, 0.1, 0.6, 0.1, 0.8, 0.6, 0.3); which value labels which edge is not recoverable from this text preview.]

Problem 8 – Gradient Descent on an Error Function (7 points)

Derive the weight-change rule for a perceptron where the following is the error function and the activation function of the output unit is the identity function (i.e., the output equals the incoming weighted sum and no threshold is involved).

Error ≡ output - [the remainder of the error function is cut off in this preview]
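Although the error function itself is cut off in the preview, the general recipe for Problem 8 is the chain rule: differentiate the error with respect to each weight and step downhill. Purely as an illustration, for the common squared-error choice (an assumption here, not necessarily the exam's function) with identity activation:

```latex
% Illustrative derivation assuming E = (1/2)(t - o)^2; the exam's
% actual error function is truncated in this preview.
o = \sum_i w_i x_i, \qquad E = \tfrac{1}{2}\,(t - o)^2
\frac{\partial E}{\partial w_i}
  = (t - o)\left(-\frac{\partial o}{\partial w_i}\right)
  = -(t - o)\,x_i
\Delta w_i = -\eta\,\frac{\partial E}{\partial w_i} = \eta\,(t - o)\,x_i
```

Whatever the exam's actual error function, the same two steps apply: expand ∂E/∂w_i via ∂o/∂w_i = x_i, then set Δw_i = −η ∂E/∂w_i.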

