CMU CS 15122 - Lecture Notes on Dynamic Programming (9 pages)

Lecture Notes on Dynamic Programming
15-122: Principles of Imperative Computation
Frank Pfenning
Lecture 23, November 16, 2010

1 Introduction

In this lecture we introduce dynamic programming, which is a high-level computational thinking concept rather than a concrete algorithm. Perhaps a more descriptive title for the lecture would be "sharing", because dynamic programming is about sharing computation. We have already seen that sharing of space is also crucial: binary decision diagrams, in which subtrees are shared, are in practice much more efficient than binary decision trees, in which there is no sharing.

In order to apply dynamic programming, we generally look for the following conditions:

1. The optimal solution to a problem is composed of optimal solutions to subproblems, and
2. if there are several optimal solutions, we don't care which one we get.

The emphasis on optimality in these conditions dates back to the 1950s, when dynamic programming was developed. The name also refers to "programming" in the sense of the operations research literature (as in, for example, integer programming) and does not refer to programming as we understand it today.

Under the above conditions, the idea of dynamic programming is to build an exhaustive table of optimal solutions to subproblems. Then we combine these solutions according to properties of the specific problem we are addressing. The use of a table avoids recomputation: it shares computation by storing results and never recomputing them.

We start with a simple example and then discuss the implementation of BDDs, which uses dynamic programming in several places.

2 Fibonacci Numbers

As a very simple example, we consider the computation of the Fibonacci numbers. They are defined mathematically by

    f(0) = 0
    f(1) = 1
    f(n+2) = f(n+1) + f(n)   for n >= 0

A direct and very inefficient implementation is a recursive function:

```c
int fib0(int n) {
  /* requires n >= 0 */
  if (n == 0) return 0;
  else if (n == 1) return 1;
  else return fib0(n-1) + fib0(n-2);
}
```

When we draw the top part of the tree of recursive calls that have to be made, we notice that values are computed multiple times:

                    fib(n)
                  /        \
          fib(n-1)          fib(n-2)
          /      \          /      \
     fib(n-2) fib(n-3)  fib(n-3) fib(n-4)

Before we move on to improve the efficiency of this program: did you notice that the program above is buggy in C, and that the corresponding version would be questionable in C0? Think about it before reading on.

The problem is that the addition can overflow. In C the result is then undefined, and arbitrary behavior by the code would be acceptable. In C0 the result would be defined, but it would be the result of computing modulo 2^32, which could clearly be the wrong answer. We can fix this by explicitly checking for overflow:

```c
#include <limits.h>
#include <stdio.h>
#include <stdlib.h>

int safe_plus(int x, int y) {
  if ((x > 0 && y > INT_MAX - x) || (x < 0 && y < INT_MIN - x)) {
    fprintf(stderr, "integer overflow\n");
    abort();
  } else {
    return x + y;
  }
}

int fib1(int n) {
  /* requires n >= 0 */
  if (n == 0) return 0;
  else if (n == 1) return 1;
  else return safe_plus(fib1(n-1), fib1(n-2));
}
```

Now the program will abort in a well-defined manner upon overflow, instead of exhibiting undefined behavior which might silently give us the wrong result.

3 Top-Down Dynamic Programming

In top-down dynamic programming we store the values as we compute them recursively. Then, if we need a value we have already computed, we simply reuse it. A characteristic pattern for top-down dynamic programming is a top-level function that allocates an array (or similar structure) to save computed results, and a recursive function that maintains this array.

```c
int fib2_rec(int n, int *A) {
  /* requires n >= 0 */
  if (n == 0) return 0;
  else if (n == 1) return 1;
  else if (A[n] != 0) return A[n];
  else {
    int result = safe_plus(fib2_rec(n-1, A), fib2_rec(n-2, A));
    A[n] = result;  /* store A[n] = fib(n) */
    return result;
  }
}

int fib2(int n) {
  /* requires n >= 0 */
  int *A = calloc(n+1, sizeof(int));
  if (A == NULL) {
    fprintf(stderr, "allocation failed\n");
    abort();
  }
  /* calloc initializes the array with 0s */
  int result = fib2_rec(n, A);
  free(A);
  return result;
}
```

We
also call this programming technique memoization.

We might be tempted to improve this function slightly by looking up the second value directly:

```c
int fib2_rec(int n, int *A) {
  /* requires n >= 0 */
  if (n == 0) return 0;
  else if (n == 1) return 1;
  else if (A[n] != 0) return A[n];
  else {
    int result = safe_plus(fib2_rec(n-1, A), A[n-2]);
    A[n] = result;  /* store A[n] = fib(n) */
    return result;
  }
}
```

This would be incorrect, but why?

The problem is that C does not guarantee the order of evaluation, except in some specific circumstances such as short-circuiting conjunction. So it might evaluate A[n-2] first, before fib2_rec(n-1, A). At that point A[n-2] might still be 0, and we would obtain an incorrect answer.

4 Bottom-Up Dynamic Programming

Top-down dynamic programming retains the structure of the original, inefficient recursive function. Bottom-up dynamic programming inverts the order and starts from the bottom of the recursion, building up the table of values. In bottom-up dynamic programming, recursion is often profitably replaced by iteration. In our example we would like to compute A[0], A[1], A[2], ... in this order.

```c
int fib3(int n) {
  /* requires n >= 0 */
  if (n <= 1) return n;  /* return directly; also avoids writing A[1] when n == 0 */
  int i;
  int *A = calloc(n+1, sizeof(int));
  if (A == NULL) {
    fprintf(stderr, "allocation failed\n");
    abort();
  }
  A[0] = 0;
  A[1] = 1;
  for (i = 2; i <= n; i++) {
    /* loop invariant: 2 <= i && i <= n+1 */
    /* loop invariant: A[j] == fib(j) for all j in [0, i) */
    A[i] = safe_plus(A[i-1], A[i-2]);
  }
  /* assert i == n+1 */
  int result = A[n];
  free(A);
  return result;
}
```

We have indicated the loop invariant here only informally, although we could refer to one of the earlier implementations if we wanted to, perhaps viewing fib0 as a specification.

5 Implementing ROBDDs

In the implementation of ROBDDs, dynamic programming plays a pervasive role. Recall from Lecture 19 that ROBDDs are binary decision diagrams (BDDs) satisfying two conditions:

Irredundancy: The low and high successors of every node are distinct.

Uniqueness: There are no two distinct nodes testing the same variable with the same successors.

These two conditions guarantee canonicity of the representation of boolean functions, and they also make the data structure efficient in many common cases. We use dynamic programming ideas in order to maintain the uniqueness invariant (irredundancy is simple in comparison). The idea is that whenever we are asked to construct a new node, given a low and high successor and a variable to test, we look in a …

