
CS211 Recitation: 27-29 April 2004
Analysis of execution time

We want to go over three things that students do not seem to grasp fully.

1. The definition of O(n) and how to use it formally.
2. Figuring out the execution time of a loopy program segment.
3. Figuring out the execution time of a recursive method.

1. Definition of O(n).

Students, you have to memorize this definition and be able to use it. See also Weiss chap. 5.4.

Definition: Function f(n) is O(g(n)) if there are positive constants c and N0 such that

    f(n) <= c*g(n)    for all n > N0.

Interpretation: Draw the curves f(n) and c*g(n) for positive n in the x-y plane. After the point n = N0, the c*g(n) curve is forever above the f(n) curve.

Problems: You should be able to determine these by using the formal definition. We give one example.

1. Prove that function n+5 is O(n).
2. Prove that 2*n+5 is O(n).
3. Prove that .5*n^2 + n is O(n^2).
4. Prove that 5*n^2 is O(n^4).
5. Prove that 1/n is O(1).
6. Prove that log(n) is O(n).
7. Prove that n*log(n) is O(n^2).
8. Prove that 8*n^2 is O(n^3 + n).
9. Prove that 8*n^2 + 1000*n is O(n^2).
10. Make up 10 of your own interesting things (like the above 9) to prove or disprove.

We prove one of them here, number 2.

Theorem: 2*n+5 is O(n).

We have to find positive c and N0 such that for all n > N0,

    2*n + 5 <= c*n

Choosing c = 2 doesn't work, because then it reads 2*n + 5 <= 2*n, which is false for all n > 0, so we have to choose a bigger c. Try c = 3. We have:

    2*n + 5 <= 3*n
    =    <subtract 2*n from both sides>
    5 <= n

This holds for all n > 4, so choose N0 = 4. Q.E.D. (Quit End Done)

2. Loopy program segments.

We deal only with worst-case or best-case execution time, not average or expected time.

We should be able to find the order of execution time of a loop fairly easily. Consider a loop like this (it could also be a for-loop):

    int k= 0;
    while (expression) {
        S
        k= k+1;
    }

We have to figure out the time statement S takes when k = 0, when k = 1, when k = 2, etc., and add these times up.
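As a concrete sketch of this summation view, the loop skeleton can be instrumented to count the dominant operations it performs. The class and method names below are ours (not from the handout), and the body S is modeled as some number c of constant-time assignments:

```java
// Sketch (hypothetical names): count the assignments executed by a loop
// that runs n times with a body S doing c assignments per iteration.
public class LoopCount {
    /** = number of assignments executed: n iterations, each doing
        c assignments for S plus one for k = k+1. */
    public static int countOps(int n, int c) {
        int ops = 0;
        int k = 0;
        while (k != n) {      // the handout's loop guard
            ops = ops + c;    // body S: c constant-time assignments
            k = k + 1;
            ops = ops + 1;    // the assignment k = k+1 itself
        }
        return ops;           // = n * (c + 1)
    }
}
```

Since the body's cost (c + 1) does not depend on k, the total is simply (number of iterations) * (time for S), i.e. O(n).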
Here, we may count the number of assignments in S, the number of array comparisons, or whatever seems to be the dominant part of S. If S takes constant time (i.e. the time it takes does not depend on k), then the total time is simply

    (number of iterations) * (time for S)

Nested loops can complicate matters. You have to read carefully and understand the algorithm. We give two examples.

Consider insertion sort:

    // Sort b[0..n-1]
    // invariant: b[0..k-1] is sorted
    for (k= 0; k != n; k= k+1) {
        insert b[k] into its sorted position in the already sorted segment b[0..k-1]
    }

The best-case time occurs when the inner loop takes constant time, and this happens if the array is already in ascending order. The worst-case time occurs when the array is in descending order, because inserting b[k] into its sorted position then takes time O(k). Add up 1 + 2 + 3 + ... + n and you get n*(n+1)/2, giving O(n*n).

The next example shows the extreme care you must take in analyzing a program segment. Just because there are nested loops, the execution time is not necessarily n*n! Careful study of the following program segment will show that, even though it has nested loops, the time is still O(n). Why?

    /* Store in x the number of sections of equal elements of
       array b[0..n-1], for n >= 0. For example, for the array
       b = (3,5,5,3,3,3,3,6,6,6), the value 4 is stored in x:
       there is a section of (one) 3's, then a section of 5's,
       then a section of 3's, and finally a section of 6's. */
    int x= 0; int k= 0;
    /* inv P: 0 <= k <= n.
       x = no. of sections of equal elements in b[0..k-1].
       the following holds: k = 0 or k = n or b[k-1] != b[k] */
    while (k != n) {
        x= x+1;
        // Let v be the value in b[k] at this point.
        // Increase k until either k = n or b[k] != v.
        k= k+1;
        // invariant: 1 <= k <= n, and
        // x = no. of sections of equal elements in b[0..k-1]
        while (k != n && b[k-1] == b[k]) {
            k= k+1;
        }
    }
    // R: x = no. of sections of equal elements in b[0..n-1]

We will not give you a bunch of program segments to practice on (in determining their execution time). You have examples already.
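To see why the time is O(n): both the outer and the inner loop advance the same index k, which increases from 0 to n exactly once over the whole execution, so the total work across all iterations of both loops is proportional to n. The segment can be checked by wrapping it in a method (the class and method names are ours, not the handout's):

```java
// Runnable version of the handout's section-counting segment.
// Despite the nested loops, each loop advances the same index k,
// so the total time is O(n), not O(n*n).
public class Sections {
    /** = the number of sections of equal elements in b[0..b.length-1] */
    public static int countSections(int[] b) {
        int n = b.length;
        int x = 0;
        int k = 0;
        while (k != n) {
            x = x + 1;       // a new section starts at b[k]
            k = k + 1;
            // advance k past the remaining elements of this section
            while (k != n && b[k-1] == b[k]) {
                k = k + 1;
            }
        }
        return x;
    }
}
```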
Look at all the algorithms we have developed in class and recitation. Look at algorithms in Weiss. Spend some time figuring out their order of execution time.

3. Figuring out time of recursive methods

There is a formal and an informal way of figuring out the time required by a recursive function. The formal way requires coming up with a recurrence relation that defines the time and then solving the recurrence relation. We will not ask you to do this, but here is an example. (There was another example with mergesort in recitation 8.)

Consider this method.

    /** Process each node of t in inorder */
    public static void p(Tnode t) {
        if (t == null) return;
        p(t.left);
        process root t;
        p(t.right);
    }

Assume that "process root t" takes time K for some integer K (it takes constant time). The time for the empty tree is a constant K2. Then the time T(n) for a tree of n nodes can be written as follows:

    T(0) = K2
    T(n) = K2 + T(size of left subtree) + K + T(size of right subtree)

One now has to solve this recurrence relation. We don't go into this here, because we do not require you to do it. That is a topic of CS280.

The less formal way of figuring out the execution time of a recursive method is to judge the time based on what the method is doing (and how). Consider the same algorithm. It is evident that each node of the tree is processed once. If the time to process a node is constant, then the algorithm takes time proportional to n, the number of nodes in the tree.

We had precisely this kind of problem on prelim 2, and many students got it wrong.

Consider this recursive function:

    /** Set each b[j] of b[0..k] to the sum of b[0..j].
        Example: sumit(b, 3) changes b = (3, 2, 4, 5, 1)
        to (3, 5, 9, 14, 1) */
    public static void sumit(int[] b, int k) {
        if (k <= 0) return;
        sumit(b, k-1);
        b[k]= b[k-1] + b[k];
    }

The if-statement and assignment take constant time, say T.
Each recursive call has its second argument reduced by 1, and we can see that there are calls with second argument k, k-1, k-2, ..., 1, 0. Thus, the time it takes is O(T*(k+1)), which is O(k) since T is a constant. It is a linear algorithm.

You have written many recursive methods in this class. Look at them and figure out their execution time. Here's one from assignment A5.

    /** = a list that represents the intersection of l1 and l2.
        precondition: no duplicates in l1 and l2 */
    public static RList intersect(RList l1, RList l2) {
        if (isEmpty(l1)) return null;
        if (!isMember(first(l1), l2))
            return intersect(rest(l1), l2);
        return ...
    }
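The excerpt ends before intersect's final return. Here is a self-contained sketch: the RList class, its helper methods, and the final return line below are our reconstruction under the assumption that RList is a cons-cell list with a cons-style constructor; they are not the handout's actual A5 code.

```java
// Self-contained sketch of intersect. The RList class, the helpers,
// and the final return line are our assumptions, not the A5 code.
public class Intersect {
    static class RList {
        int head; RList tail;
        RList(int head, RList tail) { this.head = head; this.tail = tail; }
    }
    static boolean isEmpty(RList l) { return l == null; }
    static int first(RList l) { return l.head; }
    static RList rest(RList l) { return l.tail; }
    static RList cons(int v, RList l) { return new RList(v, l); }
    static boolean isMember(int v, RList l) {
        return l != null && (l.head == v || isMember(v, l.tail));
    }

    /** = a list that represents the intersection of l1 and l2.
        Precondition: no duplicates in l1 and no duplicates in l2. */
    public static RList intersect(RList l1, RList l2) {
        if (isEmpty(l1)) return null;
        if (!isMember(first(l1), l2))
            return intersect(rest(l1), l2);
        // first(l1) is in both lists, so keep it (assumed completion)
        return cons(first(l1), intersect(rest(l1), l2));
    }
}
```

For the execution time: each of the m calls on a list l1 of length m does one isMember scan of l2 (length n), so the worst-case time is O(m*n).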


Cornell CS 211
