UMD CMSC 351 - Lecture Notes
We have this messy summation to solve, though. First observe that the value $n$ remains constant throughout the sum, and so we can pull it out front. Also note that we can write $3^i/4^i$ as $(3/4)^i$:

$$T(n) = n^{\log_4 3} + n \sum_{i=0}^{(\log_4 n) - 1} \left(\frac{3}{4}\right)^i.$$

Note that this is a geometric series. We may apply the formula for the geometric series, which we gave in an earlier lecture. For $x \ne 1$:

$$\sum_{i=0}^{m} x^i = \frac{x^{m+1} - 1}{x - 1}.$$

In this case $x = 3/4$ and $m = \log_4 n - 1$. We get

$$T(n) = n^{\log_4 3} + n \cdot \frac{(3/4)^{\log_4 n} - 1}{(3/4) - 1}.$$

Applying our favorite log identity once more to the expression in the numerator (with $a = 3/4$ and $b = 4$) we get

$$(3/4)^{\log_4 n} = n^{\log_4(3/4)} = n^{\log_4 3 - \log_4 4} = n^{\log_4 3 - 1} = \frac{n^{\log_4 3}}{n}.$$

If we plug this back in, we have

$$T(n) = n^{\log_4 3} + n \cdot \frac{n^{\log_4 3}/n - 1}{(3/4) - 1}
       = n^{\log_4 3} + \frac{n^{\log_4 3} - n}{-1/4}
       = n^{\log_4 3} - 4\left(n^{\log_4 3} - n\right)
       = n^{\log_4 3} + 4\left(n - n^{\log_4 3}\right)
       = 4n - 3n^{\log_4 3}.$$

So the final result (at last!) is

$$T(n) = 4n - 3n^{\log_4 3} \approx 4n - 3n^{0.79} \in \Theta(n).$$

It is interesting to note the unusual exponent of $\log_4 3 \approx 0.79$. We have seen that two nested loops typically lead to $\Theta(n^2)$ time and three nested loops typically lead to $\Theta(n^3)$ time, so it seems remarkable that we could generate a strange exponent like $0.79$ as part of a running time. However, as we shall see, this is often the case in divide-and-conquer recurrences.

Lecture 8: More on Recurrences
(Thursday, Feb 19, 1998)

Read: Chapt. 4 on recurrences, skip Section 4.4.

Recap: Last time we discussed recurrences, that is, functions that are defined recursively. We discussed their importance in analyzing divide-and-conquer algorithms. We also discussed two methods for solving recurrences, namely guess-and-verify (by induction) and iteration. These are both very powerful methods, but they are quite "mechanical", and it is difficult to get a quick and intuitive sense of what is going on in the recurrence. Today we will discuss two more techniques for solving recurrences. The first provides a way of visualizing recurrences, and the second, called the Master Theorem, is a method of solving many recurrences that arise in divide-and-conquer applications.

Visualizing Recurrences Using the Recursion Tree: Iteration is a very powerful technique for solving recurrences. But it is easy to get lost in all the symbolic manipulations and lose sight of what is going on. Here is a nice way to visualize what happens in iteration. We can describe any recurrence in terms of a tree, where each expansion of the recurrence takes us one level deeper in the tree.

Recall the recurrence for MergeSort (which we simplified by assuming that $n$ is a power of 2, and hence could drop the floors and ceilings):

$$T(n) = \begin{cases} 1 & \text{if } n = 1, \\ 2T(n/2) + n & \text{otherwise.} \end{cases}$$

Suppose that we draw the recursion tree for MergeSort, but each time we merge two lists, we label that node of the tree with the time it takes to perform the associated (nonrecursive) merge. Recall that merging two lists of size $m/2$ into a list of size $m$ takes $\Theta(m)$ time, which we will just write as $m$. Below is an illustration of the resulting recursion tree.

[Figure 5: Using the recursion tree to visualize a recurrence.]

Observe that the total work at the topmost level of the recursion is $\Theta(n)$ (or just $n$ for short). At the second level we have two merges, each taking $n/2$ time, for a total of $2(n/2) = n$. At the third level we have 4 merges, each taking $n/4$ time, for a total of $4(n/4) = n$. This continues until the bottommost level of the tree.
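To make the level-by-level accounting concrete, here is a minimal Python sketch (the function name and the choice of $n = 16$ are purely illustrative) that tabulates the total merge work at each level of this recursion tree for the simplified recurrence $T(n) = 2T(n/2) + n$ with $T(1) = 1$:

```python
# A minimal sketch: total nonrecursive work at each level of the recursion
# tree for T(n) = 2T(n/2) + n with T(1) = 1, where n is a power of 2.
# Level i has 2^i nodes, and each node of size n/2^i is charged n/2^i work.

def level_work(n):
    """Return a list whose i-th entry is the total work at level i."""
    levels = []
    nodes, size = 1, n
    while size >= 1:
        levels.append(nodes * size)   # 2^i nodes times n/2^i work each
        nodes, size = 2 * nodes, size // 2
    return levels

print(level_work(16))   # [16, 16, 16, 16, 16] -- every level contributes n
```

Every entry equals $n$, matching the picture above.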
Since the tree has exactly $\lg n + 1$ levels ($0, 1, 2, \ldots, \lg n$), and each level contributes a total of $n$ time, the total running time is $n(\lg n + 1) = n \lg n + n$. This is exactly what we got by the iteration method.

This technique can be used for a number of simple recurrences. For example, let's try it on the following recurrence. The tree is illustrated below.

$$T(n) = \begin{cases} 1 & \text{if } n = 1, \\ 3T(n/2) + n^2 & \text{otherwise.} \end{cases}$$

Again, we label each node with the amount of work done at that node. In this case the work for $T(m)$ is $m^2$. For the top level (or 0th level) the work is $n^2$. At level 1 we have three nodes whose work is $(n/2)^2$ each, for a total of $3(n/2)^2$. This can be written as $n^2(3/4)$. At level 2 the work is $9(n/4)^2$, which can be written as $n^2(9/16)$. In general it is easy to extrapolate to see that at level $i$ we have $3^i$ nodes, each involving $(n/2^i)^2$ work, for a total of $3^i (n/2^i)^2 = n^2 (3/4)^i$.

This leads to the following summation. Note that we have not determined where the tree bottoms out, so we have left off the upper bound on the sum.

$$T(n) = n^2 \sum_{i=0}^{?} \left(\frac{3}{4}\right)^i.$$

[Figure 6: Another recursion tree example.]

If all we wanted was an asymptotic expression, then we are essentially done at this point. Why? The summation is a geometric series, and the base $(3/4)$ is less than 1. This means that the series converges to some nonzero constant (even if we ran the sum out to $\infty$). Thus the running time is $\Theta(n^2)$.

To get a more exact result, observe that the recursion bottoms out when we get down to single items, and since the sizes of the inputs are cut in half at each level, it is not hard to see that the final level is level $\lg n$. (It is easy to be off by $\pm 1$ here, but this sort of small error will not affect the asymptotic result. In this case we happen to be right.) So, we can plug in $\lg n$ for the "?" in the above summation:

$$T(n) = n^2 \sum_{i=0}^{\lg n} \left(\frac{3}{4}\right)^i.$$

If we wanted to get a more exact answer, we could plug the summation into the formula for the geometric series and simplify. This would lead to an expression like

$$T(n) = n^2 \cdot \frac{(3/4)^{\lg n + 1} - 1}{(3/4) - 1}.$$

This will take some work to simplify, but at this point it is all just tedious algebra to get the formula into a simple form. (This sort of algebraic manipulation is typical of algorithm analysis, so be sure that you follow each step.)

$$T(n) = n^2 \cdot \frac{(3/4)^{\lg n + 1} - 1}{(3/4) - 1}
       = -4n^2 \left((3/4)^{\lg n + 1} - 1\right)
       = 4n^2 \left(1 - (3/4)^{\lg n + 1}\right)
       = 4n^2 \left(1 - (3/4)(3/4)^{\lg n}\right)
       = 4n^2 \left(1 - (3/4)\, n^{\lg(3/4)}\right)
       = 4n^2 \left(1 - (3/4)\, n^{\lg 3 - \lg 4}\right)
       = 4n^2 \left(1 - (3/4)\, n^{\lg 3 - 2}\right)
       = 4n^2 \left(1 - (3/4)\, \frac{n^{\lg 3}}{n^2}\right)
       = 4n^2 - 3n^{\lg 3}.$$

Note that $\lg 3 \approx 1.58$, so the whole expression is $\Theta(n^2)$; a quick numerical check of this closed form appears in the sketch at the end of these notes.

In conclusion, the technique of drawing the recursion tree is a somewhat more visual way of analyzing summations, but it is really equivalent to the method of iteration.

(Simplified) Master Theorem: If you analyze many divide-and-conquer algorithms, you will see that the same
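As promised above, here is a minimal Python sketch (the function names are illustrative) that checks the closed form from the recursion-tree example: it evaluates the recurrence $T(1) = 1$, $T(n) = 3T(n/2) + n^2$ directly for powers of two and compares the result with $4n^2 - 3n^{\lg 3}$.

```python
from math import log2

# A minimal sketch: compare the recurrence T(1) = 1, T(n) = 3T(n/2) + n^2
# (n a power of 2) against the closed form 4n^2 - 3n^(lg 3).

def T(n):
    """Evaluate the recurrence directly; n must be a power of 2."""
    if n == 1:
        return 1
    return 3 * T(n // 2) + n * n

def closed_form(n):
    return 4 * n**2 - 3 * n**log2(3)

for k in range(11):
    n = 2**k
    print(n, T(n), round(closed_form(n)))
    # e.g. n = 4: T(4) = 3*7 + 16 = 37, and 4*16 - 3*4^(lg 3) = 64 - 27 = 37
```

The two values agree for every power of two, confirming the algebra; and since $\lg 3 \approx 1.58 < 2$, the $n^2$ term dominates, giving the $\Theta(n^2)$ bound.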

