Analysis of Algorithms

Outline:
- Growth Rates
- Constant Factors
- Big-Oh Notation
- Big-Oh Example
- More Big-Oh Examples
- Big-Oh and Growth Rate
- Big-Oh Rules
- Relatives of Big-Oh
- Intuition for Asymptotic Notation
- Example Uses of the Relatives of Big-Oh
- Divide-and-Conquer
- Merge-Sort Review
- Recurrence Equation Analysis
- Iterative Substitution
- Solving Recurrences by Substitution: Guess-and-Test
- More Examples
- Solving Recurrences: Recursion-Tree Method
- Master Method

Growth Rates

Growth rates of functions:
- Linear: n
- Quadratic: n²
- Cubic: n³

In a log-log chart, the slope of the line corresponds to the growth rate of the function.

[Figure: log-log chart of T(n) versus n for the linear, quadratic, and cubic growth rates]

Constant Factors

The growth rate is not affected by constant factors or lower-order terms.

Examples:
- 10²n + 10⁵ is a linear function
- 10⁵n² + 10⁸n is a quadratic function

[Figure: log-log chart showing that each quadratic and each linear function has the same slope as n² and n, respectively]

Big-Oh Notation

Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n₀ such that

  f(n) ≤ c·g(n) for n ≥ n₀

Example: 2n + 10 is O(n)
- 2n + 10 ≤ cn
- (c − 2)n ≥ 10
- n ≥ 10/(c − 2)
- Pick c = 3 and n₀ = 10

[Figure: plot of 3n, 2n + 10, and n]

Big-Oh Example

Example: the function n² is not O(n)
- n² ≤ cn
- n ≤ c
- The above inequality cannot be satisfied, since c must be a constant.

[Figure: plot of n², 100n, 10n, and n]

More Big-Oh Examples

7n − 2 is O(n)
- need c > 0 and n₀ ≥ 1 such that 7n − 2 ≤ c·n for n ≥ n₀
- this is true for c = 7 and n₀ = 1

3n³ + 20n² + 5 is O(n³)
- need c > 0 and n₀ ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n₀
- this is true for c = 4 and n₀ = 21

3 log n + log log n is O(log n)
- need c > 0 and n₀ ≥ 1 such that 3 log n + log log n ≤ c·log n for n ≥ n₀
- this is true for c = 4 and n₀ = 2
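The (c, n₀) witness pairs claimed above can be checked numerically over a finite range. A minimal Python sketch, where the `holds` helper and the cutoff `n_max` are illustrative choices, not from the slides:

```python
def holds(f, g, c, n0, n_max=10_000):
    """Check that f(n) <= c*g(n) for every integer n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 7n - 2 is O(n): witnesses c = 7, n0 = 1
print(holds(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1))                     # True

# 3n^3 + 20n^2 + 5 is O(n^3): witnesses c = 4, n0 = 21
print(holds(lambda n: 3 * n**3 + 20 * n**2 + 5, lambda n: n**3, c=4, n0=21))  # True

# n^2 is not O(n): any fixed c eventually fails (here c = 100 fails at n = 101)
print(holds(lambda n: n * n, lambda n: n, c=100, n0=1))                       # False
```

A finite scan cannot prove the inequality for all n ≥ n₀, but it quickly exposes a wrong witness pair, as the last example shows.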
Big-Oh and Growth Rate

The big-Oh notation gives an upper bound on the growth rate of a function. The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n). We can use the big-Oh notation to rank functions according to their growth rate:

                   f(n) is O(g(n))   g(n) is O(f(n))
g(n) grows more    Yes               No
f(n) grows more    No                Yes
Same growth        Yes               Yes

Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
1. Drop lower-order terms
2. Drop constant factors

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Relatives of Big-Oh

big-Omega: f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n₀.

big-Theta: f(n) is Θ(g(n)) if there are constants c′ > 0 and c″ > 0 and an integer constant n₀ ≥ 1 such that c′·g(n) ≤ f(n) ≤ c″·g(n) for n ≥ n₀.

little-oh: f(n) is o(g(n)) if, for any constant c > 0, there is an integer constant n₀ > 0 such that f(n) < c·g(n) for n ≥ n₀.

little-omega: f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n₀ > 0 such that f(n) > c·g(n) for n ≥ n₀.

Intuition for Asymptotic Notation

- big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
- big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
- big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
- little-oh: f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
- little-omega: f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)

Example Uses of the Relatives of Big-Oh

5n² is ω(n): f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n₀ > 0 such that f(n) > c·g(n) for n ≥ n₀. Here we need 5n₀² > c·n₀; given c, any n₀ > c/5 satisfies this.

5n² is Ω(n): f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n₀. Let c = 1 and n₀ = 1.

5n² is Ω(n²): f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n₀. Let c = 5 and n₀ = 1.

Divide-and-Conquer

Divide-and-conquer is a general algorithm design paradigm:
- Divide: divide the input data S into two or more disjoint subsets S₁, S₂, …
- Recur: solve the subproblems recursively
- Conquer: combine the solutions for S₁, S₂, …, into a solution for S

The base cases for the recursion are subproblems of constant size. The analysis can be done using recurrence equations.

Merge-Sort Review

Merge-sort on an input sequence S with n elements consists of three steps:
- Divide: partition S into two sequences S₁ and S₂ of about n/2 elements each
- Recur: recursively sort S₁ and S₂
- Conquer: merge S₁ and S₂ into a unique sorted sequence

Algorithm mergeSort(S, C)
  Input: sequence S with n elements, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1
    (S₁, S₂) ← partition(S, n/2)
    mergeSort(S₁, C)
    mergeSort(S₂, C)
    S ← merge(S₁, S₂)

Recurrence Equation Analysis

The conquer step of merge-sort, which merges two sorted sequences of n/2 elements each (implemented with doubly linked lists), takes at most bn steps, for some constant b. Likewise, the base case (n < 2) takes at most b steps. Therefore, if we let T(n) denote the running time of merge-sort:

  T(n) = b              if n < 2
  T(n) = 2T(n/2) + bn   if n ≥ 2

We can therefore analyze the running time of merge-sort by finding a closed-form solution to the above equation, that is, a solution that has T(n) only on the left-hand side.

Iterative Substitution

In the iterative substitution, or "plug-and-chug," technique, we iteratively apply the recurrence equation to itself and see if we can find a pattern. Note that the base case, T(n) = b, is reached when 2ⁱ = n, that is, after i = log n substitutions.
  T(n) = 2T(n/2) + bn
       = 2(2T(n/2²) + b(n/2)) + bn
       = 2²T(n/2²) + 2bn
       = 2³T(n/2³) + 3bn
       = 2⁴T(n/2⁴) + 4bn
       = …
       = 2ⁱT(n/2ⁱ) + i·bn

So, substituting i = log n,

  T(n) = bn + bn·log n

Thus, T(n) is O(n log n).

COT 5407

Solving
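The iterative-substitution result for the merge-sort recurrence can be sanity-checked numerically: evaluating T(n) = 2T(n/2) + bn with T(n) = b for n < 2 directly and comparing against the closed form bn + bn·log n. A small Python sketch with the illustrative choice b = 1 (not from the slides):

```python
import math

def T(n, b=1):
    """Evaluate the merge-sort recurrence directly (n a power of 2)."""
    if n < 2:
        return b
    return 2 * T(n // 2, b) + b * n

# Compare against the closed form b*n + b*n*log2(n); they agree exactly
# at powers of 2.
for n in [2, 8, 64, 1024]:
    closed = n + n * math.log2(n)  # with b = 1
    print(n, T(n), closed)
```

For example, T(8) = 2·T(4) + 8 = 2·(2·T(2) + 4) + 8 = 32, matching 8 + 8·log₂ 8 = 32.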