16.070 Introduction to Computers & Programming
Asymptotic analysis: upper/lower bounds, Θ notation
Binary, Insertion, and Merge sort
Prof. Kristina Lundqvist
Dept. of Aero/Astro, MIT
March 5, 2003

Asymptotic Analysis
§ Complexity: the rate at which storage or time grows as a function of the problem size
§ Growth depends on compiler, machine, …
§ Asymptotic analysis
  § Describes the inherent complexity of a program, independent of machine and compiler
  § A "proportionality" approach, expressing the complexity in terms of its relationship to some known function

Asymptotic Analysis
§ Idea: as the problem size grows, the complexity can be described as a simple proportionality to some known function: "Big Oh"

Asymptotic Analysis: Big-oh
Definition: T(n) = O(f(n)) -- "T of n is in Big Oh of f of n" -- iff there are constants c and n0 such that T(n) ≤ c·f(n) for all n ≥ n0.
Usage: The algorithm is in O(n^2) in [best, average, worst] case.
Meaning: For all data sets big enough (i.e., n > n0), the algorithm always executes in fewer than c·f(n) steps in [best, average, worst] case.
Big-oh is said to describe an "upper bound" on the complexity.

Big-oh Notation (cont)
Big-oh notation indicates an upper bound on the complexity.
Example: If T(n) = 3n^2, then T(n) is in O(n^2).
We want the tightest upper bound: while T(n) = 3n^2 is also in O(n^25), we prefer O(n^2).

§ T(n) = O(1): constant growth
  § E.g., array access
§ T(n) = O(lg(n)): logarithmic growth
  § E.g., binary search
§ T(n) = O(n): linear growth
  § E.g., looping over all elements in a 1-dimensional array
§ T(n) = O(n log n): "n log n" growth
  § E.g., Merge sort
§ T(n) = O(n^k): polynomial growth
  § E.g., Selection sort is n^2; k is seldom larger than 5
§ T(n) = O(2^n): exponential growth
  § E.g., basically useless for anything but very small problems

1 billion instructions/second
n      T(n)=n     T(n)=n lg(n)   T(n)=n^2   T(n)=n^3   T(n)=2^n
5      0.005 µs   0.01 µs        0.03 µs    0.13 µs    0.03 µs
10     0.01 µs    0.03 µs        0.1 µs     1 µs       1 µs
20     0.02 µs    0.09 µs        0.4 µs     8 µs       1 ms
50     0.05 µs    0.28 µs        2.5 µs     125 µs     13 days
100    0.1 µs     0.66 µs        10 µs      1 ms       4×10^13 years

Big-oh Examples
Example 1: Finding value X in an array (average cost).
T(n) = cs·n/2.
For all values of n > 1, cs·n/2 ≤ cs·n.
Therefore, by the definition, T(n) is in O(n) for n0 = 1 and c = cs.

Big-oh Examples
Example 2: T(n) = c1·n^2 + c2·n in the average case.
c1·n^2 + c2·n ≤ c1·n^2 + c2·n^2 ≤ (c1 + c2)·n^2 for all n > 1.
T(n) ≤ c·n^2 for c = c1 + c2 and n0 = 1.
Therefore, T(n) is in O(n^2) by the definition.
Example 3: T(n) = c. We say this is in O(1).

Does Big-oh tell the whole story?
§ T1(n) = T2(n) = O(g(n))
§ Algorithm 1
    Setup of algorithm           -- takes 50 time units
    read n elements into array   -- 3 units/element
    for i in 1..n
      do operation1 on A[i]      -- takes 10 units
      do operation2 on A[i]      -- takes 5 units
      do operation3 on A[i]      -- takes 15 units
§ Algorithm 2
    Setup of algorithm           -- takes 200 time units
    read n elements into array   -- 3 units/element
    for i in 1..n
      do operation1 on A[i]      -- takes 10 units
      do operation2 on A[i]      -- takes 5 units
T1(n) = 50 + 3n + (10+5+15)n = 50 + 33n
T2(n) = 200 + 3n + (10+5)n = 200 + 18n
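To make the point of the last slide concrete, here is a small C sketch (not from the lecture; the function names and the n range are illustrative choices) that evaluates the two cost models T1(n) = 50 + 33n and T2(n) = 200 + 18n from the slide and reports where Algorithm 2, despite its larger setup cost, becomes the cheaper choice.

#include <stdio.h>

/* Cost models taken from the "Does Big-oh tell the whole story?" slide. */
static long t1(long n) { return 50 + 33 * n; }   /* Algorithm 1 */
static long t2(long n) { return 200 + 18 * n; }  /* Algorithm 2 */

int main(void)
{
    /* Both algorithms are O(n), but the constants differ, so the
       cheaper choice depends on the problem size n. */
    for (long n = 1; n <= 100; n++) {
        if (t2(n) < t1(n)) {
            printf("Algorithm 2 is cheaper from n = %ld onward "
                   "(T1 = %ld units, T2 = %ld units)\n", n, t1(n), t2(n));
            break;   /* beyond this point the gap only grows */
        }
    }
    return 0;
}

Setting 50 + 33n = 200 + 18n gives n = 10, so the two models cost the same at n = 10 and Algorithm 2 is strictly cheaper for every larger n, even though both functions are O(n).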
Ω
Definition: For T(n) a non-negatively valued function, T(n) is in the set Ω(g(n)) if there exist two positive constants c and n0 such that T(n) ≥ c·g(n) for all n > n0.
Meaning: For all data sets big enough (i.e., n > n0), the algorithm always executes in more than c·g(n) steps.
Lower bound.

Ω Example
T(n) = c1·n^2 + c2·n.
c1·n^2 + c2·n ≥ c1·n^2 for all n > 1.
T(n) ≥ c·n^2 for c = c1 and n0 = 1.
Therefore, T(n) is in Ω(n^2) by the definition.
We want the greatest lower bound.

Θ Notation
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Tight bound.

A Common Misunderstanding
Confusing worst case with upper bound.
Upper bound refers to a growth rate.
Worst case refers to the worst input from among the choices for possible inputs of a given size.

Simplifying Rules
1. If f(n) is in O(g(n)) and g(n) is in O(h(n)), then f(n) is in O(h(n)).
2. If f(n) is in O(k·g(n)) for any constant k > 0, then f(n) is in O(g(n)).
3. If f1(n) is in O(g1(n)) and f2(n) is in O(g2(n)), then (f1 + f2)(n) is in O(max(g1(n), g2(n))).
4. If f1(n) is in O(g1(n)) and f2(n) is in O(g2(n)), then f1(n)·f2(n) is in O(g1(n)·g2(n)).

Binary Search
How many elements are examined in the worst case?

Binary Search
Procedure Binary_Search (Input_Array, Number_To_Search, Return_Index);
Begin
  Set Return_Index to -1;
  -- Lower_Bound and Upper_Bound start at the first and last indices of Input_Array
  Loop
    if Lower_Bound > Upper_Bound then
      Exit;
    end if;
    Set Current_Index to (Lower_Bound + Upper_Bound) / 2;
    if Input_Array(Current_Index) = Number_To_Search then
      Return_Index := Current_Index;
      Exit;
    end if;
    if Input_Array(Current_Index) < Number_To_Search then
      Lower_Bound := Current_Index + 1;
    else
      Upper_Bound := Current_Index - 1;
    end if;
  end loop;
end Binary_Search;

Insertion sort
§ InsertionSort(A, n)   -- A[1..n]
    for j in 2..n
      key := A[j]
      i := j - 1
      while i > 0 and A[i] > key
        A[i+1] := A[i]
        i := i - 1
      A[i+1] := key

Insertion sort
8 2 4 9 3 6
2 8 4 9 3 6
2 4 8 9 3 6
2 4 8 9 3 6
2 3 4 8 9 6
2 3 4 6 8 9

Insertion sort
§ Running time
  § Depends on the input
  § An already sorted sequence is easier to sort
  § Worst case: input in reverse sorted order
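For readers who want something compilable, the following is a minimal C sketch (not taken from the course materials; it assumes zero-based int arrays, and the identifiers and driver values are illustrative) of the two routines above: an iterative binary search corresponding to Binary_Search and an insertion sort corresponding to InsertionSort(A, n).

#include <stdio.h>

/* Binary search over a sorted array; returns the index of key, or -1.
   Examines O(lg n) elements in the worst case. */
static int binary_search(const int a[], int n, int key)
{
    int lower = 0, upper = n - 1;
    while (lower <= upper) {
        int mid = lower + (upper - lower) / 2;   /* midpoint of current range */
        if (a[mid] == key)
            return mid;
        if (a[mid] < key)
            lower = mid + 1;    /* key can only be in the upper half */
        else
            upper = mid - 1;    /* key can only be in the lower half */
    }
    return -1;                  /* not found */
}

/* Insertion sort: worst case O(n^2) (reverse-sorted input),
   best case O(n) (already sorted input). */
static void insertion_sort(int a[], int n)
{
    for (int j = 1; j < n; j++) {
        int key = a[j];
        int i = j - 1;
        while (i >= 0 && a[i] > key) {   /* shift larger elements right */
            a[i + 1] = a[i];
            i--;
        }
        a[i + 1] = key;
    }
}

int main(void)
{
    int a[] = {8, 2, 4, 9, 3, 6};        /* the sequence traced on the slide */
    int n = (int)(sizeof a / sizeof a[0]);

    insertion_sort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);             /* prints: 2 3 4 6 8 9 */
    printf("\n");

    printf("index of 6: %d\n", binary_search(a, n, 6));   /* prints: index of 6: 3 */
    return 0;
}

Running insertion_sort on the slide's sequence 8 2 4 9 3 6 produces 2 3 4 6 8 9; feeding the same values in reverse sorted order exercises the worst case described on the last slide.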

