Slide 1: CSE332: Data Abstractions
Lecture 3: Asymptotic Analysis
Tyler Robison, Summer 2010

Slide 2: Overview
- Asymptotic analysis: why we care, Big-Oh notation, examples, caveats & miscellany
- Evaluating an algorithm
- Big-Oh's family
- Recurrence relations

Slide 3: What do we want to analyze?
- Correctness
- Performance: the algorithm's speed or memory usage (our focus)
  - Change in speed as the input grows: n increases by 1; n doubles
  - Comparison between 2 algorithms
- Security
- Reliability

Slide 4: Gauging performance
- Uh, why not just run the program and time it?
- Too much variability; not reliable:
  - Hardware: processor(s), memory, etc.
  - OS, version of Java, libraries, drivers
  - Programs running in the background
  - Implementation dependent
  - Choice of input
- Timing doesn't really evaluate the algorithm; it evaluates one implementation in one very specific scenario

Slide 5: Gauging performance (cont.)
- At the core of CS is a backbone of theory & mathematics
  - Examine the algorithm itself, mathematically, not the implementation
  - Reason about performance as a function of n
  - Be able to mathematically prove things about performance
- Yet timing has its place
  - In the real world, we do want to know whether implementation A runs faster than implementation B on data set C
  - Ex: benchmarking graphics cards
  - We will do some timing in project 3 (and a bit in project 2)
- Evaluating an algorithm? Use asymptotic analysis
- Evaluating an implementation of hardware/software? Timing can be useful

Slide 6: Big-Oh
- Say we're given two run-time functions f(n) and g(n) for input n
- The definition: f(n) is in O(g(n)) iff there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0
- The idea: can we find an n0 such that g is always greater than f from there on out?
  - We are allowed to multiply g by a constant value (say, 10) to make g larger
- O(g(n)) is really a set of functions whose asymptotic behavior is less than or equal to that of g(n)
- Think of "f(n) is in O(g(n))" as f(n) ≤ g(n) (sort of), or as "g(n) is an upper bound for f(n)" (sort of)
- [Figure: plot of f and g against n; g stays above f for all n ≥ n0]

Slide 7: Big-Oh (cont.)
- The intuition: take f(n) and g(n), keep only the most significant term, and remove constant multipliers:
  - 5n + 3 → n
  - 7n + 0.5n^2 + 2000 → n^2
  - 300n + 12 + n log n → n log n
  - −n → ??? (what does it mean to have a negative run-time?)
- Then compare the simplified functions; if f(n) ≤ g(n), then f(n) is in O(g(n))
- Do NOT ignore constants that are not multipliers:
  - n^3 is O(n^2): FALSE
  - 3^n is O(2^n): FALSE
- When in doubt, refer to the definition
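The definition is easy to experiment with. Below is a minimal, hypothetical Java sketch (the class and method names are invented here, not from the lecture) that spot-checks a candidate (c, n0) pair over a range of n. Note that a finite scan can only refute a pair; an actual proof, like those on the next slides, needs algebra.

    import java.util.function.LongUnaryOperator;

    public class BigOhCheck {
        // Returns true if f(n) <= c * g(n) for every sampled n in [n0, limit].
        static boolean holds(LongUnaryOperator f, LongUnaryOperator g,
                             long c, long n0, long limit) {
            for (long n = n0; n <= limit; n++) {
                if (f.applyAsLong(n) > c * g.applyAsLong(n)) return false;
            }
            return true;
        }

        public static void main(String[] args) {
            // 5n + 3 is in O(n): c = 8, n0 = 1 works, since 5n + 3 <= 8n iff n >= 1.
            System.out.println(holds(n -> 5 * n + 3, n -> n, 8, 1, 1_000_000));    // true
            // n^3 is NOT in O(n^2): any fixed c fails eventually; c = 100 fails at n = 101.
            System.out.println(holds(n -> n * n * n, n -> n * n, 100, 1, 1_000));  // false
        }
    }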
Slide 8: Examples
True or false?
1. 4 + 3n is O(n) — True
2. n + 2 log n is O(log n) — False
3. log n + 2 is O(1) — False
4. n^50 is O(1.1^n) — True

Slide 9: Examples (cont.)
- For f(n) = 4n and g(n) = n^2, prove f(n) is in O(g(n))
- A valid proof is to find valid c and n0
- When n = 4, f = 16 and g = 16; this is the crossing-over point
- So take n0 = 4 and c = 1
- There are (infinitely) many possible choices: e.g., n0 = 78 and c = 42 works fine
- (The definition: f(n) is in O(g(n)) iff there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0)

Slide 10: Examples (cont.)
- For f(n) = n^4 and g(n) = 2^n, prove f(n) is in O(g(n))
- Possible answer: n0 = 20, c = 1 (recall the definition above)

Slide 11: What's with the c?
- To capture this notion of similar asymptotic behavior, we allow a constant multiplier (called c)
- Consider f(n) = 7n + 5 and g(n) = n
- These have the same asymptotic behavior (linear), so f(n) is in O(g(n)) even though f is always larger
- There is no positive n0 such that f(n) ≤ g(n) for all n ≥ n0; the c in the definition allows for that
- To prove f(n) is in O(g(n)), take c = 12 and n0 = 1

Slide 12: Big-Oh: Common Categories
From fastest to slowest:
- O(1)        constant (same as O(k) for constant k)
- O(log n)    logarithmic
- O(n)        linear
- O(n log n)  "n log n"
- O(n^2)      quadratic
- O(n^3)      cubic
- O(n^k)      polynomial (where k is a constant)
- O(k^n)      exponential (where k is any constant > 1)
Usage note: "exponential" does not mean "grows really fast"; it means "grows at a rate proportional to k^n for some k > 1". A savings account accrues interest exponentially (k = 1.01?).

Slide 13: Caveats
- Asymptotic complexity focuses on behavior for large n and is independent of any computer or coding trick, but results can be misleading
- Example: n^(1/10) vs. log n
  - Asymptotically, n^(1/10) grows more quickly
  - But the cross-over point is around 5 × 10^17
  - So if your input size is less than about 2^58, prefer the n^(1/10) algorithm

Slide 14: Caveats (cont.)
- Even for more common functions, comparing O() for small n values can be misleading
  - Quicksort: O(n log n) (expected)
  - Insertion sort: O(n^2) (expected)
  - Yet in practice insertion sort is faster for small n; we'll learn about these sorts later
- We usually talk about an algorithm being O(n) or whatever, but you can also prove bounds for entire problems
  - Ex: without parallelism, no algorithm can do better than log n in the worst case for finding an element in a sorted array

Slide 15: Miscellaneous
- It is not uncommon to evaluate the best case, the worst case, and the "expected" case
- So we say (3n^2 + 17) is in O(n^2)
- Confusingly, we also say/write: (3n^2 + 17) is O(n^2), or (3n^2 + 17) = O(n^2)
- But it's not "=" as in equality: we would never say O(n^2) = (3n^2 + 17)

Slide 16: Analyzing code ("worst case")
- Basic operations take "some amount of" constant time:
  - Arithmetic (fixed-width)
  - Assignment to a variable
  - Accessing one Java field or array index
  - Etc.
  - (This is an approximation of reality: a useful "lie")
- Consecutive statements → sum of times
- Conditionals → time of test plus the slower branch
- Loops → sum over the iterations
- Calls → time of the call's body
- Recursion → solve a recurrence equation (in a bit)

Slide 17: Analyzing code
What is the run-time of each loop, given the cost of its body? (A runnable sketch follows below.)
1. for(int i=0;i<n;i++) with an O(1) body → O(n)
2. for(int i=0;i<n;i++) with an O(i) body → O(n^2)
3. for(int i=0;i<n;i++) for(int j=0;j<n;j++) with an O(n) body → O(n^3)

Slide 18: Big-Oh's Family
- Big-Oh — upper bound: O(f(n)) is the set of all functions asymptotically less than or equal to f(n)
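As promised on the "Analyzing code" slide, here is a small, hypothetical Java sketch (names invented for illustration) that counts basic operations for the three loop shapes, modeling the O(i) and O(n) body costs as inner loops. The counts grow as n, n^2, and n^3, matching the answers above.

    public class LoopCounts {
        public static void main(String[] args) {
            int n = 100;

            long ops1 = 0;                               // loop with an O(1) body
            for (int i = 0; i < n; i++) ops1++;          // n iterations -> O(n)

            long ops2 = 0;                               // loop with an O(i) body
            for (int i = 0; i < n; i++)
                for (int j = 0; j < i; j++) ops2++;      // 0 + 1 + ... + (n-1) -> O(n^2)

            long ops3 = 0;                               // double loop with an O(n) body
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    for (int k = 0; k < n; k++) ops3++;  // n * n * n -> O(n^3)

            System.out.println(ops1 + " " + ops2 + " " + ops3);  // 100 4950 1000000
        }
    }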