UW CSE 332 - Asymptotic Analysis

CSE332: Data Abstractions
Lecture 3: Asymptotic Analysis
Tyler Robison, Summer 2010

Overview
- Asymptotic analysis
- Why we care
- Big-Oh notation
- Examples
- Caveats & miscellany
- Evaluating an algorithm
- Big-Oh's family
- Recurrence relations

What do we want to analyze?
- Correctness
- Performance: the algorithm's speed or memory usage (our focus)
  - Change in speed as the input grows: n increases by 1; n doubles
  - Comparison between 2 algorithms
- Security
- Reliability

Gauging performance
- Uh, why not just run the program and time it?
- Too much variability; not reliable:
  - Hardware: processor(s), memory, etc.
  - OS, version of Java, libraries, drivers
  - Programs running in the background
  - Implementation dependent
  - Choice of input
- Timing doesn't really evaluate the algorithm; it evaluates one implementation in one very specific scenario

Gauging performance (cont.)
- At the core of CS is a backbone of theory & mathematics
  - Examine the algorithm itself, mathematically, not the implementation
  - Reason about performance as a function of n
  - Be able to mathematically prove things about performance
- Yet timing has its place
  - In the real world, we do want to know whether implementation A runs faster than implementation B on data set C
  - Ex: benchmarking graphics cards
  - We will do some timing in project 3 (and a bit in project 2)
- Evaluating an algorithm? Use asymptotic analysis
- Evaluating an implementation of hardware/software? Timing can be useful

Big-Oh
- Say we're given 2 run-time functions f(n) & g(n) for input n
- The definition: f(n) is in O(g(n)) iff there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0
- The idea: can we find an n0 such that g is always greater than f from there on out?
  - We are allowed to multiply g by a constant value (say, 10) to make g larger
  - O(g(n)) is really a set of functions whose asymptotic behavior is less than or equal to that of g(n)
  - Think of "f(n) is in O(g(n))" as f(n) ≤ g(n) (sort of), or as "g(n) is an upper bound for f(n)" (sort of)
[Figure: f(n) and g(n) plotted against n; g stays above f for all n ≥ n0]

Big-Oh (cont.)
- The intuition: take functions f(n) & g(n), keep only the most significant term, and remove constant multipliers:
  - 5n + 3 → n
  - 7n + 0.5n^2 + 2000 → n^2
  - 300n + 12 + n log n → n log n
  - -n → ??? (What does it mean to have a negative run-time?)
- Then compare the functions; if f(n) ≤ g(n), then f(n) is in O(g(n))
- Do NOT ignore constants that are not multipliers:
  - n^3 is O(n^2): FALSE
  - 3^n is O(2^n): FALSE
- When in doubt, refer to the definition

Examples
True or false?
1. 4 + 3n is O(n)            True
2. n + 2 log n is O(log n)   False
3. log n + 2 is O(1)         False
4. n^50 is O(1.1^n)          True
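To see the definition in action, here is a minimal Java sketch (an addition, not from the slides; the class and method names BigOhCheck and holds are made up for illustration). It numerically spot-checks a candidate witness pair (c, n0) for example 1 above, f(n) = 4 + 3n and g(n) = n. A finite scan cannot prove the bound for all n; the real proof is the algebra in the comment, but the scan quickly exposes a bad choice of constants.

    import java.util.function.LongUnaryOperator;

    public class BigOhCheck {
        // Returns true if f(n) <= c*g(n) holds for every n in [n0, limit].
        // A finite scan is only a sanity check, not a proof for all n.
        static boolean holds(LongUnaryOperator f, LongUnaryOperator g,
                             long c, long n0, long limit) {
            for (long n = n0; n <= limit; n++) {
                if (f.applyAsLong(n) > c * g.applyAsLong(n)) return false;
            }
            return true;
        }

        public static void main(String[] args) {
            LongUnaryOperator f = n -> 4 + 3 * n;  // f(n) = 4 + 3n
            LongUnaryOperator g = n -> n;          // g(n) = n
            // c = 7, n0 = 1 works: 4 + 3n <= 7n  iff  4 <= 4n  iff  n >= 1.
            System.out.println(holds(f, g, 7, 1, 1_000_000));  // prints: true
        }
    }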
Examples (cont.)
- For f(n) = 4n & g(n) = n^2, prove f(n) is in O(g(n))
- A valid proof is to find valid c & n0
  - When n = 4, f = 16 & g = 16; this is the crossing-over point
  - Say n0 = 4 and c = 1
  - There are (infinitely) many possible choices; ex: n0 = 78 and c = 42 works fine
- (Recall the definition: f(n) is in O(g(n)) iff there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.)

Examples (cont.)
- For f(n) = n^4 & g(n) = 2^n, prove f(n) is in O(g(n))
- Possible answer: n0 = 20, c = 1

What's with the c?
- To capture this notion of similar asymptotic behavior, we allow a constant multiplier (called c)
- Consider: f(n) = 7n + 5 and g(n) = n
- These have the same asymptotic behavior (linear), so f(n) is in O(g(n)) even though f is always larger
  - There is no positive n0 such that f(n) ≤ g(n) for all n ≥ n0
  - The c in the definition allows for that
  - To prove f(n) is in O(g(n)), take c = 12 and n0 = 1

Big-Oh: Common Categories
From fastest to slowest:
  O(1)        constant (same as O(k) for constant k)
  O(log n)    logarithmic
  O(n)        linear
  O(n log n)  "n log n"
  O(n^2)      quadratic
  O(n^3)      cubic
  O(n^k)      polynomial (where k is a constant)
  O(k^n)      exponential (where k is any constant > 1)
Usage note: "exponential" does not mean "grows really fast"; it means "grows at a rate proportional to k^n for some k > 1". A savings account accrues interest exponentially (k = 1.01?).

Caveats
- Asymptotic complexity focuses on behavior for large n and is independent of any computer/coding trick, but results can be misleading
- Example: n^(1/10) vs. log n
  - Asymptotically, n^(1/10) grows more quickly
  - But the "cross-over" point is around 5 × 10^17
  - So if your input size is less than 2^58, prefer n^(1/10)

Caveats (cont.)
- Even for more common functions, comparing O() for small n values can be misleading
  - Quicksort: O(n log n) (expected)
  - Insertion sort: O(n^2) (expected)
  - Yet in reality insertion sort is faster for small n's
  - We'll learn about these sorts later
- We usually talk about an algorithm being O(n) or the like, but you can also prove bounds for entire problems
  - Ex: no algorithm can do better than log n in the worst case for finding an element in a sorted array, without parallelism

Miscellaneous
- It is not uncommon to evaluate for:
  - Best case
  - Worst case
  - "Expected" case
- So we say (3n^2 + 17) is in O(n^2)
- Confusingly, we also say/write:
  - (3n^2 + 17) is O(n^2)
  - (3n^2 + 17) = O(n^2)
- But it's not "=" as in equality: we would never say O(n^2) = (3n^2 + 17)

Analyzing code ("worst case")
Basic operations take "some amount of" constant time:
- Arithmetic (fixed-width)
- Assignment to a variable
- Access of one Java field or array index
- Etc.
(This is an approximation of reality: a useful "lie".)

  Consecutive statements   Sum of times
  Conditionals             Time of test plus the slower branch
  Loops                    Sum over iterations
  Calls                    Time of the call's body
  Recursion                Solve a recurrence equation (in a bit)

Analyzing code (cont.)
What is the run-time of each of the following, given the cost of the loop body? (Answers on the right; a compilable version appears after this slide.)
1. for(int i=0;i<n;i++)                        with an O(1) body:   O(n)
2. for(int i=0;i<n;i++)                        with an O(i) body:   O(n^2)
3. for(int i=0;i<n;i++) for(int j=0;j<n;j++)   with an O(n) body:   O(n^3)
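Written out as compilable Java (a sketch; the class name LoopCosts and the counter-based bodies are illustrative stand-ins for the O(1)/O(i)/O(n) work the slide assumes), the three patterns count exactly how many basic operations they perform:

    public class LoopCosts {
        // 1. O(1) body executed n times: O(n) total.
        static long pattern1(int n) {
            long ops = 0;
            for (int i = 0; i < n; i++) {
                ops++;                                 // O(1) work per iteration
            }
            return ops;                                // ops == n
        }

        // 2. O(i) body inside a loop to n: 0 + 1 + ... + (n-1) = O(n^2) total.
        static long pattern2(int n) {
            long ops = 0;
            for (int i = 0; i < n; i++) {
                for (int k = 0; k < i; k++) ops++;     // O(i) work per iteration
            }
            return ops;                                // ops == n(n-1)/2
        }

        // 3. O(n) body inside a doubly nested loop: O(n^3) total.
        static long pattern3(int n) {
            long ops = 0;
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < n; j++) {
                    for (int k = 0; k < n; k++) ops++; // O(n) work per iteration
                }
            }
            return ops;                                // ops == n^3
        }
    }

Counting operations makes the sums visible: pattern 2 performs 0 + 1 + ... + (n-1) = n(n-1)/2 operations, which is exactly the O(n^2) answer on the slide.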
Big-Oh's Family
- Big-Oh: upper bound: O(f(n)) is the set of all functions asymptotically less than or equal to f(n)
  - g(n) is in O(f(n)) if there exist constants c and n0 such that g(n) ≤ c·f(n) for all n ≥ n0
- Big-Omega: lower bound: Ω(f(n)) is the set of all functions asymptotically greater than or equal to f(n)
  - g(n) is in Ω(f(n)) if there exist constants c and n0 such that g(n) ≥ c·f(n) for all n ≥ n0
- Big-Theta: tight bound: Θ(f(n)) is the set of all functions asymptotically equal to f(n)
  - Intersection of O(f(n)) and Ω(f(n)) (use …
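As a worked instance of how O and Ω combine into Θ (an added example, reusing 3n^2 + 17 from the Miscellaneous slide; the constants shown are one valid choice among many):

  3n^2 + 17 ≤ 4n^2 for all n ≥ 5   (c = 4, n0 = 5, since n^2 ≥ 17 once n ≥ 5),  so 3n^2 + 17 is in O(n^2)
  3n^2 + 17 ≥ 3n^2 for all n ≥ 1   (c = 3, n0 = 1),                             so 3n^2 + 17 is in Ω(n^2)

Because 3n^2 + 17 is in both O(n^2) and Ω(n^2), it is in Θ(n^2).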

