MSU CSE 830 - Lecture 2

Asymptotic Analysis
• Motivation
• Definitions
• Common complexity functions
• Example problems

Motivation
• Let's agree that we are interested in performing a worst-case analysis of algorithms.
• Do we need to do an exact analysis?

Exact Analysis is Hard!

Even Harder Exact Analysis

Simplifications
• Ignore constants
• Asymptotic efficiency

Why ignore constants?
• Implementation issues (hardware, code optimizations) can speed up an algorithm by constant factors.
  – We want to understand how effective an algorithm is independent of these factors.
• Simplification of analysis
  – It is much easier to analyze an algorithm if we focus only on n^2 rather than worrying about 3.7·n^2 or 3.9·n^2.

Asymptotic Analysis
• We focus on large n (the infinite tail running out toward infinity), ignoring small values of n.
• Usually, an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.

"Big Oh" Notation
• O(f(n)) = { g(n) : there exist positive constants c and n0 such that 0 <= g(n) <= c·f(n) for all n >= n0 }
• What are the roles of the two constants?
  – n0: the threshold; the bound only has to hold once n >= n0, so small inputs are ignored.
  – c: the scaling factor; g(n) only has to be bounded by some constant multiple of f(n), not by f(n) itself.

Set Notation Comment
• O(f(n)) is a set of functions.
• However, we will use one-way equalities like n = O(n^2).
• This really means that the function n belongs to the set of functions O(n^2).
• Incorrect notation: O(n^2) = n.
• Analogy: "a dog is an animal," but not "an animal is a dog."

Three Common Sets
• g(n) = O(f(n)) means c·f(n) is an upper bound on g(n).
• g(n) = Ω(f(n)) means c·f(n) is a lower bound on g(n).
• g(n) = Θ(f(n)) means c1·f(n) is an upper bound on g(n) and c2·f(n) is a lower bound on g(n).
These bounds hold for all inputs beyond some threshold n0.

(Figures: graphs illustrating the O(f(n)), Ω(f(n)), and Θ(f(n)) bounds, and an example of a function bounded above by a member of O(f(n)) and below by a member of Ω(f(n)).)

Example Function
f(n) = 3n^2 - 100n + 6

Quick Questions
For each statement below, what values of c and n0 witness it (where it holds)?
• 3n^2 - 100n + 6 = O(n^2)
• 3n^2 - 100n + 6 = O(n^3)
• 3n^2 - 100n + 6 ≠ O(n)
• 3n^2 - 100n + 6 = Ω(n^2)
• 3n^2 - 100n + 6 ≠ Ω(n^3)
• 3n^2 - 100n + 6 = Ω(n)
• 3n^2 - 100n + 6 = Θ(n^2)?
• 3n^2 - 100n + 6 = Θ(n^3)?
• 3n^2 - 100n + 6 = Θ(n)?

"Little Oh" Notation
• o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 <= f(n) < c·g(n) for all n >= n0 }
  – Intuitively, lim (n → ∞) f(n)/g(n) = 0.
  – f(n) < c·g(n) for every choice of c > 0, once n is large enough.

Two Other Sets
• g(n) = o(f(n)) means c·f(n) is a strict upper bound on g(n).
• g(n) = ω(f(n)) means c·f(n) is a strict lower bound on g(n).
These bounds hold for all inputs beyond some threshold n0, where n0 now depends on c.

Common Complexity Functions
(Running times assuming roughly one basic operation per microsecond.)

Complexity   n=10          n=20          n=30          n=40          n=50          n=60
n            1×10^-5 sec   2×10^-5 sec   3×10^-5 sec   4×10^-5 sec   5×10^-5 sec   6×10^-5 sec
n^2          0.0001 sec    0.0004 sec    0.0009 sec    0.0016 sec    0.0025 sec    0.0036 sec
n^3          0.001 sec     0.008 sec     0.027 sec     0.064 sec     0.125 sec     0.216 sec
n^5          0.1 sec       3.2 sec       24.3 sec      1.7 min       5.2 min       13.0 min
2^n          0.001 sec     1.0 sec       17.9 min      12.7 days     35.7 years    366 centuries
3^n          0.059 sec     58 min        6.5 years     3855 cent.    2×10^8 cent.  1.3×10^13 cent.
log2 n       3×10^-6 sec   4×10^-6 sec   5×10^-6 sec   5×10^-6 sec   6×10^-6 sec   6×10^-6 sec
n log2 n     3×10^-5 sec   9×10^-5 sec   0.0001 sec    0.0002 sec    0.0003 sec    0.0004 sec

Complexity Graphs
(Figures: growth of log(n), n, n·log(n), n^2, n^3, and 10n on linear axes.)

Complexity Graphs (log scale)
(Figure: the same comparison on a logarithmic scale, with functions such as n, 10n, 20n, n^1.1, n^2, and n^3.)

Logarithms
Properties:
• b^x = y  ⇔  x = log_b y
• b^(log_b x) = x
• log(a^b) = b·log(a)
• log_a x = c·log_b x, where c = 1/log_b a

Questions:
• How do log_a n and log_b n compare?
• How can we compare n·log n with n^2?
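A quick numeric check can make the two logarithm questions above concrete. The short Python sketch below is not part of the original slides; the helper names (log_ratio, quadratic_over_nlogn) are chosen here only for illustration. It shows that log_a n and log_b n differ by the constant factor 1/log_b a, while n^2 / (n·log n) keeps growing, i.e., n·log n = o(n^2).

    import math

    # log_a(n) / log_b(n) equals the constant 1 / log_b(a), independent of n.
    def log_ratio(n, a=2, b=10):
        return math.log(n, a) / math.log(n, b)

    # n^2 / (n * log2(n)) simplifies to n / log2(n), which grows without bound,
    # so n*log(n) belongs to o(n^2).
    def quadratic_over_nlogn(n):
        return (n * n) / (n * math.log2(n))

    for n in (10, 100, 10_000, 1_000_000):
        print(n, round(log_ratio(n), 4), round(quadratic_over_nlogn(n), 1))
    # The first ratio stays near 3.3219 (= 1 / log10(2)); the second keeps growing.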
Example Problems
1. What does it mean if f(n) ∈ O(g(n)) and g(n) ∈ O(f(n))?
2. Is 2^(n+1) = O(2^n)?  Is 2^(2n) = O(2^n)?
3. Does f(n) = O(f(n))?
4. If f(n) = O(g(n)) and g(n) = O(h(n)), can we say f(n) = O(h(n))?
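As a rough sanity check on problems 1-4 (not from the original lecture), the Python sketch below scans a finite range of n for witness constants c and n0. The helper name check_big_oh and the specific constants are assumptions of this sketch, and a finite scan is only evidence, not a proof.

    def f(n):
        # The example function from the slides: f(n) = 3n^2 - 100n + 6.
        return 3 * n * n - 100 * n + 6

    def check_big_oh(g, bound, c, n0, n_max=10_000):
        # Check 0 <= g(n) <= c * bound(n) for every n0 <= n <= n_max.
        # Passing this finite scan supports g(n) = O(bound(n)); it does not prove it.
        return all(0 <= g(n) <= c * bound(n) for n in range(n0, n_max + 1))

    print(check_big_oh(f, lambda n: n * n, c=3, n0=34))    # True: c=3, n0=34 witness f(n) = O(n^2)
    print(check_big_oh(f, lambda n: n, c=1000, n0=34))     # False; in fact no c works, so f(n) != O(n)
    print(check_big_oh(lambda n: 2 ** (n + 1), lambda n: 2 ** n,
                       c=2, n0=1, n_max=300))              # True: 2^(n+1) = 2 * 2^n, so 2^(n+1) = O(2^n)
    print(check_big_oh(lambda n: 2 ** (2 * n), lambda n: 2 ** n,
                       c=10 ** 6, n0=1, n_max=300))        # False: 2^(2n) / 2^n = 2^n is unbounded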

