UCF COP 3502H - Computational Complexity

Computational Complexity

[Overview: Algorithm → Analysis → Better Algorithm. Topics: Computational Complexity, Asymptotic Notation, Big-Oh Notation.]

Algorithm Analysis

Algorithm - a clearly specified set of instructions that the computer will follow to solve a problem.

Algorithm Analysis - determining the amount of resources that the algorithm will require, typically in terms of time and space.

Areas of study include:
- Estimation techniques for determining the running time of an algorithm.
- Techniques to reduce the running time of an algorithm.
- A mathematical framework for the accurate determination of the running time of an algorithm.

Algorithm Analysis (continued)
- The running time of an algorithm is a function of the size of the input. Example: it takes longer to sort 1000 numbers than it does to sort 10 numbers.
- The value of this function depends upon many things, including:
  1. The speed of the host computer.
  2. The size of the host computer.
  3. The compilation process (the quality of the compiler-generated code).
  4. The quality of the original source code which implements the algorithm.

[Figure 5.1 (corrected): Running time vs. input size for small input sets. Curves shown: constant, linear, N log N, quadratic, cubic.]

Computational Complexity

When comparing two functions F(N) and G(N), it does not make sense to state simply that F < G, F = G, or G < F. Example: at some point x, F may be smaller than G, yet at some other point y, F may be equal to or greater than G. Instead, the growth rates of the functions need to be compared.

Definitions (based on the growth rate of the function):
1. constant function – dominant term is a constant c
2. logarithmic function – dominant term is log N
3. log-squared function – dominant term is log^2 N
4. linear function – dominant term is N
5. N log N function – dominant term is N log N
6. quadratic function – dominant term is N^2
7. cubic function – dominant term is N^3
8. exponential function – dominant term is 2^N

There is a three-fold reason for basing the analysis on the growth rate of the function rather than its specific value at some point:

1. For sufficiently large values of N, the value of the function is determined primarily by its dominant term ("sufficiently large" varies by function). Example: consider the cubic function 15N^3 + 20N^2 - 10N + 4. For a large value of N, say 1000, the value of this function is 15,019,990,004, of which 15,000,000,000 is due entirely to the N^3 term. Using only the N^3 term to estimate the value of this function introduces an error of only about 0.1%, which is typically close enough for estimation purposes. (A quick numeric check of this appears in the C sketch following the Big-Oh definitions below.)
2. Constants associated with the dominant term are usually not meaningful across different machines (though they may be for identically growing functions).
3. Small values of N are generally not important.

Big-Oh Notation
- Used to represent the growth rate of a function.
- Allows algorithm designers to establish a relative order among functions by comparing their dominant terms.
- Denoted as O(N^2), read as "order N squared".
- This notation and related notation are examined more formally a bit later (Section 5.4).

Definitions (based on the Big-Oh of the function):
1. constant function – O(1)
2. logarithmic function – O(log N)
3. log-squared function – O(log^2 N)
4. linear function – O(N)
5. N log N function – O(N log N)
6. quadratic function – O(N^2)
7. cubic function – O(N^3)
8. exponential function – O(2^N)
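Before turning to the figures, the two numeric claims above are easy to check directly. The short C program below is a sketch added for illustration (it is not part of the original notes): it evaluates the cubic 15N^3 + 20N^2 - 10N + 4 at N = 1000 to confirm that the N^3 term accounts for nearly all of the value, and it tabulates several of the growth-rate functions from the list above so their different behavior for growing N is visible. Compile with a C99 compiler and link the math library (-lm).

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Dominant-term check: f(N) = 15N^3 + 20N^2 - 10N + 4 at N = 1000. */
    long long N   = 1000;
    long long f   = 15*N*N*N + 20*N*N - 10*N + 4;   /* full function     */
    long long dom = 15*N*N*N;                        /* dominant N^3 term */
    double err    = 100.0 * (double)(f - dom) / (double)f;

    printf("f(%lld)         = %lld\n", N, f);
    printf("15*N^3 alone   = %lld\n", dom);
    printf("relative error = %.2f%%  (about 0.1%%, as claimed above)\n\n", err);

    /* A few of the growth-rate functions from the list above.
       2^N is omitted because it overflows almost immediately.  */
    printf("%8s %10s %12s %16s %20s\n", "N", "log N", "N log N", "N^2", "N^3");
    for (long long n = 10; n <= 100000; n *= 10) {
        double lg = log2((double)n);
        printf("%8lld %10.1f %12.0f %16lld %20lld\n",
               n, lg, (double)n * lg, n * n, n * n * n);
    }
    return 0;
}
```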
[Figure 5.2 (corrected): Running time for moderate size input sets. Curves shown: constant, linear, N log N, quadratic, cubic.]

Examining Figure 5.1 (running time vs. input size for small input sets) shows that there are points at which one curve is initially better (produces a shorter running time) than another, even though eventually (as the input set size grows) this is no longer the case. For example, the quadratic curve is initially better than the N log N curve, but as N begins to grow the quadratic curve produces the longer running time.

For small input sets it is difficult to compare the various functions because the values of the leading constants become very significant. Example: the function F(N) = N + 2500 is always larger than the function G(N) = N^2 when N is less than 50. Eventually, the linear function F(N) will always be smaller than the quadratic function G(N) once N gets sufficiently large to outweigh the constant; in this case, when N = 51, F(N) = 2551 and G(N) = 2601.

Asymptotic Notation

Big-Oh Notation

Definition: Let p(n) and q(n) be two nonnegative functions. The function p(n) is asymptotically bigger than the function q(n) [p(n) asymptotically dominates q(n)] iff

    lim_{n→∞} q(n)/p(n) = 0

The function q(n) is asymptotically smaller than p(n) iff p(n) is asymptotically bigger than q(n). The functions p(n) and q(n) are asymptotically equal iff neither is asymptotically bigger than the other.

Example 1: Let p(n) = 3n^2 + 2n + 6 and q(n) = 10n + 7.
Divide the numerator and denominator by n^2 (to reduce the dominant term to a constant):

    lim_{n→∞} (10n + 7)/(3n^2 + 2n + 6) = lim_{n→∞} (10/n + 7/n^2)/(3 + 2/n + 6/n^2) = 0/3 = 0

Thus, 3n^2 + 2n + 6 is asymptotically bigger than 10n + 7. Similarly, 10n + 7 is asymptotically smaller than 3n^2 + 2n + 6.

Example 2: Let p(n) = 6n + 2 and q(n) = 12n + 6. Dividing by n:

    lim_{n→∞} (6n + 2)/(12n + 6) = lim_{n→∞} (6 + 2/n)/(12 + 6/n) = 6/12 = 1/2
    lim_{n→∞} (12n + 6)/(6n + 2) = lim_{n→∞} (12 + 6/n)/(6 + 2/n) = 12/6 = 2

Neither limit is 0, so neither function is asymptotically bigger than the other: 6n + 2 and 12n + 6 are asymptotically equal.

Try: Show that 8n^4 + 9n^2 is asymptotically bigger than 100n^3 - 3. Also do the same for 2n^2 + 3n and 83n. Dividing by n^4:

    lim_{n→∞} (100n^3 - 3)/(8n^4 + 9n^2) = lim_{n→∞} (100/n - 3/n^4)/(8 + 9/n^2) = 0/8 = 0

A similar technique will show the same is true for the functions 2n^2 + 3n and 83n. (A numeric illustration of these ratios appears in the sketch at the end of these notes.)

Term       Name
1          Constant
log N      Logarithmic
N          Linear
N log N    N log N
N^2        Quadratic
N^3        Cubic
2^N        Exponential
N!         Factorial

Asymptotic notation describes the behavior of the time or space complexity for large instance characteristics. When the instance characteristic is described by a single variable, say n, asymptotic notation describes the complexity using a single term: the asymptotically biggest term in the function (the step count of the algorithm).

Big-Oh Notation (O)

Notation: f(n) = O(g(n)) [read as "f(n) is big-oh of g(n)"] means that f(n) is asymptotically smaller than or equal to g(n).

Meaning: g(n) establishes an upper bound on f(n). The asymptotic growth rate of the function f(n) is bounded from above by g(n).

[Figure: running time t versus n, with the curve c·g(n) lying above f(n) for large n; g(n) is an upper bound on f(n).]
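To make the upper-bound reading concrete: one common way to express it (consistent with the c·g(n) curve in the figure above) is that some constant multiple c·g(n) eventually stays at or above f(n). The C sketch below is not part of the original notes; it takes f(n) = 3n^2 + 2n + 6 from Example 1, the candidate bound g(n) = n^2, and witness constants c = 4 and n0 = 4 chosen here for illustration, and spot-checks that f(n) <= c*g(n) for a range of n >= n0, which is what f(n) = O(n^2) asserts.

```c
#include <stdio.h>

/* f(n) = 3n^2 + 2n + 6 (from Example 1) and the candidate bound g(n) = n^2. */
static long long f(long long n) { return 3*n*n + 2*n + 6; }
static long long g(long long n) { return n*n; }

int main(void)
{
    /* Witness constants chosen for this sketch, not taken from the notes. */
    const long long c = 4, n0 = 4;
    int holds = 1;

    /* Spot-check f(n) <= c*g(n) for a sample of n >= n0 (not a proof). */
    for (long long n = n0; n <= 1000000; n = (n < 100 ? n + 1 : n * 10)) {
        if (f(n) > c * g(n)) {
            holds = 0;
            printf("bound fails at n = %lld\n", n);
        }
    }
    if (holds)
        printf("f(n) <= %lld*g(n) held for every n checked (n >= %lld), "
               "consistent with f(n) = O(n^2).\n", c, n0);
    return 0;
}
```

Algebraically, 3n^2 + 2n + 6 <= 4n^2 exactly when n^2 >= 2n + 6, which holds for every n >= 4, so these particular witnesses work; any larger constant c would work as well.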


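Returning to the limit definition of asymptotic domination, the same conclusions can be illustrated numerically: the ratio q(n)/p(n) should shrink toward 0 when p(n) is asymptotically bigger than q(n), and settle at a nonzero constant when the two functions are asymptotically equal. The sketch below is not part of the original notes; the helper names are mine, and the function pairs are the ones from Example 1, Example 2, and the "Try" exercise above.

```c
#include <stdio.h>

/* Ratios q(n)/p(n) for the function pairs used in the examples above. */
static double ratio_ex1(double n)  /* q = 10n + 7,    p = 3n^2 + 2n + 6 */
{
    return (10.0*n + 7.0) / (3.0*n*n + 2.0*n + 6.0);
}

static double ratio_ex2(double n)  /* q = 6n + 2,     p = 12n + 6       */
{
    return (6.0*n + 2.0) / (12.0*n + 6.0);
}

static double ratio_try(double n)  /* q = 100n^3 - 3, p = 8n^4 + 9n^2   */
{
    return (100.0*n*n*n - 3.0) / (8.0*n*n*n*n + 9.0*n*n);
}

int main(void)
{
    printf("%10s %12s %12s %12s\n", "n", "Example 1", "Example 2", "Try");
    for (double n = 10.0; n <= 1.0e6; n *= 10.0) {
        printf("%10.0f %12.6f %12.6f %12.6f\n",
               n, ratio_ex1(n), ratio_ex2(n), ratio_try(n));
    }
    /* Example 1 and the Try ratio head toward 0 (the denominator is
       asymptotically bigger); Example 2 settles at 1/2 (asymptotically equal). */
    return 0;
}
```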