Algorithm Analysis

Purpose
- Why bother analyzing code; isn't getting it to work enough?
- Estimate time and memory in the average case and the worst case
- Identify bottlenecks, i.e., where to reduce time
- Speed up critical algorithms

Algorithm
- Algorithm: a well-defined computational procedure for transforming inputs to outputs
- Problem: specifies the desired input-output relationship
- Correct algorithm: produces the correct output for every possible input, in finite time; it solves the problem

Algorithm Analysis
- Predict the resource utilization of an algorithm: running time and memory
- Dependent on architecture: serial, parallel, quantum

Factors for Algorithmic Design Consideration
- Run time
- Space/memory
- Computational model
- Suitability to the problem's application domain (contextual relevance)
- Scalability
- Guaranteeing correctness
- Deterministic vs. randomized
- System considerations: cache, disk, network speeds, etc.

Model That We Will Assume
- Simple serial computing model
[Figure: block diagram of a serial computer — CPU (control, ALU, registers), RAM (main memory), secondary storage (disk), input devices (keyboard, mouse), and output devices (monitor, printer) connected through I/O]

Other Models
- Multi-core models
- Co-processor model
- Multi-processor model
- Shared-memory machine: multiple processors sharing common RAM; memory is cheap ($20 per GB)
  Example: Cray XMT supercomputer — up to 64 TB (65,536 GB) of shared memory, up to 8,000 processors, 128 independent threads per processor, ~$150M
- Distributed-memory model (see www.top500.org)
  Fastest supercomputer as of June 2008: IBM Roadrunner at Los Alamos National Lab
  - 1.026 PetaFlop/s (1.026 x 10^15 floating-point operations per second)
  - Built from IBM QS22 blades (Cell processors, as used in the Sony PlayStation 3)
  - Number of processing cores: 122,400
  - Aggregate memory: 98 terabytes
  - Price tag: only $100 million!
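Under the simple serial model above, analysis reduces to counting primitive operations, each costing one time unit. As a warm-up, here is a hypothetical linear-search function (my own example, not from the slides) annotated with worst-case operation counts in that style:

```cpp
#include <cassert>

// Linear search under the simple serial model: each assignment,
// comparison, and increment counts as one primitive operation.
// Counts shown are for the worst case (key not present).
int linearSearch(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)   // 1 init + (n+1) tests + n increments
        if (a[i] == key)          // n comparisons
            return i;
    return -1;                    // 1 return
}
// Worst case: T(n) = 1 + (n+1) + n + n + 1 = 3n + 3 operations.
```

The exact constants depend on what we choose to count, which is why the deck soon switches from exact T(n) expressions to rates of growth.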
  Supercomputer speed is measured in floating-point operations per second (FLOPS).

What to Analyze
- Running time T(N); N (or sometimes n) is typically the size of the input
  - Sorting? Multiplying two integers? Multiplying two matrices? Traversing a graph?
- T(N) measures the number of primitive operations performed, e.g., addition, multiplication, comparison, assignment

Example for Calculating T(n)

    int sum(int n)                    // 0 operations
    {
        int partialSum;               // 0
        partialSum = 0;               // 1
        for (int i = 1; i <= n; i++)  // 1 + (n+1) + n
            partialSum += i * i * i;  // n * (1 + 1 + 2)
        return partialSum;            // 1
    }

    T(n) = 6n + 4

Example: A Less Precise but Equally Informative Analysis

    int sum(int n)                    // 0 operations
    {
        int partialSum;               // 0
        partialSum = 0;               // 1
        for (int i = 1; i <= n; i++)  // n
            partialSum += i * i * i;  // 1
        return partialSum;            // 1
    }

    T(n) grows like n

What happens if:
- N is small (< 100)?
- N is medium (< 1,000)?
- N is large (> 1,000,000)?
Asymptotically, curves matter more than absolute values!

Compare T1(N) vs. T2(N):
- T1(N) = 3N^2 + 100N + 1
- T2(N) = 50N^2 + N + 500
Do constants matter?

Rate of Growth
- Exact expressions for T(N) become hard to compare as N grows
- Rate of growth: the asymptotic behavior of T(N) as N gets big
- Usually expressed as the fastest-growing term of T(N), dropping constant coefficients
  E.g., T(N) = 3N^2 + N + 1 → Θ(N^2)

Algorithmic Notation & Analysis
- Asymptotic notations that help us quantify and compare costs: Big-O O(), Omega Ω(), little-o o(), little-omega ω(), Theta Θ()
- Worst case, average case, best case (for algorithms): describe the input
- Lower bound, upper bound, tight bound, "optimality": describe the problem and its solution

Asymptotic Notation
- Theta Θ(): tight bound
- Big-oh O(): upper bound
- Omega Ω(): lower bound
- Off-class reference in eLearning: see the "Asymptotic Handout" under the "Web Links" tab

Big-oh: O()
- f(N) = O(g(N)) if there exist positive constants c and n0 such that f(N) ≤ c*g(N) whenever N > n0
- An asymptotic upper bound for f(N), possibly tight

Example for Big-oh
- E.g., f(n) = 10n. Is f(n) = O(n^2)?
- Steps: if f(n) = O(g(n)), then f(n) ≤ c*g(n) for all n > n0 — just show that such c and n0 exist
- Here c = 1 and n0 = 10 work, since 10n ≤ n^2 for all n > 10
- (Remember:
  always try to find the smallest possible n0.)

[Figure: plot of g(n) = n^2 and f(n) = 10n; the curves cross at the breakeven point n0 = 10, beyond which n^2 stays above 10n]

Ponder This
- If f(n) = 10n, then: Is f(n) = O(n)? Is f(n) = O(n^2)? O(n^3)? O(n^4)? O(2^n)?
- If all of the above, then what is the best (most informative) answer?

Little-oh: o()
- f(N) = o(g(N)) if for every positive constant c there exists an n0 such that f(N) < c*g(N) whenever N > n0
  (Unlike O(), the inequality must hold for every c, not just one — that is what makes the bound strict)
- E.g., f(n) = 10n, g(n) = n^2 → f(n) = o(g(n))
- A strict upper bound for f(N)

Big-omega: Ω()
- f(N) = Ω(g(N)) if there exist positive constants c and n0 such that f(N) ≥ c*g(N) whenever N > n0
- E.g., f(n) = n^2, g(n) = n log n → f(n) = Ω(g(n))
- A lower bound for f(N), possibly tight

Little-omega: ω()
- f(N) = ω(g(N)) if for every positive constant c there exists an n0 such that f(N) > c*g(N) whenever N > n0
- E.g., f(n) = 100n^2, g(n) = n → f(n) = ω(g(n))
- A strict lower bound for f(N)

Theta: Θ()
- f(N) = Θ(h(N)) if there exist positive constants c1, c2, and n0 such that c1*h(N) ≤ f(N) ≤ c2*h(N) for all N > n0
- Equivalently: f(N) = Θ(h(N)) if and only if f(N) = O(h(N)) and f(N) = Ω(h(N))
- A tight bound for f(N)

Example (for Theta)
- f(n) = n^2 - 2n. f(n) = Θ(?)
- Guess: f(n) = Θ(n^2)
- Verification: can we find valid c1, c2, and n0 such that c1*n^2 ≤ f(n) ≤ c2*n^2?
- Yes: c1 = 1/2, c2 = 1, n0 = 4 work, since n^2 - 2n ≤ n^2 always, and n^2 - 2n ≥ n^2/2 whenever n ≥ 4

The Asymptotic Notations

    Notation   Purpose                       Definition
    O(g(n))    Upper bound, possibly tight   f(n) ≤ c*g(n)
    o(g(n))    Upper bound, strict           f(n) < c*g(n)  (for every c)
    Ω(g(n))    Lower bound, possibly tight   f(n) ≥ c*g(n)
    ω(g(n))    Lower bound, strict           f(n) > c*g(n)  (for every c)
    Θ(g(n))    Tight bound                   c1*g(n) ≤ f(n) ≤ c2*g(n)

Asymptotic Growths
[Figure: three sketches, each for n > n0 — f(n) = O(g(n)): f(n) eventually stays below c*g(n); f(n) = Ω(g(n)): f(n) eventually stays above c*g(n); f(n) = Θ(g(n)): f(n) eventually stays between c1*g(n) and c2*g(n)]

Rules of Thumb While Using Asymptotic Notations
- Algorithm's complexity: when asked to analyze an algorithm's complexity:
  - 1st preference: whenever possible, use Θ() (tight bound)
  - 2nd preference: if not, use O() or o() (upper bound)
  - 3rd preference: if not, use Ω() or ω() (lower bound)
- Always express an algorithm's complexity in terms of its worst case, unless otherwise specified
- Note: the worst case
  can itself be expressed in any of the asymptotic notations: Θ, O, Ω, ω, or o.

Rules of Thumb While Using Asymptotic Notations (continued)
- Problem's complexity: ways to answer a problem's complexity:
  - Q1) This problem is at least as hard as ...? Use a lower bound here
  - Q2) This problem cannot be harder than ...? Use an upper bound here
  - Q3) This problem is as hard as ...? Use a tight bound here
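The constants claimed in the Big-oh and Theta examples above can be spot-checked numerically. The sketch below (my own check, not part of the slides) encodes "10n ≤ 1*n^2 for n > 10" and "(1/2)n^2 ≤ n^2 - 2n ≤ 1*n^2 for n ≥ 4" as predicates; such a check over a finite range is evidence, not a proof, since asymptotic claims quantify over all sufficiently large n.

```cpp
#include <cassert>

// Big-oh example: f(n) = 10n is O(n^2) with c = 1, n0 = 10,
// i.e., 10n <= n^2 for all n > 10.
bool bigOhHolds(long long n) { return 10 * n <= n * n; }

// Theta example: f(n) = n^2 - 2n is Theta(n^2) with c1 = 1/2, c2 = 1, n0 = 4.
// The lower bound (1/2)n^2 <= f(n) is tested as 2*f(n) >= n^2 to stay in integers.
bool thetaHolds(long long n) {
    long long f = n * n - 2 * n;
    return 2 * f >= n * n && f <= n * n;
}
```

Trying a predicate just below its n0 (e.g., bigOhHolds(9), where 90 > 81) shows why the threshold is needed: the inequality only has to hold beyond the breakeven point.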