On Time Versus Input Size
Great Theoretical Ideas In Computer Science
CS 15-251, Fall 2006, Lecture 23
Carnegie Mellon University, November 14, 2006
John Lafferty

(Figure: a plot of time versus # of bits)

Recall From Last Time: Finite Automaton
• A finite set of states Q = {q0, q1, ..., qk}
• A start state q0
• A set of accepting states F = {q_i1, ..., q_ir} ⊆ Q
• A finite alphabet Σ, e.g. {a, b, #, x, 1}
• State transition instructions δ : Q × Σ → Q, written δ(qi, a) = qj

Remember "ABA"?
ABA accepts only the strings with an equal number of ab's and ba's!
(Figure: a sample input, bbabaaabab)

Consider the language {a^n b^n} = {ε, ab, aabb, aaabbb, ...},
i.e., a bunch of a's followed by an equal number of b's.
No finite automaton accepts this language.

Theorem: a^n b^n is not regular.
Proof: Assume that it is. Then ∃ M with k states that accepts it.
For each 0 ≤ i ≤ k, let S_i be the state M is in after reading a^i.
Since there are k+1 such strings but only k states, ∃ i, j ≤ k s.t. S_i = S_j but i ≠ j.
Then M will do the same thing on a^i b^i and a^j b^i.
But a valid M must accept a^i b^i and reject a^j b^i. Contradiction.

MORAL: Finite automata can't count.
But be careful: automata can indeed do some kinds of implicit counting.

Nondeterministic Finite State Automata
Two differences from DFAs:
1. Allow "epsilon transitions", where a state transition is made but no input symbol is read.
2. Allow more than one transition from a state matching the same input symbol.
(Figure: an NFA for L = all strings containing aba or aab)

Running an NFA: Parallel Programming

NFA = DFA
• Does this give us any more power?
• No, because any NFA can be written as an equivalent DFA.
• But NFAs can be much more compact and convenient to work with.

L = all strings containing ababb as a consecutive substring
(Figure: a DFA whose states are the prefixes of ababb)
Invariant: the machine is in state s when s is the longest suffix of the input (so far) that forms a prefix of ababb.

L = all strings containing ababb as a consecutive substring
(Figure: a nondeterministic machine for the same language)
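The prefix-tracking invariant above can be run directly. Here is a small sketch (my own illustration, not from the lecture) of the ababb-substring DFA in Python; the state names and the `accepts` helper are assumptions chosen to mirror the slide's invariant.

```python
# DFA for L = all strings over {a,b} containing "ababb" as a substring.
# Each state is the longest suffix of the input read so far that is a
# prefix of "ababb" -- exactly the invariant stated on the slide.
DELTA = {
    ("", "a"): "a",         ("", "b"): "",
    ("a", "a"): "a",        ("a", "b"): "ab",
    ("ab", "a"): "aba",     ("ab", "b"): "",
    ("aba", "a"): "a",      ("aba", "b"): "abab",
    ("abab", "a"): "aba",   ("abab", "b"): "ababb",
    ("ababb", "a"): "ababb", ("ababb", "b"): "ababb",  # absorbing accept state
}

def accepts(s: str) -> bool:
    state = ""                       # start state: no prefix matched yet
    for ch in s:
        state = DELTA[(state, ch)]   # deterministic: exactly one next state
    return state == "ababb"          # accept iff "ababb" was ever seen
```

Note how determinism makes the run trivial to trace: one state, one transition per symbol, which is the "uniqueness of state" transparency the summary slide mentions.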
Equivalent to the regular expression {a,b}*ababb{a,b}*

Regular Operations on Languages
The class of regular languages is closed under union, concatenation, and *:
A ∪ B = { s | s ∈ A or s ∈ B }
A ∘ B = { s | s = ab for a ∈ A, b ∈ B }
A* = { s1 s2 ... sk | si ∈ A, k ≥ 0 }
Important to note: ε ∈ A*

Regular Operations: Proof by Picture
Closed under union: A ∪ B = { s | s ∈ A or s ∈ B }
Closed under concatenation: A ∘ B = { s | s = ab for a ∈ A, b ∈ B }
Closed under *: A* = { s1 s2 ... sk | si ∈ A, k ≥ 0 }
(Figures: NFA constructions for each operation)

NFA vs. DFA Summary
• Two slightly different descriptions of the same set of languages
• DFAs: sequential evaluation
• NFAs: parallel evaluation
• NFAs can be more compact and interpretable
• Uniqueness of state in a DFA can make reasoning more transparent

New Topic: "The Big O"
... and we don't mean: (figure)

How to add 2 n-bit numbers.
(Figure: a sequence of slides animating grade school addition, column by column, with carries)
"Grade school addition"

Time complexity of grade school addition
T(n) = amount of time grade school addition uses to add two n-bit numbers
What do you mean by "time"?

Our Goal
We want to define "time" in a way that transcends implementation details and allows us to make assertions about grade school addition in a very general yet useful way.

Roadblock ???
A given algorithm will take different amounts of time on the same inputs depending on such factors as:
• Processor speed
• Instruction set
• Disk speed
• Brand of compiler

Hold on! The goal was to measure the time T(n) taken by the method of grade school addition without depending on the implementation details.
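The "proof by picture" for union can also be sketched in code: run two DFAs in lock-step (a product construction) and accept when either component accepts. The helper `make_product_accepts` and the two toy machines below are my own illustrative assumptions, not from the slides.

```python
# Closure under union via the product construction: simulate two DFAs
# in parallel and accept iff either one ends in an accepting state.
def make_product_accepts(delta1, start1, final1, delta2, start2, final2):
    def accepts(s):
        q1, q2 = start1, start2
        for ch in s:
            q1 = delta1[(q1, ch)]   # advance machine 1 on each symbol
            q2 = delta2[(q2, ch)]   # advance machine 2 on the same symbol
        return q1 in final1 or q2 in final2   # union: either accepts
    return accepts

# Toy machine A: strings with an even number of a's.
dA = {("even", "a"): "odd", ("odd", "a"): "even",
      ("even", "b"): "even", ("odd", "b"): "odd"}
# Toy machine B: strings ending in b.
dB = {("0", "a"): "0", ("0", "b"): "1",
      ("1", "a"): "0", ("1", "b"): "1"}

union = make_product_accepts(dA, "even", {"even"}, dB, "0", {"1"})
```

The product machine has |Q1| × |Q2| states, which is why this is a closure proof for DFAs; for concatenation and *, the pictures on the slides instead use NFAs with epsilon transitions.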
But you agree that T(n) does depend on the implementation! We can only speak of the time taken by any particular implementation, as opposed to the time taken by the method in the abstract.

Your objections are serious, Bonzo, but they are not insurmountable. There is a very nice sense in which we can analyze grade school addition without having to worry about implementation details. Here is how it works . . .

On any reasonable computer, adding 3 bits and writing down the two-bit answer can be done in constant time. Pick any particular computer M and define c to be the time it takes to perform this step on that computer.
Total time to add two n-bit numbers using grade school addition: cn
[c time for each of n columns]

On another computer M', the time to perform the same step may be c'.
Total time to add two n-bit numbers using grade school addition: c'n
[c' time for each of n columns]

The fact that we get a line is invariant under changes of implementation. Different machines result in different slopes, but time grows linearly as input size increases.
(Figure: time vs. # of bits in the numbers; Machine M gives the line cn, Machine M' the line c'n)

Thus we arrive at an implementation-independent insight: Grade School Addition is a linear time algorithm.

I see!
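The cn count can be made concrete with a small sketch, assuming the two numbers are given as equal-length bit lists, least significant bit first; `add_bits` and its column counter are illustrative names of mine, not from the lecture.

```python
# Grade-school addition of two n-bit numbers, counting one unit of
# "time" per column to exhibit T(n) = c * n.
def add_bits(x, y):
    """x, y: equal-length lists of bits, least significant bit first."""
    n = len(x)
    result, carry, columns = [], 0, 0
    for i in range(n):
        columns += 1              # one constant-time column operation
        s = x[i] + y[i] + carry   # add 3 bits ...
        result.append(s % 2)      # ... write down the low bit
        carry = s // 2            # ... carry the high bit to the next column
    result.append(carry)          # final carry column
    return result, columns        # columns == n: linear in input size
```

Whatever constant a particular machine charges per loop iteration plays the role of c (or c'); the column count n is what survives the change of machine.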
We can define away the details of the world that we do not wish to currently study, in order to recognize the similarities between seemingly different things…

Abstraction: abstract away the inessential features of a problem or solution.

Exactly, Bonzo! This process of abstracting away details and determining the rate of resource usage in terms of the problem size n is one of the fundamental ideas in computer science.

Time vs Input Size
For any algorithm, define Input Size = # of bits to specify its inputs.
Define TIME_n = the worst-case amount of time used on inputs of size n.
We often ask: what is the growth rate of TIME_n?

How to multiply 2 n-bit numbers.
(Figure: grade school multiplication, n rows of n-bit partial products, an n × n grid of single-bit products: n^2 work)

I get it! The total time is bounded by
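The count the slide's picture points at, roughly n^2 single-bit operations, can be checked with a sketch in the same style as the addition example; `multiply_bits` is my own illustrative name, and bit lists are again assumed least-significant-bit first.

```python
# Grade-school multiplication of two n-bit numbers: n shifted rows of
# n partial products each, so n * n single-bit multiplications.
def multiply_bits(x, y):
    """x, y: equal-length lists of bits, least significant bit first."""
    n = len(x)
    total, bit_ops = 0, 0
    for i in range(n):            # one shifted row per bit of y
        for j in range(n):        # one single-bit product per column
            bit_ops += 1
            total += (x[j] * y[i]) << (i + j)
    return total, bit_ops         # bit_ops == n * n: quadratic in input size
```

Compared with addition's single pass over n columns, the nested loops here are exactly where the jump from linear to quadratic time comes from.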