UA CSC 445 - Asymptotic notations and recursive formulas

CS445 Brief Class Notes

In this note, I will provide some brief descriptions of and pointers to the material discussed in class. I will try to give some useful information, but these notes cannot replace being in class and/or reading the text.

Asymptotic notations and recursive formulas
After a brief introduction, we discussed the asymptotic running-time notations: big-O, Ω and Θ. We showed several examples. This material is well covered in the textbook, and hence we do not provide more details here. We analysed the running time of a few examples. One, of particular interest, is the following:

    read(n)
    for( i = 1 ; i ≤ n ; i++ )
        for( j = 1 ; j ≤ n ; j += i )
            print( "*" ) ;

For a fixed i, the inner loop performs roughly n/i iterations, so we showed that its running time is

    \sum_{i=1}^{n} n/i = n \sum_{i=1}^{n} 1/i = O(n \log n).

Next we moved to recursive formulas. We analysed, using the iterative method (see details below), simple recursions, such as the function NoNeed(n):

    NoNeed(n)
        read(n)
        for( i = 1 ; i ≤ n ; i++ )
            print( "*" ) ;
        NoNeed(n - 1)

whose recursion formula is T(n) = cn + T(n - 1) (for some constant c), with T(1) = c. Here, as is easily seen,

    T(n) = cn + T(n - 1)
         = cn + ( c(n - 1) + T(n - 2) )
         = cn + c(n - 1) + c(n - 2) + T(n - 3)
         = ...                                          (after k stages)
         = c( n + (n - 1) + (n - 2) + ... + (n - k + 1) ) + T(n - k).

Setting k = n - 1 gives

    T(n) = c( n + (n - 1) + ... + 2 ) + T(1) = c \sum_{i=1}^{n} i = cn(n + 1)/2.

Hence T(n) = Θ(n^2).

Master theorem
Prof. Kececioglu, who replaced me, introduced the Master Theorem, which is used to solve a large class of recursion formulas. The textbook covers this area.

Counting Sort and Radix Sort
I showed the Counting Sort and Radix Sort methods. I taught them from the slides, which cover this area very well. This is also a good place to thank Charles E. Leiserson, Piotr Indyk and Carola Wenk, who all prepared previous versions of the slides I am using.

Lower bound on sorting in a comparison model
Please look at the slides.

SkipList
I taught a version of the skip list in which each element is a single cell (other versions, such as the one described in the textbook "Data Structures & Their Algorithms" (Lewis & Denenberg), store each key in an array). We gave intuitive arguments (handwaving) for the following claims, which will be proven formally later.

1. The expected number of levels in the SL is O(log n).
2. The expected size (memory used) is O(n).
3. The expected search time is O(log n). This dominates the expected time for the insertion and deletion operations, and also for the successor operation. The operation succ(x) finds the smallest element in the data structure which is strictly larger than x.
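To make claim 3 concrete, here is a minimal Python sketch of search in a skip list whose elements are single cells, as in the version taught in class. The Cell layout (key, right, down fields), the coin-flip build procedure and the function names are my own illustrative choices, not the exact in-class implementation.

    import random

    class Cell:
        """A single skip-list cell: one key, a 'right' pointer along its level,
        and a 'down' pointer to the cell holding the same key one level below."""
        def __init__(self, key, right=None, down=None):
            self.key = key
            self.right = right
            self.down = down

    def build(keys, p=0.5):
        """Build a skip list over distinct keys; every key is promoted to the
        next level with probability p, so the expected number of levels is
        O(log n) and the expected total number of cells is O(n)."""
        level = [Cell(k) for k in sorted(keys)]
        for a, b in zip(level, level[1:]):
            a.right = b
        head = Cell(float("-inf"), right=level[0] if level else None)
        while len(level) > 1:
            promoted = [c for c in level if random.random() < p]
            if not promoted:
                break
            upper = [Cell(c.key, down=c) for c in promoted]
            for a, b in zip(upper, upper[1:]):
                a.right = b
            head = Cell(float("-inf"), right=upper[0], down=head)
            level = upper
        return head

    def search(head, x):
        """Return True iff x is stored in the skip list; expected time O(log n)."""
        c = head
        while c is not None:
            while c.right is not None and c.right.key <= x:
                c = c.right        # move right while the next key is not too large
            if c.key == x:
                return True
            c = c.down             # drop one level and continue from the same key
        return False

    head = build(range(100))
    assert search(head, 42) and not search(head, 101)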
Analyzing the running time of QuickSort
Consider sorting the keys {k_1, ..., k_n}. We assume that we use a version of QuickSort in which all elements are different, and the probability of each element being picked as a pivot is uniform. We showed that the running time is proportional to the number of pairs of elements which are compared to each other (which happens when one of these elements is the pivot). We define a random variable X_{ij} which is 1 if, at some point during the course of the algorithm, k_i and k_j are compared, and 0 otherwise. Note that E(X_{ij}), the expected value of X_{ij}, is

    E(X_{ij}) = 1 \cdot \Pr(X_{ij} = 1) + 0 \cdot \Pr(X_{ij} = 0) = \Pr(X_{ij} = 1).

Note that the running time is O( \sum_{1 \le i < j \le n} X_{ij} ), and the expected running time is

    E[ \sum_{1 \le i < j \le n} X_{ij} ] = \sum_{1 \le i < j \le n} E(X_{ij}) = \sum_{1 \le i < j \le n} \Pr(X_{ij} = 1).

To evaluate \Pr(X_{ij} = 1), we only need to note that the algorithm compares k_i and k_j if and only if either k_i or k_j was the first key picked as a pivot from the set {k_i, k_{i+1}, ..., k_j}. Since each of them has the same probability of being picked, and there are j - i + 1 keys between (and including) k_i and k_j, we conclude that \Pr(X_{ij} = 1) = 2/(j - i + 1). Thus

    E[ \sum_{1 \le i < j \le n} X_{ij} ] = \sum_{1 \le i < j \le n} 2/(j - i + 1).

Since 1 + 1/2 + 1/3 + ... + 1/n = O(log n), for each fixed i the inner sum over j is at most 2(1 + 1/2 + ... + 1/n) = O(log n), and there are n choices of i, so we conclude that

    E[ \sum_{1 \le i < j \le n} X_{ij} ] = \sum_{1 \le i < j \le n} 2/(j - i + 1) = n \cdot O(\log n) = O(n \log n).

(A small experiment checking this bound appears among the code sketches at the end of these notes.)

Augmenting data structures
We demonstrated this technique both on SkipLists and on binary search trees. The idea was to assign extra information to each node of the SL. For example, if our goal is to be able to answer how many elements are smaller than a query key x, we maintain with each node c of the SL the field c.n, containing the number of keys between key[c] and key[next[c]]. To find the number of elements smaller than x, we perform find(x), and sum the values of the fields c.n of the nodes at which the search path turns right. These fields can also help us find the k'th smallest element in the list, in time O(log n). Details about this implementation in search trees can be found in the chapter on augmented data structures in the textbook.

—— Material for the final starts here ——

Graphs and Spanning Trees — in general
General graphs. The handshaking Lemma: we proved that in any party there is an even number of people who shook an odd number of hands. We discussed the two main ways to store a graph in a computer — the adjacency matrix and adjacency lists. We next discussed connectivity in graphs, and Prim's algorithm for finding an MST (Minimal Spanning Tree); a short Prim sketch appears among the code sketches at the end of these notes.

Shortest Paths in graphs with positive weights
We discussed the "standard" Dijkstra algorithm for finding the shortest paths from a vertex s to all vertices of a graph in which every edge is associated with a positive weight. See the relevant slides. We discussed separately how to find the lengths of these paths, and then how to find the actual paths, using the notion of shortest-path trees. We studied BFS as a simple special case of Dijkstra's algorithm for graphs with unit edge weights, and showed that in this case the running time can be improved by a factor of O(log |V|), since a queue can replace the heap as the main data structure.

Shortest Paths in graphs with general weights (5/18/05)
Prof. Kececioglu showed the Bellman-Ford (BF) algorithm for finding the shortest paths from a vertex s to all vertices of a graph in which every edge is associated with a weight (which can be positive or negative). Again — see the slides. This algorithm also checks for the existence of a negative cycle in the graph.

Johnson's algorithm
We discussed Johnson's algorithm for finding the shortest paths between every pair of vertices in the graph.
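To complement the QuickSort analysis above, here is a small Python experiment (my own sketch, not code shown in class) that counts, for a randomized QuickSort with uniform pivots, the number of compared pairs in which one element is the pivot, and compares the observed average with the exact value of \sum_{1 \le i < j \le n} 2/(j - i + 1). The function names and the parameters n and trials are illustrative choices.

    import random

    def quicksort_comparisons(keys):
        """Randomized QuickSort over distinct keys; returns the number of
        (pivot, other key) pairs formed, the quantity the analysis counts."""
        count = 0

        def sort(a):
            nonlocal count
            if len(a) <= 1:
                return a
            pivot = random.choice(a)     # uniform pivot, as assumed in class
            count += len(a) - 1          # one compared pair (pivot, x) per other key x
            smaller = [x for x in a if x < pivot]
            larger = [x for x in a if x > pivot]
            return sort(smaller) + [pivot] + sort(larger)

        sort(list(keys))
        return count

    def expected_pairs(n):
        """Exact value of the sum over 1 <= i < j <= n of 2/(j - i + 1)."""
        return sum(2.0 / (j - i + 1)
                   for i in range(1, n + 1) for j in range(i + 1, n + 1))

    n, trials = 200, 200
    avg = sum(quicksort_comparisons(range(n)) for _ in range(trials)) / trials
    print("observed average:", avg)
    print("sum of 2/(j-i+1):", expected_pairs(n))

The two printed numbers should agree up to random fluctuation, matching the O(n log n) expected running time derived above.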
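For the Prim's algorithm mentioned in the graphs section, here is a minimal heap-based sketch. The adjacency-list format (a dict mapping each vertex to a list of (neighbour, weight) pairs) and the function name are assumptions of mine, not the representation used in class.

    import heapq

    def prim_mst(graph, start):
        """Prim's algorithm on an undirected, connected, weighted graph given as
        {u: [(v, w), ...]}.  Returns (total weight, list of tree edges)."""
        in_tree = {start}
        tree_edges = []
        total = 0
        # heap of candidate edges (weight, u, v) leaving the current tree
        heap = [(w, start, v) for v, w in graph[start]]
        heapq.heapify(heap)
        while heap and len(in_tree) < len(graph):
            w, u, v = heapq.heappop(heap)
            if v in in_tree:
                continue                 # this edge leads back into the tree; skip it
            in_tree.add(v)
            tree_edges.append((u, v, w))
            total += w
            for x, wx in graph[v]:
                if x not in in_tree:
                    heapq.heappush(heap, (wx, v, x))
        return total, tree_edges

    g = {
        "a": [("b", 2), ("c", 3)],
        "b": [("a", 2), ("c", 1), ("d", 4)],
        "c": [("a", 3), ("b", 1), ("d", 5)],
        "d": [("b", 4), ("c", 5)],
    }
    print(prim_mst(g, "a"))    # MST weight 2 + 1 + 4 = 7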
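For the section on shortest paths with positive weights, here is a minimal Dijkstra sketch using the same assumed adjacency-list format; the dist/parent dictionaries play the role of the shortest-path tree discussed above. With unit edge weights a plain FIFO queue could replace the heap, which is exactly the BFS special case mentioned in the notes.

    import heapq

    def dijkstra(graph, s):
        """Single-source shortest paths for positive edge weights.
        Returns (dist, parent); parent encodes the shortest-path tree."""
        dist = {s: 0}
        parent = {s: None}
        done = set()
        heap = [(0, s)]
        while heap:
            d, u = heapq.heappop(heap)
            if u in done:
                continue                 # stale heap entry for an already settled vertex
            done.add(u)
            for v, w in graph.get(u, []):
                nd = d + w
                if v not in dist or nd < dist[v]:
                    dist[v] = nd
                    parent[v] = u
                    heapq.heappush(heap, (nd, v))
        return dist, parent

    def path_to(parent, v):
        """Read the actual shortest path to v off the shortest-path tree."""
        path = []
        while v is not None:
            path.append(v)
            v = parent[v]
        return path[::-1]

    g = {"s": [("a", 1), ("b", 4)], "a": [("b", 2), ("t", 6)], "b": [("t", 3)], "t": []}
    dist, parent = dijkstra(g, "s")
    print(dist["t"], path_to(parent, "t"))    # 6 ['s', 'a', 'b', 't']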
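Finally, for the Bellman-Ford discussion, here is a minimal sketch that also performs the negative-cycle check mentioned in the notes. Representing the graph as a plain list of (u, v, w) edge triples is my own choice here.

    def bellman_ford(vertices, edges, s):
        """Single-source shortest paths with arbitrary (possibly negative) edge
        weights.  'edges' is a list of (u, v, w) triples.
        Returns (dist, has_negative_cycle)."""
        INF = float("inf")
        dist = {v: INF for v in vertices}
        dist[s] = 0
        # |V| - 1 rounds of relaxing every edge suffice when there is no negative cycle
        for _ in range(len(vertices) - 1):
            changed = False
            for u, v, w in edges:
                if dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
                    changed = True
            if not changed:
                break
        # one extra pass: any further improvement witnesses a negative cycle
        has_negative_cycle = any(dist[u] + w < dist[v] for u, v, w in edges)
        return dist, has_negative_cycle

    V = ["s", "a", "b", "t"]
    E = [("s", "a", 4), ("s", "b", 5), ("a", "b", -3), ("b", "t", 2)]
    print(bellman_ford(V, E, "s"))    # dist['t'] == 3, no negative cycle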

