UMD CMSC 351 - Lecture Notes

HC to HP Reduction

    bool HamCycle(graph G) {
        for each edge {u,v} in G {
            copy G to a new graph G'
            delete edge {u,v} from G'
            add new vertices x and y to G'
            add new edges {x,u} and {y,v} to G'
            if (HamPath(G')) return true
        }
        return false // failed for every edge
    }

This is a rather inefficient reduction, but it does work. In particular, it makes O(e) calls to the HamPath() procedure. Can you see how to do it with fewer calls? (Hint: Consider applying this to the edges coming out of just one vertex.) Can you see how to do it with only one call? (Hint: This is trickier.)

As before, notice that we didn't really attempt to solve either problem. We just tried to figure out how to make a procedure for one problem (Hamiltonian path) work to solve another problem (Hamiltonian cycle). Since HC is NP-complete, this means that there is not likely to be an efficient solution to HP either.
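The pseudocode above can be made concrete. Here is a minimal runnable sketch in Python; it is ours, not from the notes. The brute-force ham_path function is a hypothetical stand-in for the HamPath() subroutine the reduction assumes (exponential time, for illustration only), and the representation of graphs as edge lists over arbitrary vertex labels is likewise our assumption.

    from itertools import permutations

    def ham_path(vertices, edges):
        # Exhaustive Hamiltonian-path test: a hypothetical stand-in for the
        # HamPath() subroutine the reduction assumes. Exponential time; for
        # illustration only.
        edge_set = {frozenset(e) for e in edges}
        for order in permutations(vertices):
            if all(frozenset((order[i], order[i + 1])) in edge_set
                   for i in range(len(order) - 1)):
                return True
        return False

    def ham_cycle(vertices, edges):
        # The reduction above: for each edge {u, v}, delete it, attach new
        # pendant vertices x and y to u and v, and test for a Hamiltonian
        # path. (Assumes 'x' and 'y' are not already vertex labels.)
        for (u, v) in edges:
            g2_vertices = list(vertices) + ['x', 'y']
            g2_edges = [e for e in edges if frozenset(e) != frozenset((u, v))]
            g2_edges += [('x', u), ('y', v)]
            if ham_path(g2_vertices, g2_edges):
                return True
        return False  # failed for every edge

    # A 4-cycle has a Hamiltonian cycle; the path 0-1-2-3 does not.
    square = [(0, 1), (1, 2), (2, 3), (3, 0)]
    print(ham_cycle([0, 1, 2, 3], square))       # True
    print(ham_cycle([0, 1, 2, 3], square[:-1]))  # False

Note that the pendant vertices x and y have degree 1 in G', so any Hamiltonian path there must run from x through u, ..., v to y; putting the deleted edge {u, v} back in closes this path into a Hamiltonian cycle in G.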
Lecture 29: Final Review
(Tuesday, May 12, 1998)

Final exam: As mentioned before, the exam will be comprehensive, but it will stress material since the second midterm exam. I would estimate that about 50–70% of the exam will cover material since the last midterm, and the remainder will be comprehensive. The exam will be closed book/closed notes, with three sheets of notes (front and back) allowed.

Overview: This semester we have discussed general approaches to algorithm design. The goal of this course is to improve your skills in designing good programs, especially for complex problems, where it is not obvious how to design a good solution. Finding good computational solutions to problems involves many skills. Here we have focused on the higher-level aspects of the problem: what approaches to use in designing good algorithms, how to generate a rough sketch of the efficiency of your algorithm (through asymptotic analysis), and how to focus on the essential mathematical aspects of the problem and strip away the complicating elements (such as data representations, I/O, etc.).

Of course, to be a complete programmer, you need to be able to orchestrate all of these elements. The main thrust of this course has been only on the initial stages of this design process. However, these are important stages, because a poor initial design is much harder to fix later. Still, don't stop with your first solution to any problem. As we saw with sorting, there may be many ways of solving a problem. Even algorithms that are asymptotically equivalent (such as MergeSort, HeapSort, and QuickSort) have advantages over one another.

The intent of the course has been to investigate basic techniques for algorithm analysis and various algorithm design paradigms: divide-and-conquer, graph traversals, dynamic programming, etc. Finally, we discussed a class of very hard problems, called NP-complete problems, and how to show that problems are in this class. Here is an overview of the topics that we covered this semester.

Tools of Algorithm Analysis:

Asymptotics: O, Ω, Θ. General facts about growth rates of functions.

Summations: Analysis of looping programs, common summations, solving complex summations, integral approximation, constructive induction.

Recurrences: Analysis of recursive programs, strong induction, expansion, the Master Theorem.

Sorting:

Mergesort: Stable, Θ(n log n) sorting algorithm.

Heapsort: Nonstable, Θ(n log n), in-place algorithm. A heap is an important data structure for implementing priority queues (a queue in which the highest-priority item is dequeued first).

Quicksort: Nonstable, Θ(n log n) expected case, (almost) in-place sorting algorithm. This is regarded as the fastest of these sorting algorithms, primarily because of its pattern of locality of reference.

Sorting lower bounds: Any sorting algorithm that is based on comparisons requires Ω(n log n) steps in the worst case. The argument is based on a decision tree: consider the number of possible outcomes, and observe that they form the leaves of the decision tree. The height of the decision tree is Ω(lg N), where N is the number of leaves. In this case N = n!, the number of different permutations of n keys, and since lg(n!) = Θ(n lg n), the bound follows.

Linear time sorting: If you are sorting small integers in the range from 1 to k, then you can apply counting sort in Θ(n + k) time. (A short sketch appears at the end of these notes.) If k is too large, then you can try breaking the numbers up into smaller digits and applying radix sort instead. Radix sort just applies counting sort to each digit individually. If there are d digits, then its running time is Θ(d(n + k)), where k is the number of different values in each digit.

Graphs: We presented basic definitions of graphs and digraphs. A graph (digraph) consists of a set of vertices and a set of undirected (directed) edges. Recall that the number of edges in a graph can generally be as large as O(n²), but is often smaller (closer to O(n)). A graph is sparse if the number of edges is o(n²), and dense otherwise. We discussed two representations:

Adjacency matrix: A[u, v] = 1 if (u, v) ∈ E. These are simple, but require Θ(n²) storage. Good for dense graphs.

Adjacency list: Adj[u] is a pointer to a linked list containing the neighbors of u. These are better for sparse graphs, since they only require Θ(n + e) storage.

Breadth-first search: We discussed one graph algorithm: breadth-first search. This is a way of traversing the vertices of a graph in increasing order of distance from a source vertex. Recall that it colors vertices (white, gray, black) to indicate their status in the search, and it also uses a FIFO queue to determine the order in which it will visit the vertices. When we process the next vertex, we simply visit (that is, enqueue) all of its unvisited neighbors. This runs in Θ(n + e) time. (If the queue is replaced by a stack, then we get a different type of search algorithm, called depth-first search.) We showed that breadth-first search could be used to compute shortest paths from a single source vertex in an (unweighted) graph or digraph. (A short sketch appears at the end of these notes.)

Dynamic Programming: Dynamic programming is an important design technique used in many optimization problems. Its basic elements are those of subdividing large complex problems into smaller subproblems and solving the subproblems in a bottom-up manner (going from smaller to larger). An important idea in dynamic programming is the principle of optimality: for the global problem to be solved optimally, the subproblems should be solved optimally. This is not always the case (e.g., when there is dependence between the subproblems, it might be better to do worse on one subproblem in order to gain a big savings on another).
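As a small concrete illustration of the bottom-up style and the principle of optimality, here is a coin-change sketch in Python. The example and its names (min_coins, best) are ours, not from the notes.

    def min_coins(coins, amount):
        # Bottom-up dynamic programming: best[a] holds the fewest coins that
        # sum to a. Principle of optimality: an optimal answer for a extends
        # an optimal answer for the smaller subproblem a - c by one coin c.
        INF = float('inf')
        best = [0] + [INF] * amount
        for a in range(1, amount + 1):
            for c in coins:
                if c <= a and best[a - c] + 1 < best[a]:
                    best[a] = best[a - c] + 1
        return best[amount] if best[amount] < INF else None

    print(min_coins([1, 5, 10, 25], 63))  # 6 coins: 25 + 25 + 10 + 1 + 1 + 1

Every entry of best is assembled from optimal answers to strictly smaller subproblems, which is the principle of optimality at work.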
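Here is the counting sort sketch promised above: a minimal Python version for keys in the range 1 to k, written by us for illustration. It sorts bare keys by tallying; the stable prefix-sum formulation, which radix sort needs in order to carry whole records along, is a modest extension.

    def counting_sort(a, k):
        # Counting sort for integers in 1..k: Theta(n + k) time.
        count = [0] * (k + 1)
        for x in a:
            count[x] += 1               # tally each key
        out = []
        for v in range(1, k + 1):
            out.extend([v] * count[v])  # emit each key, count[v] times
        return out

    print(counting_sort([3, 1, 4, 1, 5, 3], 5))  # [1, 1, 3, 3, 4, 5]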
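And the breadth-first search sketch promised above, again our own Python illustration. The dist dictionary plays the role of the white/gray/black coloring in the notes: a vertex absent from dist is white (undiscovered), and a vertex present in it has been discovered.

    from collections import deque

    def bfs_distances(adj, s):
        # Breadth-first search on an adjacency-list graph: visits vertices in
        # increasing order of (unweighted) distance from the source s, using
        # a FIFO queue. Runs in Theta(n + e) time.
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:   # first discovery of v
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
    print(bfs_distances(adj, 0))  # {0: 0, 1: 1, 2: 1, 3: 2}

Replacing the deque with a stack turns this into depth-first search, as the notes observe.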

