Lecture Notes CMSC 251

Divide-and-conquer: Understand how to design algorithms by divide-and-conquer. Understand the divide-and-conquer algorithm for MergeSort, and be able to work an example by hand. Also understand how the sieve technique works, and how it was used in the selection problem. (Chapt 10 on Medians; skip the randomized analysis. The material on the 2-d maxima and long integer multiplication is not discussed in CLR.)

Lecture 11: First Midterm Exam
(Tuesday, March 3, 1998)

First midterm exam today. No lecture.

Lecture 12: Heaps and HeapSort
(Thursday, Mar 5, 1998)

Read: Chapt 7 in CLR.

Sorting: For the next series of lectures we will focus on sorting algorithms. The reasons for studying sorting algorithms in detail are twofold. First, sorting is a very important algorithmic problem. Procedures for sorting are parts of many large software systems, either explicitly or implicitly. Thus the design of efficient sorting algorithms is important for the overall efficiency of these systems. The other reason is more pedagogical. There are many sorting algorithms, some slow and some fast. Some possess certain desirable properties, and others do not. Finally, sorting is one of the few problems for which there are provable lower bounds on how fast you can sort. Thus, sorting forms an interesting case study in algorithm theory.

In the sorting problem we are given an array A[1..n] of n numbers, and are asked to reorder these elements into increasing order. More generally, A is an array of records, and we choose one field of these records as the key value on which the elements will be sorted. The key value need not be a number. It can be any object from a totally ordered domain. Totally ordered means that for any two elements of the domain, x and y, either x < y, x = y, or x > y.

There are some domains that can be partially ordered, but not totally ordered. For example, sets can be partially ordered under the subset relation, ⊂, but this is not a total order: it is not true that for any two sets either x ⊂ y, x = y, or x ⊃ y. There is an algorithm called topological sorting which can be applied to "sort" partially ordered sets. We may discuss this later.

Slow Sorting Algorithms: There are a number of well-known slow sorting algorithms. These include the following:

Bubblesort: Scan the array. Whenever two consecutive items are found that are out of order, swap them. Repeat until all consecutive items are in order.

Insertion sort: Assume that A[1..i − 1] has already been sorted. Insert A[i] into its proper position in this subarray, by shifting all larger elements to the right by one to make space for the new item.

Selection sort: Assume that A[1..i − 1] contains the i − 1 smallest elements in sorted order. Find the smallest element in A[i..n], and then swap it with A[i].

These algorithms are all easy to implement, but they run in Θ(n^2) time in the worst case. We have already seen that MergeSort sorts an array of numbers in Θ(n log n) time. We will study two others, HeapSort and QuickSort.
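To make the insertion sort description above concrete, here is a minimal Python sketch. It is not from the notes themselves: the function name and the 0-based indexing are my own choices, whereas the notes use 1-based arrays.

```python
def insertion_sort(A):
    """Sort the list A in place into increasing order (worst case Theta(n^2))."""
    for i in range(1, len(A)):
        key = A[i]          # the new item to be placed
        j = i - 1
        # A[0..i-1] is already sorted; shift every element larger than key
        # one position to the right to make space for the new item.
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key      # insert the item into its proper position
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

Each pass may shift up to i elements, which is where the Θ(n^2) worst-case bound comes from.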
Priority Queues: The heapsort algorithm is based on a very nice data structure, called a heap. A heap is a concrete implementation of an abstract data type called a priority queue. A priority queue stores elements, each of which is associated with a numeric key value, called its priority. A simple priority queue supports the following basic operations:

Create: Create an empty queue.

Insert: Insert an element into a queue.

ExtractMax: Return the element with maximum key value from the queue. (Actually it is more common to extract the minimum. It is easy to modify the implementation to do this, by reversing < and >.)

Empty: Test whether the queue is empty.

Adjust Priority: Change the priority of an item in the queue.

It is common to support a number of additional operations as well, such as building a priority queue from an initial set of elements, returning the largest element without deleting it, and changing the priority of an element that is already in the queue (either decreasing or increasing it).

Heaps: A heap is a data structure that supports the main priority queue operations (insert and delete max) in Θ(log n) time. For now we will describe the heap in terms of a binary tree implementation, but we will see later that heaps can be stored in arrays.

By a binary tree we mean a data structure which is either empty or else consists of three things: a root node, a left subtree, and a right subtree. The left and right subtrees are each binary trees. They are called the left child and right child of the root node. If both the left and right children of a node are empty, then this node is called a leaf node. A nonleaf node is called an internal node.

[Figure 10: Binary trees (a binary tree and a complete binary tree, with the root, an internal node, a leaf, and node depths labeled).]

The depth of a node in a binary tree is its distance from the root. The root is at depth 0, its children at depth 1, its grandchildren at depth 2, and so on. The height of a binary tree is its maximum depth. A binary tree is said to be complete if all internal nodes have two (nonempty) children, and all leaves have the same depth. An important fact about complete binary trees is that a complete binary tree of height h has

    n = 1 + 2 + ... + 2^h = sum_{i=0}^{h} 2^i = 2^(h+1) − 1

nodes altogether. If we solve for h in terms of n, we see that the height of a complete binary tree with n nodes is h = (lg(n + 1)) − 1 ≈ lg n ∈ Θ(log n).

A heap is represented as a left-complete binary tree. This means that all the levels of the tree are full except the bottommost level, which is filled from left to right. An example is shown in the figure below. The keys of a heap are stored in something called heap order. This means that for each node u, other than the root, key(Parent(u)) ≥ key(u). This implies that as you follow any path from a leaf to the root the keys appear in (nonstrict) increasing order. Notice that this implies that the root is necessarily the largest element.

[Figure 11: Heap (a left-complete binary tree with its keys in heap order).]

Next time we will show how the priority queue operations are implemented for a heap.

Lecture 13: HeapSort
(Tuesday, Mar 10, 1998)

Read: Chapt 7 in CLR.

Heaps: Recall that a heap is a data structure that supports the main priority queue operations (insert and extract max) in Θ(log n) time each. It consists of a left-complete binary tree (meaning that all levels of the tree, except possibly the bottommost, are full, and the bottommost level is filled from left to right). As a consequence, it follows
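As a small illustration of the remark above that a heap can be stored in an array, here is a Python sketch of the usual level-by-level encoding of a left-complete binary tree, together with a check of the heap-order property. This is only a sketch under 0-based indexing; the helper names (parent, left, right, is_heap) and the example keys are mine, not from the notes.

```python
def parent(i):
    """Index of the parent of node i (the root is i = 0 and has no parent)."""
    return (i - 1) // 2

def left(i):
    """Index of the left child of node i."""
    return 2 * i + 1

def right(i):
    """Index of the right child of node i."""
    return 2 * i + 2

def is_heap(A):
    """True if A, read level by level from left to right, is in heap order:
    key(Parent(u)) >= key(u) for every node u other than the root."""
    return all(A[parent(i)] >= A[i] for i in range(1, len(A)))

# A left-complete tree has no gaps when listed level by level,
# so a plain Python list suffices as the storage.
print(is_heap([16, 14, 10, 8, 7, 9, 3, 2, 4, 1]))   # True: heap order holds
print(is_heap([16, 14, 10, 8, 7, 9, 3, 2, 4, 15]))  # False: 15 exceeds its parent 7
```

Because the tree is left-complete, the array has no holes, which is what makes this flat representation possible.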

