UMD CMSC 351 - Lecture Notes
Lecture Notes CMSC 251

A heap is represented as a left-complete binary tree. This means that all the levels of the tree are full except the bottommost level, which is filled from left to right. An example is shown below. The keys of a heap are stored in something called heap order. This means that for each node u, other than the root, key(Parent(u)) ≥ key(u). This implies that as you follow any path from a leaf to the root, the keys appear in (nonstrict) increasing order. Notice that this implies that the root is necessarily the largest element.

[Figure 11: Heap (a left-complete binary tree whose keys are in heap order).]

Next time we will show how the priority queue operations are implemented for a heap.

Lecture 13: HeapSort
(Tuesday, Mar 10, 1998)

Read: Chapt 7 in CLR.

Heaps: Recall that a heap is a data structure that supports the main priority queue operations (insert and extract-max) in Θ(log n) time each. It consists of a left-complete binary tree, meaning that all levels of the tree except possibly the bottommost are full, and the bottommost level is filled from left to right. As a consequence, it follows that the depth of the tree is Θ(log n), where n is the number of elements stored in the tree. The keys of the heap are stored in the tree in what is called heap order. This means that for each (nonroot) node, its parent's key is at least as large as its key. From this it follows that the largest key in the heap appears at the root.

Array Storage: Last time we mentioned that one of the clever aspects of heaps is that they can be stored in arrays, without the need for using pointers (as would normally be needed for storing binary trees). The reason for this is the left-complete nature of the tree.

This is done by storing the heap in an array A[1..n]. Generally we will not be using all of the array, since only a portion of the keys may be part of the current heap.
For this reason, we maintain a variable m ≤ n which keeps track of the current number of elements that are actually stored actively in the heap. Thus the heap will consist of the elements stored in A[1..m].

We store the heap in the array by simply unraveling it level by level. Because the binary tree is left-complete, we know exactly how many elements each level will supply. The root level supplies 1 node, the next level 2, then 4, then 8, and so on. Only the bottommost level may supply fewer than the appropriate power of 2, but then we can use the value of m to determine where the last element is. This is illustrated below.

We should emphasize that this only works because the tree is left-complete. This cannot be used for general trees.

[Figure 12: Storing a heap in an array (n = 13, m = 10). The heap 16 14 10 8 7 9 4 2 1 3 is shown both as an array and as a binary tree.]

We claim that accessing elements of the heap involves only simple arithmetic operations on the array indices. In particular, it is easy to see the following.

Left(i): return 2i.
Right(i): return 2i + 1.
Parent(i): return ⌊i/2⌋.
IsLeaf(i): return Left(i) > m. (That is, if i's left child is not in the tree.)
IsRoot(i): return i == 1.

For example, the heap ordering property can be stated as "for all i, 1 ≤ i ≤ n, if (not IsRoot(i)) then A[Parent(i)] ≥ A[i]".

So is a heap a binary tree or an array? The answer is that from a conceptual standpoint, it is a binary tree. However, it is typically implemented as an array for space efficiency.

Maintaining the Heap Property: There is one principal operation for maintaining the heap property. It is called Heapify. (In other books it is sometimes called sifting down.) The idea is that we are given an element of the heap which we suspect may not be in valid heap order, but we assume that all of the other elements in the subtree rooted at this element are in heap order. In particular, this root element may be too small.
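As a sketch, the index arithmetic above can be written directly in Python, using a 1-based array with A[0] unused so that the formulas carry over unchanged (the lowercase function names are ours, not the notes'):

```python
# Heap index arithmetic for a 1-based array A[1..m] (A[0] unused).
def left(i):            # index of i's left child
    return 2 * i

def right(i):           # index of i's right child
    return 2 * i + 1

def parent(i):          # index of i's parent
    return i // 2       # floor(i/2)

def is_leaf(i, m):      # i is a leaf if its left child falls outside A[1..m]
    return left(i) > m

def is_root(i):
    return i == 1

# The heap of Figure 12: A[1..10] holds 16 14 10 8 7 9 4 2 1 3.
A = [None, 16, 14, 10, 8, 7, 9, 4, 2, 1, 3]
m = 10

# Heap ordering property: for all nonroot i, A[parent(i)] >= A[i].
assert all(A[parent(i)] >= A[i] for i in range(2, m + 1))
```

Note that node 5 has a left child (index 10, the last active element) but node 6 does not, which is exactly what is_leaf reports.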
To fix this we "sift" it down the tree by swapping it with one of its children. Which child? We should take the larger of the two children to satisfy the heap ordering property. This continues recursively until the element is either larger than both its children or until it falls all the way to the leaf level. Here is the pseudocode. It is given the heap in the array A, the index i of the suspected element, and m, the current active size of the heap. The index max is set so that A[max] is the maximum of A[i] and its two children. If max ≠ i, then we swap A[i] and A[max] and then recurse on A[max].

Heapify
    Heapify(array A, int i, int m) {          // sift down A[i] in A[1..m]
        l = Left(i)                           // left child
        r = Right(i)                          // right child
        max = i
        if (l <= m and A[l] > A[max]) max = l // left child exists and is larger
        if (r <= m and A[r] > A[max]) max = r // right child exists and is larger
        if (max != i) {                       // if either child is larger
            swap A[i] with A[max]             // swap with the larger child
            Heapify(A, max, m)                // and recurse
        }
    }

See Figure 7.2 on page 143 of CLR for an example of how Heapify works (in the case where m = 10). We show the execution on a tree, rather than on the array representation, since this is the most natural way to conceptualize the heap. You might try simulating this same algorithm on the array, to see how it works at a finer level of detail.

Note that the recursive implementation of Heapify is not the most efficient. We have done so because many algorithms on trees are most naturally implemented using recursion, so it is nice to practice this here. It is possible to write the procedure iteratively. This is left as an exercise.

The HeapSort algorithm will consist of two major parts: first building a heap, and then extracting the maximum elements from the heap, one by one. We will see how to use Heapify to help us do both of these.

How long does Heapify take to run?
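The pseudocode above translates almost line for line into runnable Python; this is a sketch under the same 1-based convention as before (A[0] unused), with max renamed to largest since max is a Python builtin:

```python
def heapify(A, i, m):
    """Sift A[i] down within the heap A[1..m] (1-based; A[0] unused)."""
    l = 2 * i                          # left child
    r = 2 * i + 1                      # right child
    largest = i
    if l <= m and A[l] > A[largest]:   # left child exists and is larger
        largest = l
    if r <= m and A[r] > A[largest]:   # right child exists and is larger
        largest = r
    if largest != i:                   # some child is larger:
        A[i], A[largest] = A[largest], A[i]   # swap with the larger child
        heapify(A, largest, m)                # and recurse

# Example (our own, not from the notes): A[1] = 4 is too small, but both
# of its subtrees are already in heap order.
A = [None, 4, 14, 10, 8, 7, 9, 3, 2, 1]
heapify(A, 1, 9)
# A[1..9] is now in heap order: 14 8 10 4 7 9 3 2 1
```

Each recursive call moves one level down the tree, which is the fact the running-time analysis below relies on.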
Observe that we perform a constant amount of work at each level of the tree until we make a call to Heapify at the next lower level of the tree. Thus we do O(1) work for each level of the tree which we visit. Since there are Θ(log n) levels altogether in the tree, the total time for Heapify is O(log n). (It is not Θ(log n) since, for example, if we call Heapify on a leaf, then it will terminate in Θ(1) time.)

Building a Heap: We can use Heapify to build a heap as follows. First we start with a heap in which the elements are not in heap order; they are just in the same order that they were given to us in the array A. We build the heap by starting at the leaf level and then invoking Heapify on each node. (Note: We cannot start at the top of the tree. Why not?)
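The bottom-up build the text is describing can be sketched as follows. The leaves (nodes ⌊m/2⌋+1 through m) are already one-element heaps, so it suffices to call Heapify on the internal nodes from ⌊m/2⌋ down to 1; these exact loop bounds are our filling-in of "starting at the leaf level", not stated in the preview above. The sketch repeats heapify so it runs on its own:

```python
def heapify(A, i, m):
    """Sift A[i] down within the heap A[1..m] (1-based; A[0] unused)."""
    largest = i
    if 2 * i <= m and A[2 * i] > A[largest]:          # left child larger
        largest = 2 * i
    if 2 * i + 1 <= m and A[2 * i + 1] > A[largest]:  # right child larger
        largest = 2 * i + 1
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, m)

def build_heap(A, m):
    """Turn the arbitrary array A[1..m] into a heap, bottom up."""
    # Nodes m//2 + 1 .. m are leaves, hence already trivial heaps.
    # Heapify each internal node, working from the leaves toward the root,
    # so that by the time heapify(A, i, m) runs, both subtrees of node i
    # are already in heap order -- exactly Heapify's precondition.
    for i in range(m // 2, 0, -1):
        heapify(A, i, m)

A = [None, 3, 2, 1, 5, 6, 7, 8, 9, 4]   # unordered input, m = 9
build_heap(A, 9)
assert A[1] == 9                         # the largest key is now at the root
```

This ordering also answers the note's question: going top-down would call Heapify on nodes whose subtrees are not yet heaps, violating its precondition.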

