6.006 Intro to Algorithms, Recitation 04, February 11, 2011

AVL Trees

Recall the operations (e.g. find, insert, delete) of a binary search tree. The runtime of these operations is O(h), where h is the height of the tree, defined as the length of its longest branch. In the worst case, all the nodes of a tree lie on the same branch; in that case h = n, so these binary search tree operations take O(n) time. However, we can maintain a much better upper bound on the height of the tree if we make an effort to balance the tree and even out the lengths of all branches. AVL trees are binary search trees that rebalance themselves every time an element is inserted or deleted. Each node of an AVL tree has the property that the heights of the sub-trees rooted at its children differ by at most one.

Upper Bound of AVL Tree Height

We can show that an AVL tree with n nodes has O(log n) height. Let N_h denote the minimum number of nodes that can form an AVL tree of height h.

If we know N_{h-1} and N_{h-2}, we can determine N_h. Since this N_h-node tree has height h, the root must have a child of height h - 1. To minimize the total number of nodes in the tree, we let that sub-tree contain N_{h-1} nodes. By the AVL property, if one child has height h - 1, the minimum height of the other child is h - 2. By creating a tree whose root has a left sub-tree with N_{h-1} nodes and a right sub-tree with N_{h-2} nodes, we have constructed the AVL tree of height h with the fewest nodes possible. This tree has N_{h-1} + N_{h-2} + 1 nodes in total (N_{h-1} and N_{h-2} coming from the sub-trees at the children of the root, and 1 coming from the root itself).

The base cases are N_1 = 1 and N_2 = 2. From here we can iteratively construct N_h using the recurrence N_h = N_{h-1} + N_{h-2} + 1 derived above. For example, N_3 = N_2 + N_1 + 1 = 4 and N_4 = N_3 + N_2 + 1 = 7.

Using this recurrence, we can then reduce as follows:

    N_h = N_{h-1} + N_{h-2} + 1                            (1)
    N_{h-1} = N_{h-2} + N_{h-3} + 1                        (2)
    N_h = (N_{h-2} + N_{h-3} + 1) + N_{h-2} + 1            (3)
    N_h > 2 N_{h-2}                                        (4)
    N_h > 2^{h/2}          (applying (4) repeatedly)       (5)
    log N_h > log 2^{h/2} = h/2                            (6)
    2 log N_h > h                                          (7)
    h = O(log N_h)                                         (8)

showing that the height of an AVL tree is indeed O(log n).

AVL Rotation

We've shown that if we can indeed keep the tree balanced, we can keep the height of the tree at O(log n), which speeds up the worst-case runtime of the tree operations. The next step is to show how to keep the tree balanced as we insert and delete nodes.

Since we need to maintain the property that the heights of the children differ by no more than 1 at every node, it is useful to be able to read off a node's height without examining the entire branch it sits on. Recall that for a binary search tree, each node contains a key, left, right, and a parent. AVL tree nodes contain an additional field, height, to help us keep track of balance.

There are two operations needed to balance an AVL tree: a left rotation and a right rotation. Rotations simply re-arrange the nodes of a tree to shift heights around while preserving the order of the elements. Performing a rotation requires re-assigning left, right, and parent for a few nodes, but nothing more than that. Rotations are O(1) time operations.
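To make the bookkeeping concrete, here is a minimal Python sketch of an AVL node and the two rotations, assuming the height field described above. The names AVLNode, rotate_left, rotate_right, and update_height are illustrative choices, not from the original notes.

    class AVLNode:
        """A binary search tree node augmented with a height field."""
        def __init__(self, key):
            self.key = key
            self.left = None
            self.right = None
            self.parent = None
            self.height = 1   # height counted in nodes, so a single node has height 1

    def height(node):
        # Convention: an empty sub-tree has height 0.
        return node.height if node is not None else 0

    def update_height(node):
        node.height = 1 + max(height(node.left), height(node.right))

    def rotate_left(root, x):
        """Left-rotate around x; x's right child y becomes the sub-tree root.
        Only a constant number of left/right/parent pointers change, so this
        is an O(1) operation.  Returns the (possibly new) tree root."""
        y = x.right
        x.right = y.left              # y's left sub-tree becomes x's right sub-tree
        if y.left is not None:
            y.left.parent = x
        y.parent = x.parent           # hook y up to x's old parent
        if x.parent is None:
            root = y                  # x was the tree root
        elif x is x.parent.left:
            x.parent.left = y
        else:
            x.parent.right = y
        y.left = x                    # x becomes y's left child
        x.parent = y
        update_height(x)              # x is now below y, so update it first
        update_height(y)
        return root

    def rotate_right(root, x):
        """Mirror image of rotate_left: x's left child y becomes the sub-tree root."""
        y = x.left
        x.left = y.right
        if y.right is not None:
            y.right.parent = x
        y.parent = x.parent
        if x.parent is None:
            root = y
        elif x is x.parent.left:
            x.parent.left = y
        else:
            x.parent.right = y
        y.right = x
        x.parent = y
        update_height(x)
        update_height(y)
        return root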
AVL Insertion

Now delegating to recitation notes from fall of 2009:

When we insert a node into node n, there are three possible cases.

1. The children of n have the same height k. Inserting into either sub-tree still results in a valid AVL tree.

2. The left child of n is heavier than the right child. Inserting into the left child may imbalance the AVL tree.

3. The right child of n is heavier than the left child. Inserting into the right child may imbalance the AVL tree.

When the AVL tree becomes imbalanced, we must make rotations to re-arrange the nodes so that the tree is balanced once again. Note that adding a node to a height-k tree does not always turn it into a height-(k+1) tree, since the node may be inserted on a shorter branch of that tree. For now, however, we look specifically at the situations where adding a node to a height-k tree does turn it into a height-(k+1) tree. Let's examine the case where we insert a node into a heavy right child.

There are two cases here that will imbalance the AVL tree; we look at them one at a time. [Figures omitted in this preview: node n has left sub-tree A and right child y, whose sub-trees are B and C.]

In the first case, B had height k - 1, C had height k - 1, and a node was inserted into C, making its height k. We perform a left rotation on n, making y the new root of the sub-tree and shifting B over to become n's child. The order of the elements is preserved (in both trees, A < n < B < y < C), but after the rotation we get a balanced tree.

In the second case, B had height k - 1, C had height k - 1, and a node was inserted into B, making its height k. In this case no single rotation produces a balanced tree, but if we make a right rotation on y and then a left rotation on x (x labels the same node as n in the omitted figure), we end up with a happy AVL tree.

If we insert a node into a heavy left child instead, the balancing solutions are mirrored (right rotations instead of left rotations and vice versa), but the same concepts apply.

AVL insertions are binary search tree insertions plus at most two rotations. Since binary search tree insertions take O(h) time, rotations take O(1) time, and AVL trees have h = O(log n), AVL insertions take O(log n) time.

There are only a finite number of ways to imbalance an AVL tree after an insertion. AVL insertion is simply a matter of identifying whether the insertion imbalances the tree, figuring out which imbalance case it causes, and making the rotations that fix that particular case (a code sketch of this case analysis appears at the end of these notes).

AVL Deletion

AVL deletion is not too different from insertion. The key difference is that, unlike an insertion, which can imbalance only one node at a time, a single deletion may imbalance several of the deleted node's ancestors. Thus, when we delete a node and cause an imbalance at that node's parent, not only do we make the necessary rotation at the parent, but we also traverse up the ancestry line, checking the balance at each node and possibly making more rotations to fix the AVL tree.

Fixing the AVL tree after a deletion may require O(log n) additional rotations, but since rotations are O(1) operations, the extra rotations do not affect the overall O(log n) runtime of the deletion.
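The following Python sketch illustrates the insertion case analysis described above, applied at a node whose sub-trees now differ in height by two. It reuses the hypothetical AVLNode, height, update_height, rotate_left, and rotate_right helpers from the earlier sketch; the names balance and rebalance are illustrative, not from the original notes.

    def balance(node):
        # Positive means the right sub-tree is heavier, negative means the left is.
        return height(node.right) - height(node.left)

    def rebalance(root, x):
        """Restore the AVL property at node x when one of its sub-trees has
        become two levels taller than the other.  Returns the tree root."""
        if balance(x) == 2:                          # right child y is heavy
            if balance(x.right) >= 0:
                # First case: the height increase is in the outer sub-tree C,
                # so a single left rotation on x suffices.
                root = rotate_left(root, x)
            else:
                # Second case: the height increase is in the inner sub-tree B,
                # so right-rotate y = x.right, then left-rotate x.
                root = rotate_right(root, x.right)
                root = rotate_left(root, x)
        elif balance(x) == -2:                       # left child is heavy: mirror image
            if balance(x.left) <= 0:
                root = rotate_right(root, x)
            else:
                root = rotate_left(root, x.left)
                root = rotate_right(root, x)
        return root

After an insertion, this fix is needed at most once, at the lowest imbalanced ancestor, which is why an AVL insertion costs at most two rotations on top of the ordinary binary search tree insertion.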

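For deletion, the notes describe traversing the ancestry line and rebalancing wherever the AVL property is violated. A minimal sketch of that loop, reusing the hypothetical helpers above and assuming the walk starts at the parent of the node that was physically removed:

    def fix_up_after_delete(root, node):
        """Walk from 'node' up to the root, refreshing heights and rebalancing
        any ancestor whose children now differ in height by two.  At most
        O(log n) ancestors are visited and each rotation costs O(1)."""
        while node is not None:
            update_height(node)
            if abs(balance(node)) == 2:
                root = rebalance(root, node)
            node = node.parent
        return root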
