Sorting
Dr. Bernard Chen Ph.D.
University of Central Arkansas

Insertion Sort I
- The list is assumed to be broken into a sorted portion and an unsorted portion.
- Keys will be inserted from the unsorted portion into the sorted portion.

Insertion Sort II
- For each new key, search backward through the sorted keys.
- Move keys until the proper position is found.
- Place the key in its proper position.

Insertion Sort: Code

    template <class Comparable>
    void insertionSort( vector<Comparable> & a )
    {
        for( int p = 1; p < a.size( ); p++ )    // fixed n-1 iterations
        {
            Comparable tmp = a[ p ];
            int j;
            // search for the proper position for the new key
            // (worst case: i-1 comparisons for the i-th key)
            for( j = p; j > 0 && tmp < a[ j - 1 ]; j-- )
                a[ j ] = a[ j - 1 ];            // move the current key to the right
            a[ j ] = tmp;                       // insert the new key in its proper position
        }
    }

Insertion Sort: Analysis
- Worst case: keys are in reverse order.
- Do i-1 comparisons for each new key, where i runs from 2 to n.
- Total comparisons: 1 + 2 + 3 + ... + (n-1) = sum over i = 2..n of (i-1) = n(n-1)/2, which is Θ(n²).

Optimality Analysis I
- To discover an optimal algorithm we need to find an upper and a lower asymptotic bound for a problem.
- An algorithm gives us an upper bound.
- The worst case for sorting cannot exceed O(n²), because we have Insertion Sort, which runs that fast.
- Lower bounds require mathematical arguments.

Other Assumptions
- The only operation used for sorting the list is swapping two keys.
- Only adjacent keys can be swapped.
- This is true for Insertion Sort and Bubble Sort.

Shell Sort
- With insertion sort, each time we insert an element, other elements get nudged one step closer to where they ought to be.
- What if we could move elements a much longer distance each time?
- We could move each element a long distance, then a somewhat shorter distance, then a shorter distance still.
- This approach is what makes Shellsort so much faster than insertion sort.

Sorting nonconsecutive subarrays
- Here is an array to be sorted (the numbers aren't important).
- Consider just the red locations. Suppose we do an insertion sort on just these numbers, as if they were the only ones in the array.
- Now consider just the yellow locations; we do an insertion sort on just these numbers.
- Now do the same for each additional group of numbers.
- The resulting array is sorted within groups, but not overall.

Doing the 1-sort
- In the previous slide, we compared numbers that were spaced every 5 locations: this is a 5-sort.
- Ordinary insertion sort is just like this, only the numbers are spaced 1 apart; we can think of it as a 1-sort.
- Suppose, after doing the 5-sort, we do a 1-sort? In general, we would expect each insertion to involve moving fewer numbers out of the way, and the array would end up completely sorted.

Example of shell sort
    original: 81 94 11 96 12 35 17 95 28 58 41 75 15
    5-sort:   35 17 11 28 12 41 75 15 96 58 81 94 95
    3-sort:   28 12 11 35 15 41 58 17 94 75 81 96 95
    1-sort:   11 12 15 17 28 35 41 58 75 81 94 95 96

Diminishing gaps
- For a large array, we don't want a 5-sort; we want an N-sort, where N depends on the size of the array. N is called the gap size, or interval size.
- We may want to do several stages, reducing the gap size each time. For example, on a 1000-element array, we may want to do a
364-sort, then a 121-sort, then a 40-sort, then a 13-sort, then a 4-sort, then a 1-sort. Why these numbers?

Increment sequence
- No one knows the optimal sequence of diminishing gaps.
- This sequence is attributed to Donald E. Knuth: start with h = 1 and repeatedly compute h = 3h + 1, giving 1, 4, 13, 40, 121, 364, 1093. This sequence seems to work very well.
- Another increment sequence, mentioned in the textbook, is based on the following formula: start with h = half the container's size, then take h_i = floor(h_{i-1} / 2.2). It turns out that just cutting the gap in half each time does not work out as well.

Analysis
- What is the real running time of Shellsort? Nobody knows!
- Experiments suggest something like O(n^(3/2)) or O(n^(7/6)).
- Analysis isn't always easy!

Merge Sort
- If the list has only one element, do nothing.
- Otherwise: split the list in half, recursively sort both halves, and merge the sorted halves.

Merging
- Merge: keep track of the smallest remaining element in each sorted half.
- Insert the smaller of the two elements into the auxiliary array.
- Repeat until done.

Example: merging the sorted halves A G L O R and H I M S T. The auxiliary array grows one element per step:

    A
    A G
    A G H
    A G H I
    A G H I L
    A G H I L M
    ...