Lower Bounds & Sorting in Linear Time

Many of the slides are from Prof. Plaisted's resources at University of North Carolina at Chapel Hill.

Comparison-based Sorting
Comparison sort:
» Only comparisons of pairs of elements may be used to gain order information about a sequence.
» Hence, a lower bound on the number of comparisons is a lower bound on the complexity of any comparison-based sorting algorithm.
All our sorts so far have been comparison sorts.
The best worst-case complexity so far is Θ(n lg n) (merge sort and heapsort).
We prove a lower bound of Ω(n lg n) for any comparison sort, implying that merge sort and heapsort are asymptotically optimal.

Decision Tree
A binary-tree abstraction for any comparison sort. It represents the comparisons made by
» a specific sorting algorithm
» on inputs of a given size.
It abstracts away everything else (control and data movement), counting only comparisons.
Each internal node is annotated i:j, where i and j are indices of array elements in their original positions.
Each leaf is annotated by a permutation π(1), π(2), …, π(n): the order that the algorithm determines.

Decision Tree – Example
For insertion sort operating on three elements (take the left branch when the comparison yields ≤, the right branch when it yields >):

                    1:2
                   /   \
                2:3     1:3
                / \     / \
          ⟨1,2,3⟩ 1:3 ⟨2,1,3⟩ 2:3
                 / \         / \
           ⟨1,3,2⟩ ⟨3,1,2⟩ ⟨2,3,1⟩ ⟨3,2,1⟩

The tree contains 3!
= 6 leaves.

Decision Tree (Contd.)
An execution of the sorting algorithm corresponds to tracing a path from the root to a leaf; the tree models all possible execution traces.
At each internal node, a comparison a_i ≤ a_j is made.
» If a_i ≤ a_j, follow the left subtree; otherwise, follow the right subtree.
» View the tree as if the algorithm splits in two at each node, based on the information it has determined up to that point.
When we reach a leaf, the ordering a_π(1) ≤ a_π(2) ≤ … ≤ a_π(n) has been established.
A correct sorting algorithm must be able to produce any permutation of its input.
» Hence, each of the n! permutations must appear at one or more of the leaves of the decision tree.

A Lower Bound for Worst Case
The worst-case number of comparisons for a sorting algorithm is
» the length of the longest path from the root to any of the leaves in the decision tree for the algorithm,
• which is the height of its decision tree.
A lower bound on the running time of any comparison sort is therefore given by
» a lower bound on the heights of all decision trees in which each permutation appears as a reachable leaf.

Optimal sorting for three elements
Any decision tree that sorts three elements has 3! = 6 leaves, and hence at least 5 internal nodes. Since a binary tree of height 2 has at most 4 leaves, there must be a worst-case path of length ≥ 3.

A Lower Bound for Worst Case
Theorem 8.1: Any comparison sort algorithm requires Ω(n lg n) comparisons in the worst case.
Proof: From the previous discussion, it suffices to bound from below the height of a decision tree.
Let h be the height and l the number of reachable leaves of the decision tree.
In a decision tree for n elements, l ≥ n!. (Why? Every permutation must appear as a reachable leaf.)
In a binary tree of height h, the number of leaves is l ≤ 2^h. (Prove it by induction on h.)
Hence, n! ≤ l ≤ 2^h.

Proof – Contd.
From n! ≤ l ≤ 2^h we get 2^h ≥ n!.
Taking logarithms, h ≥ lg(n!).
n! > (n/e)^n (Stirling's approximation, Eq. 3.19).
Hence, h ≥ lg(n!)
≥ lg((n/e)^n) = n lg n − n lg e = Ω(n lg n). ∎

Non-comparison Sorts: Counting Sort
Depends on a key assumption: the numbers to be sorted are integers in {0, 1, 2, …, k}.
Input: A[1..n], where A[j] ∈ {0, 1, 2, …, k} for j = 1, 2, …, n. Array A and the values n and k are given as parameters.
Output: B[1..n], sorted. B is assumed to be already allocated and is given as a parameter.
Auxiliary storage: C[0..k].
Runs in linear time if k = O(n).

Counting-Sort (A, B, k)
CountingSort(A, B, k)
1. for i ← 0 to k                ▹ O(k)
2.     do C[i] ← 0
3. for j ← 1 to length[A]        ▹ O(n)
4.     do C[A[j]] ← C[A[j]] + 1
5. for i ← 1 to k                ▹ O(k)
6.     do C[i] ← C[i] + C[i − 1]
7. for j ← length[A] downto 1    ▹ O(n)
8.     do B[C[A[j]]] ← A[j]
9.        C[A[j]] ← C[A[j]] − 1

Algorithm Analysis
» The for-loop of lines 1–2 takes time O(k).
» The for-loop of lines 3–4 takes time O(n).
» The for-loop of lines 5–6 takes time O(k).
» The for-loop of lines 7–9 takes time O(n).
The overall time is O(n + k); when k = O(n), the worst case is O(n).
Counting sort is stable, but not in place.
No comparisons are made: it uses the actual values of the elements to index into an array.

Radix Sort
It was used by card-sorting machines, which worked on one column at a time; it is the algorithm for using the machine that extends the technique to multi-column sorting. The human operator was part of the algorithm!
Key idea: sort on the "least significant digit" first and on the remaining digits in sequential order.
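The CountingSort pseudocode earlier translates almost line for line into Python. A minimal sketch, assuming integer keys in 0..k, using 0-indexed lists and returning a new list instead of filling a preallocated B:

```python
def counting_sort(a, k):
    # Count occurrences of each key value 0..k (lines 1-4 of the pseudocode).
    c = [0] * (k + 1)
    for x in a:
        c[x] += 1
    # Prefix sums: c[v] becomes the number of elements <= v (lines 5-6).
    for v in range(1, k + 1):
        c[v] += c[v - 1]
    # Place each element into the output, scanning right to left so that
    # equal keys keep their original relative order (lines 7-9).
    b = [0] * len(a)
    for x in reversed(a):
        c[x] -= 1
        b[c[x]] = x
    return b
```

The reverse scan in the final loop is what makes the sort stable, which is exactly the property radix sort needs from its per-digit subroutine.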
» The sorting method used to sort each digit must be stable.
» If we started with the "most significant digit", we would need extra storage.

An Example

  Input   After sorting   After sorting      After sorting
          on LSD          on middle digit    on MSD
  392     631             928                356
  356     392             631                392
  446     532             532                446
  928     495             446                495
  631     356             356                532
  532     446             392                631
  495     928             495                928

Radix-Sort (A, d)
RadixSort(A, d)
1. for i ← 1 to d
2.     do use a stable sort to sort array A on digit i

Correctness of Radix Sort
By induction on the number of digits sorted.
Assume that radix sort works for d − 1 digits; show that it works for d digits.
A radix sort of d digits ≡ a radix sort of the low-order d − 1 digits followed by a stable sort on digit d.
After the final pass, elements that differ in digit d are ordered by digit d; elements that agree on digit d keep their relative order (the sort is stable), which by the inductive hypothesis is sorted on the low-order d − 1 digits. Hence the array is sorted on all d digits.
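RadixSort can be sketched in Python with counting sort as the stable per-digit subroutine; the helper and its `exp` parameter (1 for the least significant decimal digit, 10 for the next, and so on) are illustrative names, not from the slides:

```python
def counting_sort_by_digit(a, exp):
    # Stable counting sort keyed on the decimal digit selected by exp.
    c = [0] * 10
    for x in a:
        c[(x // exp) % 10] += 1
    for d in range(1, 10):
        c[d] += c[d - 1]
    b = [0] * len(a)
    for x in reversed(a):          # reverse scan preserves stability
        d = (x // exp) % 10
        c[d] -= 1
        b[c[d]] = x
    return b

def radix_sort(a, ndigits):
    # Sort least significant digit first, as in RadixSort(A, d).
    for i in range(ndigits):
        a = counting_sort_by_digit(a, 10 ** i)
    return a
```

Running this on the example input [392, 356, 446, 928, 631, 532, 495] reproduces the table above: the first pass yields [631, 392, 532, 495, 356, 446, 928], and after all three passes the array is fully sorted.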
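As a closing numeric check of Theorem 8.1, the decision-tree bound ⌈lg(n!)⌉ can be computed directly (the function name below is illustrative, not from the slides):

```python
import math

def min_worst_case_comparisons(n):
    # Any decision tree sorting n elements has at least n! reachable
    # leaves, so its height is at least ceil(lg(n!)) comparisons.
    return math.ceil(math.log2(math.factorial(n)))
```

For n = 3 this gives ⌈lg 6⌉ = 3, matching the worst-case path of length 3 for sorting three elements; the value grows as Θ(n lg n), consistent with the lower bound.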