1/10/2003, Algorithm Theory, Hiroaki Kobayashi
Sorting
Outline
Sorting with a priority queue
Selection-sort
Insertion-sort
Heap sort
Quick sort
Merge sort
Lower bound on comparison-based sorting
Bucket sort and radix sort
Sorting with a Priority Queue
We can use a priority queue to sort a set of comparable elements:
Insert the elements one by one with a series of insertItem(e, e) operations.
Remove the elements in sorted order with a series of removeMin() operations.
The running time of this sorting method depends on the priority queue implementation
Algorithm PQ-Sort(S, C)
  Input: sequence S, comparator C for the elements of S
  Output: sequence S sorted in increasing order according to C
  P ← priority queue with comparator C
  while ¬S.isEmpty()
    e ← S.remove(S.first())
    P.insertItem(e, e)
  while ¬P.isEmpty()
    e ← P.removeMin()
    S.insertLast(e)
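As a concrete illustration, PQ-Sort can be sketched in Python using the standard heapq module as the priority queue (a heap-based implementation, anticipating the heap-sort section; the function name pq_sort is ours):

```python
import heapq

def pq_sort(seq):
    """Sort a list by inserting every element into a priority queue,
    then removing the minimum repeatedly (here the PQ is a binary heap)."""
    pq = []
    for e in seq:                 # phase 1: n insertItem operations
        heapq.heappush(pq, e)
    out = []
    while pq:                     # phase 2: n removeMin operations
        out.append(heapq.heappop(pq))
    return out
```

With a heap both phases cost O(n log n) overall; swapping in an unsorted or sorted list as the queue gives selection-sort and insertion-sort, as the next slides show.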
Sequence-based Priority Queue

Implementation with an unsorted list (e.g., 4 5 2 3 1):
insertItem takes O(1) time, since we can insert the item at the beginning or end of the sequence.
removeMin, minKey and minElement take O(n) time, since we have to traverse the entire sequence to find the smallest key.

Implementation with a sorted list (e.g., 1 2 3 4 5):
insertItem takes O(n) time, since we have to find the place where to insert the item.
removeMin, minKey and minElement take O(1) time, since the smallest key is at the beginning of the sequence.
Selection-Sort

Selection-sort is the variation of PQ-sort where the priority queue is implemented with an unsorted sequence.
Running time of Selection-sort:
Inserting the elements into the priority queue with n insertItem operations takes O(n) time.
Removing the elements in sorted order from the priority queue with n removeMin operations takes time proportional to n + (n − 1) + … + 2 + 1 = n(n + 1)/2.
Selection-sort runs in O(n^2) time.
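A minimal in-place Python sketch of selection-sort (the unsorted suffix plays the role of the priority queue, and each pass is one O(n) removeMin scan; the name selection_sort is ours):

```python
def selection_sort(seq):
    """PQ-sort with an unsorted list as the priority queue: each step
    scans the remaining elements for the smallest (removeMin) and
    swaps it to the front of the unsorted portion."""
    a = list(seq)
    for i in range(len(a)):
        m = min(range(i, len(a)), key=a.__getitem__)  # O(n) scan
        a[i], a[m] = a[m], a[i]
    return a
```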
Insertion-Sort

Insertion-sort is the variation of PQ-sort where the priority queue is implemented with a sorted sequence.
Running time of Insertion-sort:
Inserting the elements into the priority queue with n insertItem operations takes time proportional to 1 + 2 + … + n = n(n + 1)/2.
Removing the elements in sorted order from the priority queue with a series of n removeMin operations takes O(n) time.
Insertion-sort runs in O(n^2) time.
In-place Insertion-Sort

Instead of using an external data structure, we can implement selection-sort and insertion-sort in-place: a portion of the input sequence itself serves as the priority queue. For in-place insertion-sort:
We keep the initial portion of the sequence sorted.
We can use swapElements instead of modifying the sequence.
Example trace (the sorted prefix grows from the left):
5 4 2 3 1
4 5 2 3 1
2 4 5 3 1
2 3 4 5 1
1 2 3 4 5
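The trace above can be reproduced with a short in-place Python sketch that, like the slide's swapElements idea, only ever swaps adjacent elements (the name insertion_sort_in_place is ours):

```python
def insertion_sort_in_place(a):
    """In-place insertion-sort: the prefix a[0:i] is kept sorted, and
    each new element is swapped leftward until it is in position."""
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]   # swap toward the front
            j -= 1
    return a
```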
Heap Sort
[Heap diagram: root 2; children 6 and 5; children of 6: 9 and 7]
What is a Heap

A heap is a binary tree storing keys at its internal nodes and satisfying the following properties:
Heap-Order: for every internal node v other than the root, key(v) ≥ key(parent(v)).
Complete Binary Tree: let h be the height of the heap; for i = 0, …, h − 1, there are 2^i nodes of depth i, and at depth h − 1 the internal nodes are to the left of the external nodes.
The last node of a heap is the rightmost internal node of depth h − 1. [Diagram: in the heap above, the last node is 7]
Height of a Heap

Theorem: A heap storing n keys has height O(log n).
Proof (we apply the complete binary tree property): Let h be the height of a heap storing n keys. Since there are 2^i keys at depth i = 0, …, h − 2 and at least one key at depth h − 1, we have n ≥ 1 + 2 + 4 + … + 2^(h−2) + 1. Thus, n ≥ 2^(h−1), i.e., h ≤ log n + 1.
[Diagram: keys per depth; depth 0: 1 key, depth 1: 2 keys, …, depth h − 2: 2^(h−2) keys, depth h − 1: at least 1 key]
Heaps and Priority Queues

We can use a heap to implement a priority queue:
We store a (key, element) item at each internal node.
We keep track of the position of the last node.
For simplicity, we show only the keys in the pictures.
[Heap: (2, Sue) at the root; children (6, Mark) and (5, Pat); children of (6, Mark): (9, Jeff) and (7, Anna)]
Heap-Sort
Consider a priority queue with n items implemented by means of a heap
the space used is O(n)
methods insertItem and removeMin take O(log n) time
methods size, isEmpty, minKey, and minElement take O(1) time
Using a heap-based priority queue, we can sort a sequence of n elements in O(n log n) time. The resulting algorithm is called heap-sort. Heap-sort is much faster than quadratic sorting algorithms, such as insertion-sort and selection-sort.
Insertion into a Heap

Method insertItem of the priority queue ADT corresponds to the insertion of a key k into the heap. The insertion algorithm consists of three steps:
Find the insertion node z (the new last node).
Store k at z and expand z into an internal node.
Restore the heap-order property (discussed next).
[Diagram: in the heap 2 / 6, 5 / 9, 7 the insertion node z is the next free position; after storing the new key 1 at z, the bottom level reads 9, 7, 1]
Upheap

After the insertion of a new key k, the heap-order property may be violated. Algorithm upheap restores the heap-order property by swapping k along an upward path from the insertion node. Upheap terminates when the key k reaches the root or a node whose parent has a key smaller than or equal to k. Since a heap has height O(log n), upheap runs in O(log n) time.
[Diagram: upheap after inserting key 1: the tree 2 / 1, 5 / 9, 7, 6 becomes 1 / 2, 5 / 9, 7, 6]
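A minimal Python sketch of upheap over an array-encoded heap (note: 0-based list indices here, unlike the 1-based vector ranks used later in the slides; the function name upheap is ours):

```python
def upheap(heap, i):
    """Restore heap order after a new key is placed at index i:
    swap the key upward while it is smaller than its parent."""
    while i > 0:
        parent = (i - 1) // 2
        if heap[i] < heap[parent]:
            heap[i], heap[parent] = heap[parent], heap[i]
            i = parent
        else:
            break

# Insert key 1 into the heap 2 / 5, 6 / 9, 7 and restore heap order.
h = [2, 5, 6, 9, 7]
h.append(1)
upheap(h, len(h) - 1)
```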
Removal from a Heap

Method removeMin of the priority queue ADT corresponds to the removal of the root key from the heap. The removal algorithm consists of three steps:
Replace the root key with the key of the last node w.
Compress w and its children into a leaf.
Restore the heap-order property (discussed next).
[Diagram: removing the root 2 from the heap 2 / 6, 5 / 9, 7: the key 7 of the last node w replaces the root, giving 7 / 6, 5 / 9]
Downheap

After replacing the root key with the key k of the last node, the heap-order property may be violated. Algorithm downheap restores the heap-order property by swapping key k along a downward path from the root. Downheap terminates when key k reaches a leaf or a node whose children have keys greater than or equal to k. Since a heap has height O(log n), downheap runs in O(log n) time.
[Diagram: downheap of key 7: the tree 7 / 6, 5 / 9 becomes 5 / 6, 7 / 9]
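The matching Python sketch of downheap (again 0-based list indices; the function name downheap is ours):

```python
def downheap(heap, i=0):
    """Restore heap order below index i: swap the key downward with
    its smaller child until both children are >= it."""
    n = len(heap)
    while True:
        l, r = 2 * i + 1, 2 * i + 2
        smallest = i
        if l < n and heap[l] < heap[smallest]:
            smallest = l
        if r < n and heap[r] < heap[smallest]:
            smallest = r
        if smallest == i:
            return
        heap[i], heap[smallest] = heap[smallest], heap[i]
        i = smallest

# The removal example: last key 7 has replaced the removed root.
h = [7, 5, 6, 9]
downheap(h)
```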
Updating the Last Node

The insertion node can be found by traversing a path of O(log n) nodes:
Go up until a left child or the root is reached.
If a left child is reached, go to the right child.
Go down left until a leaf is reached.
A similar algorithm works for updating the last node after a removal.
Vector-based Heap Implementation
We can represent a heap with n keys by means of a vector of length n + 1. For the node at rank i:
the left child is at rank 2i
the right child is at rank 2i + 1
Links between nodes are not explicitly stored. The leaves are not represented. The cell at rank 0 is not used. Operation insertItem corresponds to inserting at rank n + 1. Operation removeMin corresponds to removing at rank n. This yields in-place heap-sort.
[Vector representation: rank 0 unused; ranks 1 to 5 hold the keys 2 5 6 9 7]
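The rank arithmetic from this slide is easy to check directly in Python with the slide's 1-based layout (cell 0 unused; the helper names are ours):

```python
# 1-based vector layout: cell 0 is unused, ranks 1..n hold the keys.
heap = [None, 2, 5, 6, 9, 7]

def left(i):   return 2 * i        # rank of the left child
def right(i):  return 2 * i + 1    # rank of the right child
def parent(i): return i // 2       # rank of the parent
```

For example, the children of the root (rank 1) sit at ranks 2 and 3, and every non-root key is at least its parent's key, which is exactly the heap-order property.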
Merging Two Heaps

We are given two heaps and a key k. We create a new heap with the root node storing k and with the two heaps as subtrees. We perform downheap to restore the heap-order property.
[Example: the heaps 3 / 5, 8 and 2 / 6, 4 are merged under a new root storing key 7; downheap then yields 2 / 3, 4 / 5, 8, 6, 7]
Bottom-up Heap Construction

We can construct a heap storing n given keys using a bottom-up construction with log n phases. In phase i, pairs of heaps with 2^i − 1 keys are merged into heaps with 2^(i+1) − 1 keys.
Example: bottom-up construction of a heap with 15 keys. [Diagram: the 1-key heaps (16, 15), (12, 4), (7, 6), (20, 23) are paired under the new keys 25, 5, 11, 27, giving 3-key trees]
Example (contd.): [Diagram: down-heap bubbling within each 3-key tree restores heap order, producing the 3-key heaps 15 / 25, 16; 4 / 12, 5; 6 / 9, 11; 23 / 20, 27]
Example (contd.): [Diagram: the new keys 7 and 8 are placed above pairs of 3-key heaps; down-heap bubbling produces the 7-key heaps 4 / 15, 5 / 25, 16, 12, 7 and 6 / 8, 23 / 9, 11, 20, 27]
Example (end): [Diagram: the key 10 is placed above the two 7-key heaps; down-heap bubbling produces the final 15-key heap 4 / 5, 6 / 15, 7, 8, 23 / 25, 16, 12, 10, 9, 11, 20, 27]
Analysis

We visualize the worst-case time of a downheap with a proxy path that goes first right and then repeatedly goes left until the bottom of the heap (this path may differ from the actual downheap path). Since each node is traversed by at most two proxy paths, the total number of nodes on the proxy paths is O(n). Thus, bottom-up heap construction runs in O(n) time. Bottom-up heap construction is faster than n successive insertions and speeds up the first phase of heap-sort.
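The bottom-up construction can be sketched in Python by running a downheap from every internal node, deepest first (0-based array layout rather than the merge-of-trees picture; the name build_heap is ours):

```python
def build_heap(keys):
    """Bottom-up heap construction: downheap each internal node from
    the last internal node up to the root. Total time O(n)."""
    a = list(keys)
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        j = i                         # downheap from index i
        while True:
            l, r, s = 2 * j + 1, 2 * j + 2, j
            if l < n and a[l] < a[s]:
                s = l
            if r < n and a[r] < a[s]:
                s = r
            if s == j:
                break
            a[j], a[s] = a[s], a[j]
            j = s
    return a
```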
Summary

For sorting n elements using heap-sort:
First phase of heap-sort (heap construction): O(n).
Second phase of heap-sort (n removeMin operations): O(log n) each, O(n log n) total.
The heap-sort algorithm sorts a sequence S of n comparable elements in O(n log n) time.
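Both phases combine into a compact in-place sketch. One common convention (an assumption here, not stated on the slides, which describe a min-heap): for ascending in-place sorting one builds a max-heap and swaps the maximum to the end on each removal:

```python
def heap_sort(seq):
    """Heap-sort: O(n) bottom-up max-heap construction, then n
    remove-max steps of O(log n) each, all in place."""
    a = list(seq)

    def downheap(i, n):
        while True:
            l, r, big = 2 * i + 1, 2 * i + 2, i
            if l < n and a[l] > a[big]:
                big = l
            if r < n and a[r] > a[big]:
                big = r
            if big == i:
                return
            a[i], a[big] = a[big], a[i]
            i = big

    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # phase 1: build the heap
        downheap(i, n)
    for end in range(n - 1, 0, -1):       # phase 2: n removals
        a[0], a[end] = a[end], a[0]       # move the max to its place
        downheap(0, end)
    return a
```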
Quick-Sort
7 4 9 6 2 → 2 4 6 7 9
4 2 → 2 4 7 9 → 7 9
2 → 2 9 → 9
Divide-and-Conquer Strategy

Divide-and-conquer is a general algorithm design paradigm:
Divide: divide the input data S into two or more disjoint subsets S1, S2, …
Recur: solve the subproblems recursively.
Conquer: combine the solutions for S1, S2, …, into a solution for S.
The base cases for the recursion are subproblems of constant size. Analysis can be done using recurrence equations.
Quick-Sort

Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
Divide: pick a random element x (called the pivot) and partition S into
  L, the elements less than x
  E, the elements equal to x
  G, the elements greater than x
Recur: sort L and G.
Conquer: join L, E and G.
Partition

We partition an input sequence as follows:
We remove, in turn, each element y from S, and we insert y into L, E or G, depending on the result of the comparison with the pivot x.
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time. Thus, the partition step of quick-sort takes O(n) time.
Algorithm partition(S, p)
  Input: sequence S, position p of pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
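The divide/recur/conquer structure maps directly onto a short Python sketch of randomized quick-sort with the three-way L/E/G split (the name quick_sort is ours):

```python
import random

def quick_sort(S):
    """Randomized quick-sort: split around a random pivot x into
    L (< x), E (= x), G (> x); recur on L and G; join the results."""
    if len(S) <= 1:
        return list(S)
    x = random.choice(S)               # random pivot
    L = [y for y in S if y < x]
    E = [y for y in S if y == x]
    G = [y for y in S if y > x]
    return quick_sort(L) + E + quick_sort(G)
```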
Quick-Sort Tree

An execution of quick-sort is depicted by a binary tree:
Each node represents a recursive call of quick-sort and stores the unsorted sequence before the execution and its pivot, and the sorted sequence at the end of the execution.
The root is the initial call. The leaves are calls on subsequences of size 0 or 1.
7 4 9 6 2 → 2 4 6 7 9
4 2 → 2 4 7 9 → 7 9
2 → 2 9 → 9
Execution Example

The original slides animate one execution of quick-sort on the input sequence 7 2 9 4 3 7 6 1, frame by frame: pivot selection; partition, recursive call, pivot selection; partition, recursive call, base case; recursive call, …, base case, join; recursive call, pivot selection; partition, …, recursive call, base case; and finally join, join. The completed quick-sort tree:
7 2 9 4 3 7 6 1 → 1 2 3 4 6 7 7 9
2 4 3 1 → 1 2 3 4    7 9 7 → 7 7 9
1 → 1    4 3 → 3 4    9 → 9
4 → 4
Worst-case Running Time

The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element. One of L and G then has size n − 1 and the other has size 0. The running time is proportional to the sum n + (n − 1) + … + 2 + 1. Thus, the worst-case running time of quick-sort is O(n^2).
[Table: depth 0, time n; depth 1, time n − 1; …; depth n − 1, time 1]
Expected Running Time

Consider a recursive call of quick-sort on a sequence of size s:
Good call: the sizes of L and G are each less than 3s/4.
Bad call: one of L and G has size greater than 3s/4.
A call is good with probability 1/2, since 1/2 of the possible pivots cause good calls. [Example: for a sequence of 16 elements, the middle pivot ranks 5 to 12 are good pivots, while ranks 1 to 4 and 13 to 16 are bad pivots]
Expected Running Time, Part 2

Probabilistic fact: the expected number of coin tosses required in order to get k heads is 2k. For a node of depth i, we expect that i/2 of its ancestors are good calls, so the size of the input sequence for the current call is at most (3/4)^(i/2) n.
[Diagram: quick-sort tree with expected height O(log n) and time O(n) per level; total expected time O(n log n)]
Therefore, we have:
For a node of depth 2 log_{4/3} n, the expected input size is one.
The expected height of the quick-sort tree is O(log n).
The amount of work done at the nodes of the same depth is O(n).
Thus, the expected running time of quick-sort is O(n log n).
In-Place Quick-Sort

Quick-sort can be implemented to run in-place. In the partition step, we use replace operations to rearrange the elements of the input sequence such that:
the elements less than the pivot have rank less than h
the elements equal to the pivot have rank between h and k
the elements greater than the pivot have rank greater than k
The recursive calls consider:
elements with rank less than h
elements with rank greater than k
Algorithm inPlaceQuickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r
    return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← inPlacePartition(x)
  inPlaceQuickSort(S, l, h − 1)
  inPlaceQuickSort(S, k + 1, r)
In-Place Partitioning

Perform the partition using two indices to split S into L and E ∪ G (a similar method can split E ∪ G into E and G). Repeat until j and k cross:
Scan j to the right until finding an element > x.
Scan k to the left until finding an element < x.
Swap the elements at indices j and k.
[Example: pivot x = 6; sequence 3 2 5 1 0 7 3 5 9 2 7 9 8 9 7 6 9, with indices j and k scanning inward from the two ends]
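The two-index scan can be sketched in Python. This version is the classic two-way (Hoare-style) variant rather than the slides' three-way (h, k) split, so equal keys may end up on either side; the recursion structure is otherwise the same (the function name is ours):

```python
import random

def in_place_quick_sort(a, l=0, r=None):
    """In-place quick-sort with a two-index partition: j scans right
    for an element >= the pivot, k scans left for one <= it, and the
    two are swapped until the indices cross."""
    if r is None:
        r = len(a) - 1
    if l >= r:
        return a
    x = a[random.randint(l, r)]        # random pivot value
    j, k = l, r
    while j <= k:
        while a[j] < x:
            j += 1
        while a[k] > x:
            k -= 1
        if j <= k:
            a[j], a[k] = a[k], a[j]
            j += 1
            k -= 1
    in_place_quick_sort(a, l, k)       # recur on the left part
    in_place_quick_sort(a, j, r)       # recur on the right part
    return a
```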
Merge Sort
7 2 9 4 → 2 4 7 9
7 2 → 2 7 9 4 → 4 9
7 → 7 2 → 2 9 → 9 4 → 4
Divide-and-Conquer Strategy

Divide-and-conquer is a general algorithm design paradigm:
Divide: divide the input data S into two disjoint subsets S1 and S2.
Recur: solve the subproblems associated with S1 and S2.
Conquer: combine the solutions for S1 and S2 into a solution for S.
The base cases for the recursion are subproblems of size 0 or 1.
Merge-sort is a sorting algorithm based on the divide-and-conquer paradigm. Like heap-sort:
It uses a comparator.
It has O(n log n) running time.
Unlike heap-sort:
It does not use an auxiliary priority queue.
It accesses data in a sequential manner (suitable to sort data on a disk).
Merge-Sort

Merge-sort on an input sequence S with n elements consists of three steps:
Divide: partition S into two sequences S1 and S2 of about n/2 elements each.
Recur: recursively sort S1 and S2.
Conquer: merge S1 and S2 into a unique sorted sequence.
Algorithm mergeSort(S, C)
  Input: sequence S with n elements, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    mergeSort(S1, C)
    mergeSort(S2, C)
    S ← merge(S1, S2)
Merging Two Sorted Sequences

The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B. Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time.
Algorithm merge(A, B)
  Input: sequences A and B with n/2 elements each
  Output: sorted sequence of A ∪ B
  S ← empty sequence
  while ¬A.isEmpty() ∧ ¬B.isEmpty()
    if A.first().element() < B.first().element()
      S.insertLast(A.remove(A.first()))
    else
      S.insertLast(B.remove(B.first()))
  while ¬A.isEmpty()
    S.insertLast(A.remove(A.first()))
  while ¬B.isEmpty()
    S.insertLast(B.remove(B.first()))
  return S
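The merge routine and the divide/recur/conquer scheme translate into a short Python sketch (lists stand in for the linked sequences; the function names are ours):

```python
def merge(A, B):
    """Merge two sorted lists into one sorted list in O(len(A)+len(B))."""
    S, i, j = [], 0, 0
    while i < len(A) and j < len(B):
        if A[i] < B[j]:
            S.append(A[i]); i += 1
        else:
            S.append(B[j]); j += 1
    S.extend(A[i:])        # one of the two tails is empty
    S.extend(B[j:])
    return S

def merge_sort(S):
    """Divide in half, sort each half recursively, then merge."""
    if len(S) <= 1:
        return list(S)
    mid = len(S) // 2
    return merge(merge_sort(S[:mid]), merge_sort(S[mid:]))
```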
Merge-Sort Tree

An execution of merge-sort is depicted by a binary tree:
Each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution and its partition, and the sorted sequence at the end of the execution.
The root is the initial call. The leaves are calls on subsequences of size 0 or 1.
7 2 9 4 → 2 4 7 9
7 2 → 2 7 9 4 → 4 9
7 → 7 2 → 2 9 → 9 4 → 4
Execution Example

The original slides animate one execution of merge-sort on the input sequence 7 2 9 4 3 8 6 1, frame by frame: partition; recursive call, partition (repeated down the tree); recursive call, base case; merge; recursive call, …, base case, merge; and further merges back up the tree. The completed merge-sort tree:
7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 9 4 → 2 4 7 9    3 8 6 1 → 1 3 6 8
7 2 → 2 7    9 4 → 4 9    3 8 → 3 8    6 1 → 1 6
7 → 7    2 → 2    9 → 9    4 → 4    3 → 3    8 → 8    6 → 6    1 → 1
Analysis of Merge-Sort

The height h of the merge-sort tree is O(log n): at each recursive call we divide the sequence in half. The overall amount of work done at the nodes of depth i is O(n): we partition and merge 2^i sequences of size n/2^i, and we make 2^(i+1) recursive calls. Thus, the total running time of merge-sort is O(n log n).
[Table: depth 0, 1 sequence of size n; depth 1, 2 sequences of size n/2; …; depth i, 2^i sequences of size n/2^i; …]
Summary of Sorting Algorithms

Algorithm       Time                 Notes
selection-sort  O(n^2)               in-place; slow (good for small inputs, < 1K)
insertion-sort  O(n^2)               in-place; slow (good for small inputs, < 1K)
heap-sort       O(n log n)           in-place; fast (good for large inputs, 1K–1M)
quick-sort      O(n log n) expected  in-place, randomized; fastest (good for large inputs, 1K–1M)
merge-sort      O(n log n)           sequential data access; fast (good for huge inputs, > 1M)
Sorting Lower Bound
Comparison-Based Sorting

Many sorting algorithms are comparison based: they sort by making comparisons between pairs of objects. Examples: bubble-sort, selection-sort, insertion-sort, heap-sort, merge-sort, quick-sort, …
Let us therefore derive a lower bound on the running time of any algorithm that uses comparisons to sort n elements, x1, x2, …, xn.
[Diagram: a single comparison node, "Is xi < xj?", with yes/no branches]
Counting Comparisons

Let us just count comparisons. Each possible run of the algorithm corresponds to a root-to-leaf path in a decision tree:
xi < xj ?
xa < xb ?
xm < xo ? xp < xq ?xe < xf ? xk < xl ?
xc < xd ?
Decision Tree Height

The height of this decision tree is a lower bound on the running time. Every possible input permutation must lead to a separate leaf output; if not, some input …4…5… would have the same output ordering as …5…4…, which would be wrong. Since there are n! = 1 · 2 · … · n leaves, the minimum height (and hence time) is at least log(n!).
[Diagram: the decision tree from above, annotated with n! leaves and height at least log(n!)]
The Lower Bound

Any comparison-based sorting algorithm takes at least log(n!) time. Therefore, any such algorithm takes time at least
log(n!) ≥ log((n/2)^(n/2)) = (n/2) log(n/2).
That is, any comparison-based sorting algorithm must run in Ω(n log n) time.
Bucket-Sort and Radix-Sort
[Diagram: bucket array B with indices 0 to 9; B[1] holds (1, c); B[3] holds (3, a), (3, b); B[7] holds (7, d), (7, g), (7, e); the other buckets are empty]
Bucket-Sort

Let S be a sequence of n (key, element) items with keys in the range [0, N − 1]. Bucket-sort uses the keys as indices into an auxiliary array B of sequences (buckets).
Phase 1: Empty sequence S by moving each item (k, o) into its bucket B[k].
Phase 2: For i = 0, …, N − 1, move the items of bucket B[i] to the end of sequence S.
Analysis:
Phase 1 takes O(n) time.
Phase 2 takes O(n + N) time.
Bucket-sort takes O(n + N) time.
Algorithm bucketSort(S, N)
  Input: sequence S of (key, element) items with keys in the range [0, N − 1]
  Output: sequence S sorted by increasing keys
  B ← array of N empty sequences
  while ¬S.isEmpty()
    f ← S.first()
    (k, o) ← S.remove(f)
    B[k].insertLast((k, o))
  for i ← 0 to N − 1
    while ¬B[i].isEmpty()
      f ← B[i].first()
      (k, o) ← B[i].remove(f)
      S.insertLast((k, o))
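The two phases can be sketched in Python with plain lists as buckets (the function name bucket_sort is ours):

```python
def bucket_sort(S, N):
    """Bucket-sort for (key, element) pairs with integer keys in
    [0, N-1]. Stable: items are appended to their bucket in input
    order and read back bucket by bucket."""
    B = [[] for _ in range(N)]
    for k, o in S:                 # phase 1: scatter, O(n)
        B[k].append((k, o))
    out = []
    for bucket in B:               # phase 2: gather, O(n + N)
        out.extend(bucket)
    return out
```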
Example

Key range [0, 9]. Input sequence: 7,d 1,c 3,a 7,g 3,b 7,e
Phase 1 distributes the items into buckets: B[1] = (1, c); B[3] = (3, a), (3, b); B[7] = (7, d), (7, g), (7, e); the other buckets are empty.
Phase 2 concatenates the buckets: 1,c 3,a 3,b 7,d 7,g 7,e
Properties and Extensions

Key-type property: the keys are used as indices into an array and cannot be arbitrary objects; there is no external comparator.
Stable sort property: the relative order of any two items with the same key is preserved after the execution of the algorithm.
Extensions:
Integer keys in the range [a, b]: put item (k, o) into bucket B[k − a].
String keys from a set D of possible strings, where D has constant size (e.g., names of the 50 U.S. states): sort D and compute the rank r(k) of each string k of D in the sorted sequence, then put item (k, o) into bucket B[r(k)].
Lexicographic Order

A d-tuple is a sequence of d keys (k1, k2, …, kd), where key ki is said to be the i-th dimension of the tuple. Example: the Cartesian coordinates of a point in space are a 3-tuple.
The lexicographic order of two d-tuples is recursively defined as follows:
(x1, x2, …, xd) < (y1, y2, …, yd) ⇔ x1 < y1 ∨ (x1 = y1 ∧ (x2, …, xd) < (y2, …, yd))
I.e., the tuples are compared by the first dimension, then by the second dimension, etc.
Lexicographic-Sort

Let Ci be the comparator that compares two tuples by their i-th dimension, and let stableSort(S, C) be a stable sorting algorithm that uses comparator C. Lexicographic-sort sorts a sequence of d-tuples in lexicographic order by executing algorithm stableSort d times, once per dimension. Lexicographic-sort runs in O(d T(n)) time, where T(n) is the running time of stableSort.

Algorithm lexicographicSort(S)
  Input: sequence S of d-tuples
  Output: sequence S sorted in lexicographic order
  for i ← d downto 1
    stableSort(S, Ci)
Example:
(7,4,6) (5,1,5) (2,4,6) (2,1,4) (3,2,4)
after stableSort on dimension 3: (2,1,4) (3,2,4) (5,1,5) (7,4,6) (2,4,6)
after stableSort on dimension 2: (2,1,4) (5,1,5) (3,2,4) (7,4,6) (2,4,6)
after stableSort on dimension 1: (2,1,4) (2,4,6) (3,2,4) (5,1,5) (7,4,6)
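Since Python's built-in sorted() is guaranteed stable, it can play the role of stableSort directly (the function name lexicographic_sort is ours):

```python
def lexicographic_sort(S, d):
    """Sort d-tuples lexicographically by running a stable sort once
    per dimension, from the last dimension down to the first."""
    S = list(S)
    for i in range(d - 1, -1, -1):
        S = sorted(S, key=lambda t: t[i])   # stable pass on dimension i
    return S
```

Stability is essential: each pass must preserve the order established by the later dimensions, which is exactly why an unstable sort cannot be substituted here.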
Radix-Sort

Radix-sort is a specialization of lexicographic-sort that uses bucket-sort as the stable sorting algorithm in each dimension. Radix-sort is applicable to tuples where the keys in each dimension i are integers in the range [0, N − 1]. Radix-sort runs in O(d(n + N)) time.
Algorithm radixSort(S, N)
  Input: sequence S of d-tuples such that (0, …, 0) ≤ (x1, …, xd) and (x1, …, xd) ≤ (N − 1, …, N − 1) for each tuple (x1, …, xd) in S
  Output: sequence S sorted in lexicographic order
  for i ← d downto 1
    bucketSort(S, N)
Radix-Sort for Binary Numbers

Consider a sequence of n b-bit integers x = x_(b−1) … x1 x0. We represent each element as a b-tuple of integers in the range [0, 1] and apply radix-sort with N = 2. This application of the radix-sort algorithm runs in O(bn) time. For example, we can sort a sequence of 32-bit integers in linear time.
Algorithm binaryRadixSort(S)
  Input: sequence S of b-bit integers
  Output: sequence S sorted
  replace each element x of S with the item (0, x)
  for i ← 0 to b − 1
    replace the key k of each item (k, x) of S with bit x_i of x
    bucketSort(S, 2)
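A minimal Python sketch: one stable two-bucket pass per bit, least significant bit first (the function name binary_radix_sort is ours):

```python
def binary_radix_sort(S, b):
    """Radix-sort a list of b-bit non-negative integers: one stable
    2-bucket bucket-sort pass per bit, from bit 0 up. Total O(bn)."""
    for i in range(b):
        buckets = [[], []]
        for x in S:
            buckets[(x >> i) & 1].append(x)   # stable pass on bit i
        S = buckets[0] + buckets[1]
    return S
```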
Example

Sorting a sequence of 4-bit integers, one bucket-sort pass per bit (least significant bit first):
1001   0010   1001   1001   0001
0010   1110   1101   0001   0010
1101   1001   0001   0010   1001
0001   1101   0010   1101   1101
1110   0001   1110   1110   1110
Priority Queue ADT

A priority queue stores a collection of items. An item is a pair (key, element). Main methods of the priority queue ADT:
insertItem(k, o): inserts an item with key k and element o.
removeMin(): removes the item with smallest key and returns its element.
Additional methods:
minKey(): returns, but does not remove, the smallest key of an item.
minElement(): returns, but does not remove, the element of an item with smallest key.
size(), isEmpty()
Applications: standby flyers, auctions, stock market.
Total Order Relation

Keys in a priority queue can be arbitrary objects on which an order is defined. Two distinct items in a priority queue can have the same key.
Mathematical concept of a total order relation ≤:
Reflexive property: x ≤ x
Antisymmetric property: x ≤ y ∧ y ≤ x ⇒ x = y
Transitive property: x ≤ y ∧ y ≤ z ⇒ x ≤ z
Comparator ADT

A comparator encapsulates the action of comparing two objects according to a given total order relation. A generic priority queue uses an auxiliary comparator. The comparator is external to the keys being compared. When the priority queue needs to compare two keys, it uses its comparator.
Methods of the comparator ADT, all with Boolean return type:
isLessThan(x, y)
isLessThanOrEqualTo(x, y)
isEqualTo(x, y)
isGreaterThan(x, y)
isGreaterThanOrEqualTo(x, y)
isComparable(x)