CmSc 250 Fundamentals of Computing III
Sorting Algorithms: QuickSort
Unlike mergesort, here we don't have a merge step: when the recursive calls finish, all the elements are already in the proper order.
STEP 1. Choosing the pivot
Choosing the pivot is an essential step: depending on the pivot, the algorithm may run very fast or in quadratic time.
Picking the first (or last) element is a bad choice: the pivot may turn out to be the smallest
or the largest element,
and then one of the partitions will be empty.
A better strategy is to take the median of the first, the middle, and the last element (median-of-three).
Example:
8, 3, 25, 6, 10, 17, 1, 2, 18, 5
The first element is 8, the middle is 10, and the last is 5.
The median of [8, 10, 5] is 8, so 8 becomes the pivot.
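Median-of-three selection can be sketched as follows (Java; the class and method names are illustrative, not from the notes):

```java
public class Median3Demo {
    // Return the median of a[left], a[center], a[right] (median-of-three pivot).
    static int median3(int[] a, int left, int right) {
        int center = (left + right) / 2;
        int x = a[left], y = a[center], z = a[right];
        if ((x <= y && y <= z) || (z <= y && y <= x)) return y;
        if ((y <= x && x <= z) || (z <= x && x <= y)) return x;
        return z;
    }

    public static void main(String[] args) {
        int[] a = {8, 3, 25, 6, 10, 17, 1, 2, 18, 5};
        // first = 8, middle = 10, last = 5; the median of [8, 10, 5] is 8
        System.out.println(median3(a, 0, a.length - 1));   // prints 8
    }
}
```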
STEP 2. Partitioning
Partitioning is illustrated with the above example.
1. The first action is to get the pivot out of the way: swap it with the last element.
5, 3, 25, 6, 10, 17, 1, 2, 18, 8
2. We want larger elements to go to the right and smaller elements to go to the left.
Two "fingers" are used to scan the elements from left to right and from right to left:

[5, 3, 25, 6, 10, 17, 1, 2, 18, 8]
 ^                           ^
 i                           j
In the example the first swapping will be between 25 and 2, the second between 10 and 1.
3. Restore the pivot.
After restoring the pivot we obtain the following partitioning into three groups:
[5, 3, 2, 6, 1] [ 8 ] [10, 25, 18, 17]
STEP 3. Recursively quicksort the left and the right parts
Here is the code that implements the partitioning: left points to the first element
of the subarray currently being processed, and right
points to the last element.
if (left + 10 <= right) {
    int i = left, j = right - 1;
    for ( ; ; ) {
        while (a[++i] < pivot) {}      // move the left finger
        while (pivot < a[--j]) {}      // move the right finger
        if (i < j)
            swap(a[i], a[j]);          // swap
        else
            break;                     // break if the fingers have crossed
    }
    swap(a[i], a[right - 1]);          // restore the pivot
    quicksort(a, left, i - 1);         // call quicksort for the left part
    quicksort(a, i + 1, right);        // call quicksort for the right part
}
else
    insertionsort(a, left, right);
If the subarray has fewer than 10 elements, quicksort is not very
efficient.
Instead, insertion sort is used at this last phase
of sorting.
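Putting the pieces together, here is a complete, runnable sketch of the quicksort from the notes (Java). The median3 helper here follows the common textbook variant that orders the three candidates and hides the pivot at position right - 1, which is why the partitioning code restores it with swap(a[i], a[right - 1]); the cutoff constant and the insertion-sort fallback are filled in where the notes are silent:

```java
import java.util.Arrays;

public class QuickSortDemo {
    private static final int CUTOFF = 10;    // small subarrays go to insertion sort

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    // Order a[left], a[center], a[right], then hide the pivot at a[right - 1].
    static int median3(int[] a, int left, int right) {
        int center = (left + right) / 2;
        if (a[center] < a[left])  swap(a, left, center);
        if (a[right] < a[left])   swap(a, left, right);
        if (a[right] < a[center]) swap(a, center, right);
        swap(a, center, right - 1);          // get the pivot out of the way
        return a[right - 1];
    }

    static void quicksort(int[] a, int left, int right) {
        if (left + CUTOFF <= right) {
            int pivot = median3(a, left, right);
            int i = left, j = right - 1;
            for ( ; ; ) {
                while (a[++i] < pivot) {}    // move the left finger
                while (pivot < a[--j]) {}    // move the right finger
                if (i < j) swap(a, i, j);
                else break;                  // the fingers have crossed
            }
            swap(a, i, right - 1);           // restore the pivot
            quicksort(a, left, i - 1);       // sort the left part
            quicksort(a, i + 1, right);      // sort the right part
        } else {
            insertionSort(a, left, right);   // last phase: insertion sort
        }
    }

    static void insertionSort(int[] a, int left, int right) {
        for (int p = left + 1; p <= right; p++) {
            int tmp = a[p];
            int k = p;
            while (k > left && tmp < a[k - 1]) {
                a[k] = a[k - 1];
                k--;
            }
            a[k] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = {8, 3, 25, 6, 10, 17, 1, 2, 18, 5, 7, 21, 4, 9};
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));
        // prints [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 17, 18, 21, 25]
    }
}
```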
Compare the two versions:
A.
    while (a[++i] < pivot) {}
    while (pivot < a[--j]) {}
    if (i < j) swap(a[i], a[j]);
    else break;

B.
    while (a[i] < pivot) {i++;}
    while (pivot < a[j]) {j--;}
    if (i < j) swap(a[i], a[j]);
    else break;
If we have an array of equal elements, version B will
never increment i or decrement j,
and will
do infinite swaps: i and j will never cross.
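Version A is safe on equal elements: each comparison a[++i] < pivot fails immediately, so both fingers still advance one step per swap and eventually cross. A small runnable sketch (Java; the all-7s array and the swap counter are illustrative):

```java
public class EqualElementsDemo {
    public static void main(String[] args) {
        int[] a = {7, 7, 7, 7, 7, 7, 7, 7};
        int pivot = 7;
        int i = -1, j = a.length;      // version A pre-increments/pre-decrements
        int swaps = 0;
        for ( ; ; ) {
            while (a[++i] < pivot) {}  // stops at once: a[i] == pivot
            while (pivot < a[--j]) {}  // stops at once: a[j] == pivot
            if (i < j) { int t = a[i]; a[i] = a[j]; a[j] = t; swaps++; }
            else break;                // the fingers have crossed
        }
        System.out.println("i = " + i + ", j = " + j + ", swaps = " + swaps);
        // prints i = 4, j = 3, swaps = 4  -- the loop terminates
    }
}
```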
Worst-case: O(N^{2}). This happens when the pivot repeatedly turns out to be the smallest or the largest element, so one of the partitions is always empty.
Best-case: O(NlogN). The best case is when the pivot is
the median of the array;
then the left and the
right parts have the same size.
There are logN levels of partitioning, and at each level
we do N comparisons
(and not more than N/2 swaps).
Hence the complexity is O(NlogN).
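The best-case count can also be written as a recurrence (a standard derivation sketch, not spelled out above; c is the constant per-element partitioning cost):

```latex
T(N) = 2\,T(N/2) + cN, \qquad T(1) = c
```

Each of the \(\log_2 N\) levels of the recursion contributes \(cN\) work, so \(T(N) = cN\log_2 N + cN = O(N\log N)\).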
Average-case: O(NlogN)
Advantages: Generally runs fast in practice, and sorts in place, requiring no additional memory.
Disadvantages: The worst-case complexity is O(N^{2})
Applications:
Commercial applications use quicksort: it generally runs fast and
needs no additional memory,
and this compensates for the
rare occasions when it runs in O(N^{2}) time.
Never use quicksort in applications that require
guaranteed response time; there the occasional O(N^{2}) run is unacceptable.
Comparison with mergesort: Mergesort guarantees O(NlogN) in the worst case but needs additional memory for merging; quicksort sorts in place and is faster on average, at the cost of the O(N^{2}) worst case.
So far, our best sorting algorithms have O(NlogN) performance: can we do any better?
In general, for comparison-based sorting, the answer is no.
Exam-like questions
- Briefly describe the basic idea of quicksort.
- What is the complexity of quicksort?
- Compare quicksort with mergesort.
- What are the advantages and disadvantages of quicksort?
- Which applications are not suitable for quicksort and why?