Quim’s wild sorting algorithm
Exercise 2.10
Let T[i\dots j] be a table with n = j - i + 1 elements. Consider the following sorting algorithm known as Quim’s wild sorting algorithm:
If n \leq 2, sort the table directly.
If n \geq 3, “divide” the table into three intervals T[i\dots k - 1], T[k\dots \ell], and T[\ell + 1\dots j], where k = i + \lfloor n/3\rfloor and \ell = j - \lfloor n/3\rfloor.
The algorithm recursively calls itself to first sort T[i\dots \ell], then sort T[k\dots j], and finally sort T[i\dots \ell] again.
Prove that this algorithm is correct. Give a recurrence for its running time and solve it.
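For concreteness, here is a minimal Python sketch of the procedure exactly as described above, which you may find handy for experimenting before class. The function name wild_sort, the 0-based indexing, and the example at the end are our own additions, not part of the exercise.

    def wild_sort(T, i, j):
        # Sort T[i..j] (inclusive, 0-based) in place, following the description above.
        n = j - i + 1
        if n <= 2:
            # Base case: at most two elements, sort directly.
            if n == 2 and T[i] > T[j]:
                T[i], T[j] = T[j], T[i]
            return
        k = i + n // 3        # first position of the middle interval
        l = j - n // 3        # last position of the middle interval
        wild_sort(T, i, l)    # sort T[i..l]
        wild_sort(T, k, j)    # sort T[k..j]
        wild_sort(T, i, l)    # sort T[i..l] again

    # Example (made-up input): sort the whole table by taking i = 0 and j = n - 1.
    data = [5, 2, 9, 1, 7, 3]
    wild_sort(data, 0, len(data) - 1)
    print(data)   # prints [1, 2, 3, 5, 7, 9]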
Writing the recurrence for the running time should be the easier part, and using the Master Theorem it should not be difficult to solve. How good is the running time? Better or worse than, say, Insertion Sort?
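As a reminder (this is the standard textbook statement of the theorem, not the answer to the exercise), the Master Theorem applies to recurrences of the form T(n) = a\,T(n/b) + f(n) with constants a \geq 1 and b > 1, and says:
Case 1: if f(n) = O(n^{\log_b a - \epsilon}) for some constant \epsilon > 0, then T(n) = \Theta(n^{\log_b a}).
Case 2: if f(n) = \Theta(n^{\log_b a}), then T(n) = \Theta(n^{\log_b a} \log n).
Case 3: if f(n) = \Omega(n^{\log_b a + \epsilon}) for some constant \epsilon > 0, and a\,f(n/b) \leq c\,f(n) for some constant c < 1 and all sufficiently large n, then T(n) = \Theta(f(n)).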
Proving that the algorithm is correct might be trickier, especially writing it down as a formal argument. Don’t worry if you get stuck, but try to write down your argument anyway before class. If you find the indices i and j confusing, think of i = 0 and j = n - 1.
Would the algorithm be correct if, instead of sorting T[k\dots j], it sorted T[\ell\dots j]?