Slow Sort

SlowSort is an example of MultiplyAndSurrender - a deliberately pessimal sorting algorithm:

We can decompose the problem of sorting n numbers in ascending order into

(1) finding the maximum of those numbers, and
(2) sorting the remaining ones.
Subproblem (1) can be further decomposed into
(1.1) find the maximum of the first n/2 elements,
(1.2) find the maximum of the remaining n/2 elements, and
(1.3) find the largest of those two maxima.
Finally, subproblems (1.1) and (1.2) can be solved by sorting the specified elements and taking the last element in the result. We have thus multiplied the original problem into three slightly simpler ones (sort the first half, sort the second half, sort all elements but one), plus some overhead processing. We continue doing this recursively until the lists have at most one element each, at which point we are forced to surrender.

From "Pessimal algorithms and the simplexity of computations. A. Broder and J. Stolfi. ACM SIGACT News vol. 16 no. 7, 49--53. Fall 1984 [satirical article]. [PS,6p,54kB] [bib]" http://www.dcc.unicamp.br/~stolfi/EXPORT/papers/by-tag/bro-sto-84-pes.ps.gz or http://citeseer.nj.nec.com/334813.html

In PythonLanguage:

 def slowsort(A, i, j):
     # This procedure sorts the subarray A[i]...A[j] in place
     if i >= j:
         return
     m = (i + j) // 2            # midpoint (integer division)
     slowsort(A, i, m)           # sort the first half
     slowsort(A, m + 1, j)       # sort the second half
     if A[m] > A[j]:             # the two halves' maxima are now at A[m] and A[j]
         A[m], A[j] = A[j], A[m]
     slowsort(A, i, j - 1)       # sort everything but the maximum
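For example, a quick (hypothetical) test of the above:

 data = [3, 1, 4, 1, 5, 9, 2, 6]
 slowsort(data, 0, len(data) - 1)
 print(data)   # [1, 1, 2, 3, 4, 5, 6, 9]
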
Could someone render this in InterCal as well?

Not I, but here's an obfuscated PerlLanguage version, for the heck of it:

 sub slowsort {
   $#_ > 0 ?
   map {shift @$_, slowsort(@$_)} [
     map {@$_}
     sort {$a->[0] cmp $b->[0]}
     map {[slowsort(@_[@$_])]}
     [0..@_/2-1], [@_/2..$#_] ]
   : @_
 }
(Ok, so I was nice enough to pretty-print it, but this is a one-liner - notice no semi-colon.) I have tested this. It does not sort in-place like the Python version above (if I understand it correctly, and I don't know Python) but instead returns the sorted version of the input list of arguments. It finds the minimum element rather than the maximum, but still sorts in ascending order (by putting the minimum before the N-1 sized recursive result). The sort {} () is used to swap the two half-lists returned by the recursion, if necessary (so that the one starting off with the minimum element comes first). This serves rather nicely to underscore the perversity of it all, IMHO :) Anyway, PerlLanguage gurus should be able to figure out the rest. -- KarlKnechtel

Here's an implementation in Erlang that closely follows the description of the SlowSort algorithm and reimplements the max function (Erlang already has one, but using it wouldn't be half as fun).

 -module(slowsort).
 -export([sort/1]).

 sort([]) -> [];
 sort(Xs) -> M = max(Xs), sort(lists:delete(M, Xs)) ++ [M].

 ceil(X) ->
     T = erlang:trunc(X),
     case (X - T) of Neg when Neg < 0 -> T; Pos when Pos > 0 -> T + 1; _ -> T end.

 max([A]) -> A;
 max(Xs) ->
     M1 = lists:last(sort(lists:sublist(Xs, 1, ceil(length(Xs)/2)))),
     M2 = lists:last(sort(lists:sublist(Xs, ceil(length(Xs)/2)+1, length(Xs)))),
     if M1 > M2 -> M1; true -> M2 end.


What about BogoSort? The BogoSort algorithm is:

  1. Randomly permute the list.
  2. Check to see if the list is sorted.
  3. If so, exit. Otherwise, go to step 1.

It takes O(n!) time in the average case, and unbounded time in the worst case. SlowSort is only O(n ^ (lg n)). Of course, SlowSort is O(n ^ (lg n)) in the best case as well, whereas BogoSort might well succeed on the first pass. SlowSort also has the advantage of being deterministic.

-- ScottJohnson
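
What follows is a minimal PythonLanguage sketch of BogoSort as described above (the names bogosort and is_sorted are just illustrative):

 import random

 def is_sorted(a):
     # true when every adjacent pair is in non-decreasing order
     return all(a[k] <= a[k + 1] for k in range(len(a) - 1))

 def bogosort(a):
     # 1. randomly permute, 2. check, 3. if not sorted, go back to 1
     while True:
         random.shuffle(a)
         if is_sorted(a):
             return a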

What you call BogoSort, I call ShuffleSort. The thing about BogoSort or ShuffleSort is that it violates some of the rules of Simplexity analysis. One of the rules is that you actually have to progress towards the goal; you can't just obviously waste time, for example by inserting delay loops. The SlowSort algorithm never makes a wrong move: once it swaps two elements, they are in the correct order relative to each other and their order will never be reversed. This is what makes SlowSort so ingenious.

-- Dale King

This idea that there has to be progress, but minimal progress, reminds me of the Busy Beaver problem, where, as I understand it, the goal is to find the length of the longest-running, yet eventually halting, program for a given Turing machine.

-- Chiao

...for a given Turing machine of a given size. Actually, this is conceived as a function C(x): how many steps does the longest-running but still terminating Turing machine of size at most x take, on an input consisting of only 1s? This function grows mind-bogglingly fast (and not many values are known; when I worked on this topic 20 years ago, the highest known x was 4), and it is not computable by a Turing machine (there is a proof by contradiction using just this function C). -- GunnarZarncke

My favorite thing about BogoSort is that it checks order on each iteration, yet it is still O(n!) for sorted inputs.

By randomly permuting before checking, BogoSort offers the same (nondeterministic) performance on all inputs. A valuable property. Of course, there's also this variant, which makes "progress" (of a sort) on each step:

  1. Pick two elements at random (not the same ones, please...).
  2. If they are out of order, swap them.
  3. If the list is sorted, exit. Otherwise, go to step 1.

Think of it as a rather slow BubbleSort. :)
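
A minimal PythonLanguage sketch of that variant (the name random_pair_sort is just illustrative):

 import random

 def random_pair_sort(a):
     # pick two distinct random positions, swap them if they are out of
     # order, and stop once the whole list is in order
     while any(a[k] > a[k + 1] for k in range(len(a) - 1)):
         i, j = sorted(random.sample(range(len(a)), 2))
         if a[i] > a[j]:
             a[i], a[j] = a[j], a[i]
     return a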

The worst-case run time of BogoSort and this variant is infinite. Since some people define an algorithm as something that (among other things) is *guaranteed* to eventually finish ("halt"), those people would say that BogoSort (and this variant) is "not an algorithm".

But notice the parenthesised comment on the first step: "not the same ones, please...". If we record which pairs of elements have been tested, and make sure we never test them again, does that give a proper algorithm?

If we record which pairs of elements have been tested and make sure we never test them again, we can't do worse than O(n ^ 2) in the worst case, because that's how many pairs there are. If we record which positions in the array have been tested and make sure we never test them again, the algorithm will fail to finish on most inputs. So no. -- Bart Massey

That's an opportunity in disguise. Keep a list of which positions have been tested. Every time you swap a pair of elements, go through the list and update any occurrences of those positions. The entire sort is at least O(n ^ 4) average case.


SortingAlgorithms


CategoryAlgorithm

