Shell Sort vs. Heap Sort vs. Merge Sort vs. …
From time to time people ask the ageless question: which sorting algorithm is the fastest? This question has no easy or unambiguous answer, however. The speed of sorting can depend quite heavily on the environment where the sorting is done, the type of items being sorted, and the distribution of those items.
For example, sorting a database so big that it cannot fit into memory all at once is quite different from sorting an array of 100 integers. Not only will the implementation of the algorithm naturally be quite different, but the same algorithm that is fast in one case may be slow in the other. Sorting an array may also differ from sorting a linked list, for example.
…
Results are available for:
1. Integers
2. Integers, high amount of repetition
3. Strings: slow comparison, fast copying
4. Arrays: fast comparison, slow copying
http://warp.povusers.org/SortComparison/
Analysis of different sorting algorithms
http://linux.wku.edu/~lamonml/algor/sort/sort.html
Sorting Algorithms explained with a few paragraphs (very good for a quick introduction)
http://betterexplained.com/articles/sorting-algorithms/
Sorting Algorithms (Big O Comparison)
http://en.wikipedia.org/wiki/Sorting_algorithm
Sorting and Searching Algorithms (good description of Big O)
http://epaperpress.com/sortsearch/download/sortsearch.pdf
We can place an upper bound on the execution time of algorithms using O (big-oh) notation. An
algorithm that runs in O(n^2) time indicates that execution time increases with the square of the
dataset size. For example, if we increase dataset size by a factor of ten, execution time will
increase by a factor of 100. A more precise explanation of big-oh follows.
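To make the quadratic growth concrete, here is a small sketch (not from the referenced texts) that counts the basic comparisons performed by a simple O(n^2) selection sort. Growing the input tenfold multiplies the comparison count by roughly 100, matching the big-oh prediction.

```python
import random

def selection_sort(items):
    """Simple O(n^2) selection sort; returns the number of comparisons made."""
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        smallest = i
        for j in range(i + 1, len(a)):
            comparisons += 1
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return comparisons

# Selection sort always performs n(n-1)/2 comparisons, so a tenfold
# larger input costs ~100x more work: 4,950 vs. 499,500 comparisons.
small = selection_sort(random.sample(range(10000), 100))
large = selection_sort(random.sample(range(10000), 1000))
print(small, large, large / small)
```

Selection sort is used here only because its comparison count is easy to reason about; any quadratic algorithm shows the same scaling.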
…
Big-oh notation does not describe the exact time that an algorithm takes, but only indicates an
asymptotic upper bound on execution time within a constant factor. If an algorithm takes O(n^2)
time, then execution time grows no worse than the square of the size of the list.
             n    lg n      n^(7/6)         n lg n                  n^2
             1       0            1              0                    1
            16       4           25             64                  256
           256       8          645          2,048               65,536
         4,096      12       16,384         49,152           16,777,216
        65,536      16      416,128      1,048,576        4,294,967,296
     1,048,576      20   10,568,983     20,971,520    1,099,511,627,776
    16,777,216      24  268,435,456    402,653,184  281,474,976,710,656

Table 1-1: Growth Rates
Table 1-1 illustrates growth rates for various functions. A growth rate of O(lg n) occurs for
algorithms similar to the binary search. The lg (logarithm, base 2) function increases by one when
n is doubled. Recall that we can search twice as many items with one more comparison in the
binary search. Thus the binary search is an O(lg n) algorithm.
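The doubling argument can be checked directly. The following sketch (an illustration, not from the referenced texts) runs a binary search that counts its comparisons; doubling the list size adds at most one comparison to the worst case.

```python
def binary_search(items, target):
    """Search a sorted list; return (index or -1, comparisons used)."""
    lo, hi = 0, len(items) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

# Searching for an absent item exercises the worst case. Each doubling
# of n costs exactly one more comparison: 10, 11, 12 for these sizes.
for n in (1024, 2048, 4096):
    _, used = binary_search(list(range(n)), -1)
    print(n, used)
```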
If the values in Table 1-1 represented microseconds, then an O(n^(7/6)) algorithm would take
about 10 seconds to process 1,048,576 items, an O(lg n) algorithm 20 microseconds, and an O(n^2)
algorithm up to 12 days! In the following chapters a timing estimate for each algorithm, using big-O
notation, will be included. For a more formal derivation of these formulas you may wish to consult
the references.
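The arithmetic behind those claims is easy to reproduce. This sketch (my own illustration, using the same column functions as Table 1-1) treats each entry for n = 1,048,576 as a running time in microseconds and converts it to human-scale units.

```python
# Treat the Table 1-1 entries for n = 2^20 as running times in
# microseconds and convert them to readable units.
n = 2 ** 20

lg_n  = 20                    # O(lg n): 20 microseconds
n_7_6 = round(n ** (7 / 6))   # O(n^(7/6)): ~10.6 million microseconds
n_sq  = n ** 2                # O(n^2): ~1.1 trillion microseconds

print(f"O(lg n):    {lg_n} microseconds")
print(f"O(n^(7/6)): {n_7_6 / 1e6:.1f} seconds")          # about 10.6 s
print(f"O(n^2):     {n_sq / 1e6 / 86400:.1f} days")      # about 12.7 days
```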
Visual animation of various sorting algorithms
http://www.cs.ubc.ca/~harrison/Java/sorting-demo.html?donttaseme