Aaron Christopher Finney <af-list(a)wfi-inc.com> wrote:
One of my early CS classes had us graph results of sorts and plot
the order of magnitude... quicksort and mergesort were fairly even,
depending on the "randomness" of the data, and were very close to
n*log n on the graph. There were a couple of other variants of
recursive sorts as well, but I've never really used anything but the
basic quicksort in a real live application...
Hi
I always wonder why they don't teach kn+c type sorts.
QuickSort and MergeSort are good for things in the
50 to 500 item range, but even for large data
sets, a kn+c sort will win over a k(n*log n) one, even if
the k and the c are somewhat large. The two sorts that
I've always liked for large data sets are either
a radixed basket sort or a radixed distribution sort.
The radixed basket sort is one of the oldest ( ever
watch a card sorter? ). With the larger memories of
machines today, even large data sets can be
sorted faster than QuickSort could do them. This is because,
for large n, n*n > n*log n > n.
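As a sketch of what the card-sorter style basket sort looks like in
code ( my own illustration, not Dwight's program ): each pass drops the
keys into one basket per digit, least significant digit first, then
gathers the baskets back in order. The function name and base-10 digits
are my own choices here.

```python
def basket_sort(items, base=10):
    """LSD radix ("basket") sort for non-negative integers.

    Each pass distributes the items into `base` baskets keyed on one
    digit, least significant first, then gathers them back in order.
    With k digit passes the work is of the form k*n + c.
    """
    if not items:
        return items
    # Number of digit passes needed to cover the largest key.
    passes = 0
    largest = max(items)
    while base ** passes <= largest:
        passes += 1
    for p in range(passes):
        baskets = [[] for _ in range(base)]
        divisor = base ** p
        for x in items:
            baskets[(x // divisor) % base].append(x)
        # Gather: concatenating baskets 0..base-1 preserves the order
        # established by the earlier, less significant passes.
        items = [x for b in baskets for x in b]
    return items
```

Because each pass only appends and concatenates, the sort is stable,
which is exactly what makes the least-significant-digit-first ordering
come out right, just as it did on the card sorters.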
Both the basket and the distribution sort are of the form kn+c. The
c can be large, so small numbers of items are not practical. In the
code I wrote, the c ( the fixed overhead ) was about 1 millisecond,
so you can see that for something in the 100 integer range,
QuickSort would most likely have been faster.
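For comparison, here is one common reading of the radixed distribution
sort ( again my own sketch, assuming non-negative integer keys ):
instead of physical baskets, each pass counts how many keys carry each
digit value, turns the counts into starting offsets, and places every
item directly at its final position for that pass.

```python
def distribution_sort(items, base=10):
    """LSD radix sort via counting ("distribution") passes.

    Each pass does a fixed amount of per-item work plus a fixed
    per-pass setup, so the whole sort is of the form k*n + c.
    """
    divisor = 1
    while items and divisor <= max(items):
        # Count how many keys have each digit value at this position.
        counts = [0] * base
        for x in items:
            counts[(x // divisor) % base] += 1
        # Prefix-sum the counts into starting offsets for each digit.
        offsets = [0] * base
        for d in range(1, base):
            offsets[d] = offsets[d - 1] + counts[d - 1]
        # Place each item at its slot; equal digits keep their relative
        # order, so the pass is stable.
        out = [0] * len(items)
        for x in items:
            d = (x // divisor) % base
            out[offsets[d]] = x
            offsets[d] += 1
        items = out
        divisor *= base
    return items
```

The per-pass setup of the count and offset tables is part of the fixed
overhead c mentioned above, which is why a handful of integers is
better left to QuickSort.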
Dwight