Algorithm Cheat Sheet: Heapsort

Recall that heaps were introduced as a data structure commonly stored in arrays. One important operation on arrays is sorting, and heaps can also be leveraged for that purpose. We know that a heap keeps the element with the maximum (or minimum) value at the top. Repeatedly removing the top element and re-heapifying the rest would yield the second largest value, then the third, and so on. This is the basic idea of heapsort.

Heapsort

To sort the elements of an array in ascending order of their values, we can rely on a max heap. Heapsort can be done in-place over the original array.

  1. Given an array \(a\) with a size of \(n\), we first heapify it into a max heap.
  2. With the heap, the element with the largest value is positioned in \(a[1]\). We then swap \(a[1]\) and \(a[n]\).
  3. Now the element with the largest value is positioned in \(a[n]\), its correct position in an ascending sort. We then apply a top-down heapify to \(a[1]\) over the range \(a[1]\) to \(a[n-1]\), which restores a heap for that range.
  4. After Steps 2 and 3, the array is separated into two partitions: the former is unsorted and still forms a heap, while the latter is sorted. We repeat Steps 2 and 3 on the unsorted partition until the whole array becomes sorted.
// a[1] to a[n] is the array storing a heap with a maximum size of n. 
// count is the current size of the heap.
void HeapSort(int[] a, int count)
{
    HeapCreate(a, count);
    for (int i = count; i > 1; --i)
    {
        swap(a[1], a[i]);
        TopDownHeapifyElement(a, 1, i-1);
    }
}
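The pseudocode above leaves `HeapCreate` and `TopDownHeapifyElement` unspecified. The sketch below (in Python, with a dummy slot at index 0 to keep the 1-indexing used throughout) shows one straightforward way these helpers could be implemented; the function names mirror the pseudocode but the bodies are assumptions, not the author's definitions.

```python
# A runnable sketch of the heapsort pseudocode above, assuming
# straightforward implementations of HeapCreate and TopDownHeapifyElement.
# a[0] is a dummy slot so the heap occupies a[1..n], as in the text.

def top_down_heapify_element(a, i, size):
    """Sift a[i] down until the max-heap property holds for a[1..size]."""
    while 2 * i <= size:
        child = 2 * i                        # left child
        if child + 1 <= size and a[child + 1] > a[child]:
            child += 1                       # pick the larger child
        if a[i] >= a[child]:
            break                            # heap property restored
        a[i], a[child] = a[child], a[i]
        i = child

def heap_create(a, count):
    """Bottom-up O(n) heap construction: heapify every internal node."""
    for i in range(count // 2, 0, -1):
        top_down_heapify_element(a, i, count)

def heap_sort(a, count):
    heap_create(a, count)
    for i in range(count, 1, -1):
        a[1], a[i] = a[i], a[1]              # move the max to the end
        top_down_heapify_element(a, 1, i - 1)

a = [None, 5, 3, 8, 1, 9, 2]                 # a[0] unused
heap_sort(a, 6)
print(a[1:])                                 # [1, 2, 3, 5, 8, 9]
```

Note that `heap_create` starts from the last internal node (`count // 2`) and works back to the root, which is what makes heap creation linear rather than \(O(n\log n)\).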

Heapsort is not stable. The swap of the first and last elements (Step 2 above) may break the original order of equal elements. For example, given a heap created out of an array (via Step 1 above), suppose \(a[1]\) and \(a[2]\) hold identical values. Step 2 then swaps \(a[1]\) and \(a[n]\), breaking the relative order of the original \(a[1]\) and \(a[2]\). This inversion persists, since the original element of \(a[1]\) stays at the end of the array throughout the rest of the heapsort.
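The instability argument can be checked concretely. In the sketch below (an illustration, not taken from the text), each key is paired with a tag recording its input order, and elements compare by key alone; a stable sort would keep equal keys in their original left-to-right order.

```python
# A small check that heapsort is not stable. Items are (key, tag) pairs
# ordered by key only; the tags record the original input order.
# a[0] is a dummy slot so the heap occupies a[1..count].

def heap_sort_pairs(a, count):
    """In-place heapsort over a[1..count], comparing by key (item[0]) only."""
    def sift_down(i, size):
        while 2 * i <= size:
            c = 2 * i
            if c + 1 <= size and a[c + 1][0] > a[c][0]:
                c += 1
            if a[i][0] >= a[c][0]:
                break
            a[i], a[c] = a[c], a[i]
            i = c
    for i in range(count // 2, 0, -1):       # heap creation
        sift_down(i, count)
    for i in range(count, 1, -1):            # repeated extraction
        a[1], a[i] = a[i], a[1]
        sift_down(1, i - 1)

# Two equal keys 5, tagged 'first' and 'second' by input order.
a = [None, (5, 'first'), (5, 'second'), (1, 'x'), (2, 'y')]
heap_sort_pairs(a, 4)
print(a[1:])   # [(1, 'x'), (2, 'y'), (5, 'second'), (5, 'first')]
```

The two equal keys come out in reversed order: exactly the effect described above, where the swap in Step 2 sends the original \(a[1]\) past its equal counterpart.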

The cost of heapsort mainly comes from two operations: heap creation and top-down heapify. Heap creation can be done in linear time. A top-down heapify starts from the root, and its run time depends on the height at which the heapify ends. For example, in a heap with \(n\) nodes, the root sits at a height of \(\log n\) while the leaves sit at a height of zero, and the number of nodes at each level doubles as the height decreases by one. Summing the cost of every top-down heapify with these weights gives an \(O(n\log n)\) bound, calculated below. The time complexity of heapsort is thus \(O(n\log n)\).

$$ \begin{aligned} T & = ~~~2^{\log n}(\log n) + 2^{\log n - 1}(\log n - 1) + 2^{\log n - 2}(\log n - 2) + \cdots + 2^1(1) \\ & \\ {T\over2} & = ~~~~~~~~~~~~~~~~~~~~~~~~~~~ 2^{\log n - 1}(\log n) ~~~~~~~ + 2^{\log n - 2}(\log n - 1) + \cdots + 2^1(2) + 2^0 \\ & \\ -{T\over2} & = {T\over2} - T \\ & \\ & = - 2^{\log n}(\log n) + 2^{\log n - 1} ~~~~~~~~~~~~~~~~~~ + 2^{\log n - 2} ~~~~~~~~~~~~~~~~~~ + \cdots + 2^1 ~~~~~ + 2^0 \\ & = - 2^{\log n}(\log n) + {1 (1 - 2^{\log n}) \over (1-2)} \\ & = - n(\log n - 1) - 1 \\ & \\ T & = ~2n(\log n - 1) + 2 \end{aligned} $$
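The closed form derived above can be sanity-checked numerically (a quick verification sketch; the variable names are arbitrary): for \(n\) a power of two, the direct sum \(\sum_{k=1}^{\log n} 2^k k\) should equal \(2n(\log n - 1) + 2\).

```python
# Numeric check of the derivation above: compare the direct sum
# 2^1*1 + 2^2*2 + ... + 2^{log n}*(log n) against the closed form
# 2n(log n - 1) + 2, for n = 2, 4, ..., 1024.

for log_n in range(1, 11):
    n = 2 ** log_n
    direct = sum((2 ** k) * k for k in range(1, log_n + 1))
    closed = 2 * n * (log_n - 1) + 2
    assert direct == closed, (n, direct, closed)
print("closed form matches for n = 2 .. 1024")
```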

Heapsort vs. quicksort

Both heapsort and quicksort share the same high-level idea of partitioning an array into the unsorted and sorted parts. While the run time of quicksort depends on the pivot selection, ranging from \(O(n\log n)\) to \(O(n^2)\), heapsort can be done consistently in \(O(n\log n)\) time.

On the other hand, quicksort is likely to perform better than heapsort in practice. During sorting, quicksort always performs sequential scans within certain ranges, whereas a top-down heapify in heapsort traverses a path from the root down to some other node, which translates into a scattered access pattern over the array. The performance of a sorting algorithm also depends on the initial state of the unsorted array. The first step of heapsort heapifies the whole array, which is likely to disturb that initial state before sorting begins; on a favorably ordered input, this can increase the chance, and thus the volume, of element swapping.
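The scattered-access claim is easy to visualize (a toy illustration; the leftmost-descent path is just one example): a top-down heapify touches one index per level, and those indices roughly double at each step, landing farther and farther apart in the array.

```python
# Indices touched by a top-down heapify that always descends to the
# left child, in a heap of 1024 elements stored in a[1..1024]. Each
# step doubles the index, so consecutive accesses are far apart in
# memory, unlike quicksort's sequential partition scans.

n = 1024
path = []
i = 1
while 2 * i <= n:
    i = 2 * i
    path.append(i)
print(path)   # [2, 4, 8, 16, 32, 64, 128, 256, 512, 1024]
```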
