
This is the final article in a series about heapsort. In the previous articles we looked at a wide variety of heap structures that show excellent speed. The question arises: which heap is the most efficient when it comes to sorting? The answer: the one we will look at today.
The unusual heaps we examined earlier are fine, of course, but the most efficient heap is the standard one, with an improved sift.

What sifting is, why a heap needs it, and how it works is described in the very first part of this series of articles.

The standard sift in classic heapsort works head-on: the element at the root of the subtree goes into a buffer, and elements from the branch move up according to the results of comparisons. Everything is quite simple, but it produces too many comparisons.
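For contrast, here is a minimal sketch (my own, not the article's code) of that classic top-down sift: at every level the two children are compared with each other, and then the larger child is compared with the parent, so roughly two comparisons per level.

```python
# Classic top-down sift for a max-heap (illustrative sketch).
# At each level: one comparison to pick the larger child,
# plus one comparison of that child against the parent.
def sift_down(data, start, end):
    root = start
    while root * 2 + 1 <= end:
        child = root * 2 + 1                      # left child
        if child + 1 <= end and data[child + 1] > data[child]:
            child += 1                            # right child is larger
        if data[child] > data[root]:
            data[root], data[child] = data[child], data[root]
            root = child                          # continue down the branch
        else:
            break                                 # heap property restored

data = [1, 5, 3]
sift_down(data, 0, len(data) - 1)
print(data)                                       # -> [5, 1, 3]
```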


The bottom-up sift saves comparisons because parents are almost never compared with their children; mostly, only the children are compared with each other. In regular heapsort the parent is compared with the children and the children are compared with each other, so there are almost one and a half times more comparisons for the same number of exchanges.

So how does it work? Let's look at a concrete example. Suppose we have an array in which the heap is almost formed: all that remains is to sift the root. For every other node the heap condition holds, that is, no descendant is larger than its parent.
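This "heap everywhere except the root" situation can be checked mechanically. Here is a small helper of my own (the name and the example values are not from the article):

```python
# Check that every node except the root satisfies the heap condition:
# no child is larger than its parent.
def is_heap_below_root(data):
    n = len(data)
    for parent in range(1, n):                 # skip the root at index 0
        for child in (2 * parent + 1, 2 * parent + 2):
            if child < n and data[child] > data[parent]:
                return False
    return True

# The root value 0 violates the heap property,
# but all the other nodes are fine:
print(is_heap_below_root([0, 7, 6, 3, 4, 5, 2]))   # -> True
```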

First, from the node being sifted, we go down along the larger descendants. The heap is binary, so each node has a left child and a right child, and we descend into the branch whose child is larger. This is where the bulk of the comparisons happen: the left and right children are compared with each other.
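As a standalone illustration of this descent (the example values are mine), we simply keep stepping to the larger child until we run out of children:

```python
# Walk from the root down to a leaf, always taking the larger child.
heap = [4, 9, 8, 5, 7, 2, 3]        # a heap everywhere except the root
i, n = 0, len(heap)
path = [heap[0]]
while i * 2 + 1 < n:
    left, right = i * 2 + 1, i * 2 + 2
    # the bulk of the comparisons: left child vs right child
    if right < n and heap[right] > heap[left]:
        i = right
    else:
        i = left
    path.append(heap[i])
print(path)                          # -> [4, 9, 7]
```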


Having reached a leaf at the last level, we have thereby chosen the branch whose values need to be shifted up. But we do not need to move the whole branch, only the part that is larger than the root we started from.

So we go back up the branch to the nearest node that is not smaller than the root.


The last step is to shift the node values up the branch using a buffer variable.
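A tiny standalone sketch of this last step (the values and indices are my own): the branch is held in a flat list, root first, and a single buffer variable shifts everything up one level while the old root drops into the found node.

```python
# Shift values up the branch with a single buffer variable.
branch = [5, 8, 9, 7]            # values along the chosen branch, root first
start, current = 0, 3            # 'current' = node found on the way back up
temp = branch[current]           # buffer the value at the found node
branch[current] = branch[start]  # the old root lands there
while current > start:
    current -= 1                 # one level up the branch
    temp, branch[current] = branch[current], temp
print(branch)                    # -> [8, 9, 7, 5]
```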


That's it. The bottom-up sift has formed a sorting tree from the array, in which no descendant is larger than its parent.

Final Animation:


Implementation in Python 3.7

The top-level sorting algorithm is no different from regular heapsort:

# Main heapsort routine
def HeapSortBottomUp(data):
    # Build the initial sorting tree:
    # traverse the array elements right to left
    # (those that have children) and sift each of them
    for start in range((len(data) - 2) // 2, -1, -1):
        HeapSortBottomUp_Sift(data, start, len(data) - 1)
    # The first array element always corresponds to the root of the
    # sorting tree and is therefore the maximum of the unsorted part.
    for end in range(len(data) - 1, 0, -1):
        # Swap this maximum with the last
        # element of the unsorted part of the array
        data[end], data[0] = data[0], data[end]
        # After the swap the root holds a non-maximal element,
        # so restore the sorting tree:
        # sift the unsorted part of the array
        HeapSortBottomUp_Sift(data, 0, end - 1)
    return data

The descent to the lowest leaf is conveniently placed in a separate function:

# Descend to the lowest leaf,
# always choosing the larger child
def HeapSortBottomUp_LeafSearch(data, start, end):
    current = start
    # Go down, deciding at each level which
    # child (left or right) is larger
    while True:
        child = current * 2 + 1  # Left child
        # Stop the loop if the right child is outside the array
        if child + 1 > end:
            break
        # Step to the larger child
        if data[child + 1] > data[child]:
            current = child + 1
        else:
            current = child
    # The left child may be the only one
    child = current * 2 + 1  # Left child
    if child <= end:
        current = child
    return current

And the most important part, the sift itself: first descending, then rising back up:

# Bottom-up sift
def HeapSortBottomUp_Sift(data, start, end):
    # Descend to the lowest level along the larger children
    current = HeapSortBottomUp_LeafSearch(data, start, end)
    # Climb up until we meet a node
    # greater than or equal to the subtree root
    while data[start] > data[current]:
        current = (current - 1) // 2
    # Remember the found node and
    # put the subtree root into it
    temp = data[current]
    data[current] = data[start]
    # Everything above on the branch, up to the root,
    # is shifted one level up
    while current > start:
        current = (current - 1) // 2
        temp, data[current] = data[current], temp

A C implementation was also found on the Internet:

/*----------------------------------------------------------------------*/
/* BOTTOM-UP HEAPSORT                                                   */
/* Written by J. Teuhola <>; the original idea is probably due to       */
/* R.W. Floyd. Thereafter it has been used by many authors, among       */
/* others S. Carlsson and I. Wegener. Building the heap bottom-up is    */
/* also due to R. W. Floyd: Treesort 3 (Algorithm 245),                 */
/* Communications of the ACM 7, p. 701, 1964.                           */
/*----------------------------------------------------------------------*/

#define element float

/* The sift-up procedure sinks a hole from v[i] to a leaf and then
   sifts the original v[i] element up from the leaf level. This is
   the main idea of bottom-up heapsort. */
static void siftup(v, i, n)
element v[];
int i, n;
{
    int j, start;
    element x;

    start = i;
    x = v[i];
    j = i << 1;
    /* Leaf search */
    while (j <= n) {
        if (j < n)
            if (v[j] < v[j + 1])
                j++;
        v[i] = v[j];
        i = j;
        j = i << 1;
    }
    /* Sift up */
    j = i >> 1;
    while (j >= start) {
        if (v[j] < x) {
            v[i] = v[j];
            i = j;
            j = i >> 1;
        } else
            break;
    }
    v[i] = x;
} /* End of siftup */

/* The heapsort procedure; the original array is r[0..n-1], but here
   it is shifted to vector v[1..n] for convenience. */
void bottom_up_heapsort(r, n)
element r[];
int n;
{
    int k;
    element x;
    element *v;

    v = r - 1; /* The address shift */
    /* Build the heap bottom-up, using siftup. */
    for (k = n >> 1; k > 1; k--)
        siftup(v, k, n);
    /* The main sorting loop. The root is swapped with the
       last leaf after each sift-up. */
    for (k = n; k > 1; k--) {
        siftup(v, 1, k);
        x = v[k];
        v[k] = v[1];
        v[1] = x;
    }
} /* End of bottom_up_heapsort */

Purely empirically, by my measurements, bottom-up heapsort works about 1.5 times faster than regular heapsort.
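To reproduce that kind of measurement, here is a self-contained sketch of my own that restates both sift variants compactly and counts comparisons on a random array (the function names are mine; the exact counts will vary with the input):

```python
import random

cmps = 0
def gt(a, b):                     # counting wrapper around ">"
    global cmps
    cmps += 1
    return a > b

def sift_classic(d, start, end):  # regular top-down sift
    root = start
    while root * 2 + 1 <= end:
        child = root * 2 + 1
        if child + 1 <= end and gt(d[child + 1], d[child]):
            child += 1
        if gt(d[child], d[root]):
            d[root], d[child] = d[child], d[root]
            root = child
        else:
            break

def sift_bottom_up(d, start, end):
    cur = start                   # descend along the larger child
    child = cur * 2 + 1
    while child + 1 <= end:
        cur = child + 1 if gt(d[child + 1], d[child]) else child
        child = cur * 2 + 1
    if child <= end:              # a lone left child at the bottom
        cur = child
    while gt(d[start], d[cur]):   # climb to a node >= the root
        cur = (cur - 1) // 2
    tmp, d[cur] = d[cur], d[start]
    while cur > start:            # shift the branch up one level
        cur = (cur - 1) // 2
        tmp, d[cur] = d[cur], tmp

def heapsort(d, sift):            # the same driver for both sifts
    for s in range((len(d) - 2) // 2, -1, -1):
        sift(d, s, len(d) - 1)
    for e in range(len(d) - 1, 0, -1):
        d[e], d[0] = d[0], d[e]
        sift(d, 0, e - 1)

random.seed(1)
a = [random.random() for _ in range(10000)]
results = {}
for name, sift in (("classic", sift_classic), ("bottom-up", sift_bottom_up)):
    cmps = 0
    b = a.copy()
    heapsort(b, sift)
    assert b == sorted(a)         # both variants must actually sort
    results[name] = cmps
print(results)
```

On my assumptions this prints noticeably fewer comparisons for the bottom-up variant; the wall-clock speedup also depends on how expensive a comparison is relative to a move.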

According to some sources (the algorithm's Wikipedia page, and the PDFs in the "Links" section), bottom-up heapsort on average outperforms even quicksort, for sufficiently large arrays of more than 16 thousand elements.


Links:

Bottom-up heapsort

A Variant of Heapsort with Almost Optimal Number of Comparisons

Building Heaps Fast

A new variant of heapsort beating, on an average, quicksort (if n is not very small)

Series Articles:

Today's sort has been added to the AlgoLab application; if you use it, update the Excel file with macros.