A "heap" is a binary tree which has the property that each child is "smaller" (for an appropriate definition of "smaller"; this generally requires that the data type be totally ordered) than the parent. (Larger can be used instead). Often implemented as an array, where the children of the element at position n are in positions 2n+1 and 2n+2.
The first element (the minimum or maximum, depending on the chosen order) can be found in O(1); removing it and exposing its successor takes O(log n). The obvious algorithm for building the heap requires O(n log n) operations.
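For reference, a sketch of that obvious algorithm in Python (max-heap, 0-based indexing): insert the elements one at a time and sift each new element up, giving n insertions of O(log n) each. The function names are my own.

 def sift_up(a, i):
     # Move a[i] up toward the root until its parent is at least as large.
     while i > 0 and a[(i - 1) // 2] < a[i]:
         a[(i - 1) // 2], a[i] = a[i], a[(i - 1) // 2]
         i = (i - 1) // 2

 def build_heap_naive(items):
     # n insertions, each O(log n) in the worst case: O(n log n) overall.
     a = []
     for x in items:
         a.append(x)
         sift_up(a, len(a) - 1)
     return a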
Used in the HeapSort algorithm; can also be used to implement a PriorityQueue.
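In Python, for example, the standard heapq module treats a plain list as a min-heap, which is already enough for a simple PriorityQueue (the tasks here are made-up sample data):

 import heapq

 pq = []
 heapq.heappush(pq, (2, "write report"))
 heapq.heappush(pq, (1, "fix bug"))
 heapq.heappush(pq, (3, "refile tickets"))

 while pq:
     priority, task = heapq.heappop(pq)   # smallest priority comes out first
     print(priority, task)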
Surprisingly, it is claimed that the heap can be built in only O(n) operations. Cormen/Leiserson/Rivest, section 7.3, "Building a heap", has the following algorithm:
 build-heap(A):
     heap-size[A] <- length[A]
     for i from length[A]/2 downto 1:
         heapify(A, i)

which uses a routine heapify (precondition: the subtrees rooted at the children of node i are heap-ordered; postcondition: the subtree rooted at node i itself is heap-ordered) described in the previous section. They prove that heapify takes time at most proportional to the height of the node it is called on, and use that to deduce that build-heap takes linear time. All looks fine to me.
Any chance of showing us the routine heapify?
In my code, the above snippet is called "heapify" and it calls "pushdown":
 pushdown(A, i):
     iChild <- 2*i
     if heap-size[A] < iChild:
         return
     if heap-size[A] > iChild and A[iChild] < A[iChild+1]:
         iChild <- iChild+1
     if A[i] < A[iChild]:
         swap(A, i, iChild)
         pushdown(A, iChild)

Popping an item off the heap is then:
 pop(A):
     top <- A[1]
     A[1] <- A[heap-size[A]]
     decrement heap-size[A]
     pushdown(A, 1)
     return top
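For comparison, a rough Python rendering of the same routines (0-based indexing, max-heap, with the list's own length standing in for heap-size); the details are mine, not taken verbatim from CLR or the pseudocode above:

 def heapify(a, i, size):
     # Precondition: the subtrees rooted at the children of i are heaps.
     # Postcondition: the subtree rooted at i is a heap.
     child = 2 * i + 1
     if child >= size:
         return
     if child + 1 < size and a[child] < a[child + 1]:
         child += 1                      # take the larger of the two children
     if a[i] < a[child]:
         a[i], a[child] = a[child], a[i]
         heapify(a, child, size)

 def build_heap(a):
     # Work bottom-up from the last internal node; total cost is O(n).
     for i in range(len(a) // 2 - 1, -1, -1):
         heapify(a, i, len(a))

 def pop(a):
     # Remove and return the maximum, i.e. the root of the max-heap.
     top = a[0]
     a[0] = a[-1]
     a.pop()
     heapify(a, 0, len(a))
     return top

 data = [3, 9, 2, 7, 5, 8]
 build_heap(data)
 print([pop(data) for _ in range(6)])   # [9, 8, 7, 5, 3, 2]

Popping every element in turn like this is essentially HeapSort.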
Not to be confused with the dynamic memory pool, often known as TheHeap.