CSC 378: Data Structures and Algorithm Analysis

Leftist Trees

[L&D pg 304-307, Weiss 6.6]

Mergeable Heaps

A "mergeable heap" is an ADT that stores an element from an ordered set, and supports the operations Insert, Min, and Extract-Min (just like priority queues). Mergeable heaps support an additional operation:

Merge( A, B )

Merge() takes two mergeable heaps and returns a single heap containing all of the elements of both. Duplicates are preserved.
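
As a rough picture of the interface in C (a sketch; the type and function names here are illustrative, not from the lecture):

    struct node;   /* a heap node; one possible definition appears below */

    struct node *heap_insert(struct node *h, int key);         /* Insert                  */
    int          heap_min(struct node *h);                     /* Min                     */
    struct node *heap_delete_min(struct node *h);              /* Extract-Min             */
    struct node *heap_merge(struct node *a, struct node *b);   /* Merge (duplicates kept) */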

Two implementations of the mergeable heap are the "leftist tree" and the "binomial heap". We'll talk about leftist trees in this lecture.

Leftist Tree Definition

A "leftist tree" is an implementation of a mergeable heap.

In a binary tree, define an external node to be one with fewer than two children.

Define dist( i ) to be the number of edges on the shortest path from node i to a descendant external node.

A leftist tree is a binary tree with the following properties:

  1. key( i ) ≥ key( parent( i ) )

    The root contains the minimum key. As with array heaps, we could have the maximum at the top, simply by changing this property and modifying the code accordingly.
  2. dist( right( i ) ) ≤ dist( left( i ) )

    The shortest path to a descendant external node is through the right child.
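
In C, a leftist tree node might be declared like this (a sketch; the dist field stores dist( i ) so that the second property can be checked in constant time):

    struct node {
        int          key;   /* the element stored at this node                          */
        int          dist;  /* edges on the shortest path to a descendant external node */
        struct node *left;  /* left child, or NULL                                      */
        struct node *right; /* right child, or NULL                                     */
    };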

Here's a leftist tree:

[example figure omitted]

Leftist Tree Operations

The operations supported by a leftist tree are those of the mergeable heap: Insert, Min, DeleteMin (that is, Extract-Min), and Merge.

Suppose that we have already implemented the Merge procedure (we'll discuss its implementation below). Merge can be used to build very simple Insert and DeleteMin procedures:
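
For instance, here is a sketch in C, using the struct node above; heap_merge is the Merge procedure developed in the next section, and make_node is an illustrative helper:

    #include <stdlib.h>

    struct node *heap_merge(struct node *a, struct node *b);   /* see next section */

    /* Allocate a one-node leftist tree: dist = 0, no children. */
    static struct node *make_node(int key)
    {
        struct node *t = malloc(sizeof *t);
        t->key  = key;
        t->dist = 0;
        t->left = t->right = NULL;
        return t;
    }

    /* Insert: merge the existing heap with a brand-new one-node tree. */
    struct node *heap_insert(struct node *h, int key)
    {
        return heap_merge(h, make_node(key));
    }

    /* DeleteMin: the minimum is at the root, so discard the root and merge
       its two subtrees.  Assumes h is not empty. */
    struct node *heap_delete_min(struct node *h)
    {
        struct node *rest = heap_merge(h->left, h->right);
        free(h);
        return rest;
    }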

Merging Leftist Trees

The Merge procedure takes two leftist trees, A and B, and returns a leftist tree that contains the union of the elements of A and B. In a program, a leftist tree is represented by a pointer to its root.

In merging, the simplest case occurs when one tree is empty (i.e. the pointer to the root is NULL). In this case, just return the other.

Otherwise, what is the root of the merged tree?

It's A's root if A has the smaller key; it's B's root if B has the smaller key.

Suppose A's root has the smaller key. (If this isn't the case, simply swap the A and B pointers.) Then we can merge right(A) with B:

right(A) ← Merge( right(A), B )

Now right(A) has changed. Its dist() may have grown larger in the process, violating the second property above. If so, just swap right(A) and left(A):

if dist(right(A)) > dist(left(A)), swap right(A) and left(A).

Finally, since dist(right(A)) might have changed, we have to update dist(A):

dist(A) ← 1 + dist(right(A))

Of course, if right(A) = NULL, we'll assume that dist(right(A)) = -1.


So the whole Merge procedure looks like this:
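
A minimal sketch in C, following the steps described above (ndist is an illustrative helper that returns -1 for a NULL child):

    /* dist of a possibly-missing subtree: a NULL child counts as -1. */
    static int ndist(struct node *t)
    {
        return (t == NULL) ? -1 : t->dist;
    }

    struct node *heap_merge(struct node *a, struct node *b)
    {
        /* Simplest case: one tree is empty, so return the other. */
        if (a == NULL) return b;
        if (b == NULL) return a;

        /* Make a the tree whose root holds the smaller key. */
        if (b->key < a->key) {
            struct node *tmp = a;  a = b;  b = tmp;
        }

        /* Merge b into a's right subtree. */
        a->right = heap_merge(a->right, b);

        /* Restore the leftist property if the right side got too deep. */
        if (ndist(a->right) > ndist(a->left)) {
            struct node *tmp = a->left;  a->left = a->right;  a->right = tmp;
        }

        /* dist(a) is one more than the dist of the (possibly new) right child. */
        a->dist = 1 + ndist(a->right);
        return a;
    }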

Note: if a node's child is NULL, that child's dist is taken to be -1; the ndist helper in the sketch above handles this case explicitly.

Analysis of Merge

Since each recursive call goes down either A's right path, which has dist(A) edges, or B's right path, which has dist(B) edges, the number of calls is at most dist(A) + dist(B) plus a constant, and the running time of Merge is

O( dist(A) + dist(B) )

since the other operations in the body of Merge are O(1). In other words, the running time is

O( dist(A) ),

if we assume (without loss of generality) that dist(A) ≥ dist(B). But how big can dist(A) be in a tree of n_A nodes, where n_A is the number of nodes in A? Let's look at some examples:
if dist(A) = 0, n_A ≥ 1.
if dist(A) = 1, n_A ≥ 3.
if dist(A) = 2, n_A ≥ 7.

So the pattern is:

n_A ≥ 2^( dist(A) + 1 ) - 1

This makes sense, since the tree must be completely full down to level dist(A) (otherwise there would be an external node fewer than dist(A) edges from the root, contradicting the definition of dist).

A tree that is full down to level dist(A) has 2^( dist(A) + 1 ) - 1 nodes in those levels, so a general leftist tree of n_A nodes (which might have nodes dangling below the full levels) satisfies

n_A ≥ 2^( dist(A) + 1 ) - 1

Taking logarithms gives dist(A) ≤ log2( n_A + 1 ) - 1, so dist(A) ∈ O( log n_A ). With n = n_A + n_B the total number of nodes, the running time of Merge is in

O( log n )

As a consequence, the running times of Insert and DeleteMin are also in O( log n ).

'Merge' in Action
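
As a small worked example, here is a hypothetical driver for the sketches above: it builds two heaps by insertion, merges them, and extracts the minimums in increasing order.

    #include <stdio.h>

    int main(void)
    {
        struct node *a = NULL, *b = NULL;

        /* Build two leftist trees by repeated insertion. */
        a = heap_insert(a, 5);  a = heap_insert(a, 1);  a = heap_insert(a, 7);
        b = heap_insert(b, 8);  b = heap_insert(b, 2);

        /* Merge them, then delete the minimum until nothing is left. */
        struct node *h = heap_merge(a, b);
        while (h != NULL) {
            printf("%d ", h->key);       /* the root always holds the minimum */
            h = heap_delete_min(h);
        }
        printf("\n");                    /* prints: 1 2 5 7 8 */
        return 0;
    }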





Copyright 1998 James Stewart