
**Balanced Search Trees**

In Sections 1.4, 4.5, and 5.3, we discussed the binary search tree—one of the principal data structures for implementing dictionaries. It is a binary tree whose nodes contain elements of a set of orderable items, one element per node, so that all elements in the left subtree are smaller than the element in the subtree's root, and all the elements in the right subtree are greater than it. Note that this transformation from a set to a binary search tree is an example of the representation-change technique. What do we gain by such a transformation compared to the straightforward implementation of a dictionary by, say, an array? We gain in the time efficiency of searching, insertion, and deletion, which are all in Θ(log *n*) in the average case. But in the worst case, when the tree degenerates into a severely unbalanced one, these operations are in Θ(*n*).

Computer scientists have expended a lot of effort in trying to find a structure that preserves the good properties of the classical binary search tree—principally, the logarithmic efficiency of the dictionary operations and having the set's elements sorted—but avoids its worst-case degeneracy. They have come up with two approaches.

The first approach is of the instance-simplification variety: an unbalanced binary search tree is transformed into a balanced one. Because of this, such trees are called **self-balancing**. Specific implementations of this idea differ by their definition of balance. An **AVL tree** requires that the difference between the heights of the left and right subtrees of every node never exceed 1.

The second approach is of the representation-change variety: allow more than one element in a node of a search tree. Specific cases of such trees are **2-3 trees**, 2-3-4 trees, and the more general B-trees; they differ in the number of elements admissible in a single node, but all are perfectly balanced.

**AVL Trees**

AVL trees were invented in 1962 by two Russian
scientists, G. M. Adelson-Velsky and E. M. Landis [Ade62], after whom this data
structure is named.

**DEFINITION** An **AVL tree** is a binary search tree in which the **balance factor** of every node, defined as the difference between the heights of the node's left and right subtrees, is either 0 or +1 or −1. (The height of the empty tree is defined as −1. Of course, the balance factor can also be computed as the difference between the numbers of levels rather than the height difference of the node's left and right subtrees.)
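The balance factor defined above can be computed directly from subtree heights. The following sketch, using a hypothetical `Node` class that is not part of the text, checks whether a binary search tree satisfies the AVL balance requirement:

```python
# Sketch: heights and balance factors for a binary tree.
# The Node class and tree shapes are illustrative assumptions.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    # The height of the empty tree is defined as -1, as in the text.
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    # Difference between the heights of the left and right subtrees.
    return height(node.left) - height(node.right)

def is_avl(node):
    # A BST is an AVL tree if every node's balance factor is -1, 0, or +1.
    if node is None:
        return True
    return (abs(balance_factor(node)) <= 1
            and is_avl(node.left) and is_avl(node.right))
```

For example, a tree with root 5, left subtree rooted at 3 (children 2 and 4), and right child 8 passes the check, while the degenerate chain 1 → 2 → 3 fails it because the root's balance factor is −2.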

For example, the binary search tree in Figure
6.2a is an AVL tree but the one in Figure 6.2b is not.

If an insertion of a new node makes an AVL tree unbalanced, we transform the tree by a rotation. A **rotation** in an AVL tree is a local transformation of its subtree rooted at a node whose balance has become either +2 or −2. If there are several such nodes, we rotate the tree rooted at the unbalanced node that is the closest to the newly inserted leaf. There are only four types of rotations; in fact, two of them are mirror images of the other two. In their simplest form, the four rotations are shown in Figure 6.3.

The first rotation type is called the **single right rotation**, or ***R*-rotation**: imagine rotating the edge connecting the unbalanced node and its left child to the right.

The symmetric **single left rotation**, or ***L*-rotation**, is the mirror image of the single *R*-rotation.

The second rotation type is called the **double left-right rotation** (***LR*-rotation**). It is, in fact, a combination of two rotations: we perform the *L*-rotation of the left subtree of the unbalanced node followed by the *R*-rotation of the new tree rooted at that node.

The **double right-left rotation** (***RL*-rotation**) is the mirror image of the double *LR*-rotation and is left for the exercises.
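In code, a single rotation amounts to a constant number of pointer updates. The sketch below, again with a hypothetical `Node` class, performs the single *R*-rotation on a subtree whose left side has grown too tall; an in-order traversal confirms that the binary-search-tree ordering survives the transformation:

```python
# Sketch of the single right (R-) rotation, assuming a simple Node
# class with key/left/right fields (not defined in the text).

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(r):
    """Rotate the subtree rooted at r to the right; return the new root.

    Before:      r           After:      c
                / \                     / \
               c   T3                 T1   r
              / \                         / \
            T1   T2                     T2   T3
    """
    c = r.left
    r.left = c.right   # T2, whose keys lie between c and r, moves under r
    c.right = r
    return c           # c becomes the new root of this subtree

def inorder(node):
    # In-order traversal; the sorted order must survive the rotation.
    return inorder(node.left) + [node.key] + inorder(node.right) if node else []
```

Rotating the chain 3 → 2 → 1 (each node the left child of the previous one) yields the balanced tree rooted at 2, and `inorder` still returns `[1, 2, 3]`, illustrating why the rotation is a constant-time, order-preserving operation.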

Note that the rotations are not trivial transformations, though fortunately they can be done in constant time. Not only should they guarantee that a resulting tree is balanced, but they should also preserve the basic requirements of a binary search tree. For example, in the initial tree of Figure 6.4, all the keys of subtree *T*_{1} are smaller than *c*, which is smaller than all the keys of subtree *T*_{2}, which are smaller than *r*, which is smaller than all the keys of subtree *T*_{3}; the same relationships among the keys hold, as they must, for the tree after the rotation.

An example of constructing an AVL tree for a
given list of numbers is shown in Figure 6.6. As you trace the algorithm’s
operations, keep in mind that if there are several nodes with the ±2 balance, the rotation is done for the tree rooted at the
unbalanced node that is the closest to the newly inserted leaf.

How efficient are AVL trees? As with any search tree, the critical characteristic is the tree's height. It turns out that it is bounded both above and below by logarithmic functions. Specifically, the height *h* of any AVL tree with *n* nodes satisfies the inequalities

⌊log_{2} *n*⌋ ≤ *h* < 1.4405 log_{2}(*n* + 2) − 1.3277.

(These weird-looking constants are round-offs
of some irrational numbers related to Fibonacci numbers and the golden
ratio—see Section 2.5.)
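The upper end of this range is approached by the sparsest possible AVL trees, whose minimal node counts obey a Fibonacci-like recurrence. The sketch below (the function name is ours) checks the stated inequalities numerically for these worst-case trees:

```python
import math

# Sketch: checking the AVL height bounds numerically. min_nodes(h) is the
# smallest number of nodes an AVL tree of height h can have; the recurrence
# N(h) = N(h-1) + N(h-2) + 1 is the Fibonacci-style relation behind the
# irrational constants in the bound.

def min_nodes(h):
    if h == 0:
        return 1        # a single node
    if h == 1:
        return 2        # a root with one child
    return min_nodes(h - 1) + min_nodes(h - 2) + 1

for h in range(2, 20):
    n = min_nodes(h)    # sparsest AVL tree of height h
    lower = math.floor(math.log2(n))
    upper = 1.4405 * math.log2(n + 2) - 1.3277
    assert lower <= h < upper
```

For instance, the sparsest AVL tree of height 4 has 12 nodes, and the bound 4 < 1.4405 log₂ 14 − 1.3277 ≈ 4.16 holds with little room to spare, which is exactly what the Fibonacci-related constants predict.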

The inequalities immediately imply that the operations of searching and insertion are Θ(log *n*) in the worst case.

The operation of key deletion in an AVL tree is
considerably more difficult than insertion, but fortunately it turns out to be
in the same efficiency class as insertion, i.e., logarithmic.

These impressive efficiency characteristics come at a price, however. The drawbacks of AVL trees are frequent rotations and the need to maintain balances for its nodes. These drawbacks have prevented AVL trees from becoming the standard structure for implementing dictionaries. At the same time, their underlying idea—that of rebalancing a binary search tree via rotations—has proved to be very fruitful and has led to discoveries of other interesting variations of the classical binary search tree.

**2-3 Trees**

As mentioned at the beginning of this section, the second idea of balancing a search tree is to allow more than one key in the same node of such a tree. The simplest implementation of this idea is 2-3 trees, introduced by the U.S. computer scientist John Hopcroft in 1970. A **2-3 tree** is a tree that can have nodes of two kinds: 2-nodes and 3-nodes. A **2-node** contains a single key *K* and has two children: the left child serves as the root of a subtree whose keys are less than *K*, and the right child serves as the root of a subtree whose keys are greater than *K*. A **3-node** contains two ordered keys *K*_{1} < *K*_{2} and has three children: the leftmost child serves as the root of a subtree with keys less than *K*_{1}, the middle child serves as the root of a subtree with keys between *K*_{1} and *K*_{2}, and the rightmost child serves as the root of a subtree with keys greater than *K*_{2}.

The last requirement of the 2-3 tree is that
all its leaves must be on the same level. In other words, a 2-3 tree is always
perfectly height-balanced: the length of a path from the root to a leaf is the
same for every leaf. It is this property that we “buy” by allowing more than
one key in the same node of a search tree.

Searching for a given key *K* in a 2-3 tree is quite straightforward. We start at the root. If the root is a 2-node, we act as if it were a binary search tree: we either stop if *K* is equal to the root's key or continue the search in the left or right subtree if *K* is, respectively, smaller or larger than the root's key. If the root is a 3-node, we know after no more than two key comparisons whether the search can be stopped (if *K* is equal to one of the root's keys) or in which of the root's three subtrees it needs to be continued.
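As a sketch, the per-node logic of this search can be written as follows; the dictionary-based node representation, with a sorted list of one or two keys and a list of children (empty for a leaf), is an assumption for illustration, not the text's:

```python
# Sketch: searching a 2-3 tree. A node is a dict with a sorted list of
# one or two "keys" and a list of "children" (empty for a leaf).
# This representation is a hypothetical choice made for this example.

def search_23(node, k):
    """Return True if key k is present in the 2-3 tree rooted at node."""
    if node is None:
        return False
    keys, children = node["keys"], node["children"]
    # At most two comparisons decide whether to stop at this node
    # or which of its two or three subtrees to descend into.
    for i, key in enumerate(keys):
        if k == key:
            return True
        if k < key:
            return search_23(children[i], k) if children else False
    # k is greater than every key in this node: take the rightmost child.
    return search_23(children[-1], k) if children else False
```

For a small tree with a 2-node root holding 5, a 3-node left leaf holding 2 and 3, and a right leaf holding 8, `search_23` finds 3 and 5 but correctly reports 7 as absent.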

Inserting a new key in a 2-3 tree is done as follows. First of all, we always insert a new key *K* in a leaf, except for the empty tree. The appropriate leaf is found by performing a search for *K*. If the leaf in question is a 2-node, we insert *K* there as either the first or the second key, depending on whether *K* is smaller or larger than the node's old key. If the leaf is a 3-node, we split the leaf in two: the smallest of the three keys (two old ones and the new key) is put in the first leaf, the largest key is put in the second leaf, and the middle key is promoted to the old leaf's parent (if the leaf happens to be the root, a new root is created for the middle key). Such a promotion can, in turn, overflow the parent and hence can lead to several node splits along the chain of the leaf's ancestors.

An example of a 2-3 tree construction is given
in Figure 6.8.

As for any search tree, the efficiency of the dictionary operations depends on the tree's height. So let us first find an upper bound for it. A 2-3 tree of height *h* with the smallest number of keys is a full tree of 2-nodes (such as the final tree in Figure 6.8 for *h* = 2). Therefore, for any 2-3 tree of height *h* with *n* nodes, we get the inequality

*n* ≥ 1 + 2 + ... + 2^{h} = 2^{h+1} − 1,

and hence

*h* ≤ log_{2}(*n* + 1) − 1.

On the other hand, a 2-3 tree of height *h* with the largest number of keys is a full tree of 3-nodes, each with two keys and three children. Therefore, for any 2-3 tree with *n* nodes,

*n* ≤ 2 · 1 + 2 · 3 + ... + 2 · 3^{h} = 2(1 + 3 + ... + 3^{h}) = 3^{h+1} − 1,

and hence

*h* ≥ log_{3}(*n* + 1) − 1.

These lower and upper bounds on height *h*,

log_{3}(*n* + 1) − 1 ≤ *h* ≤ log_{2}(*n* + 1) − 1,

imply that the time efficiencies of searching, insertion, and deletion are all in Θ(log *n*).
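A quick numeric check of the two extreme shapes (the variable names are ours) confirms that the full-tree key counts invert to exactly the stated height bounds:

```python
import math

# Sketch: checking the 2-3 tree height bounds on the two extreme shapes.
# A full tree of 2-nodes of height h holds 2**(h+1) - 1 keys; a full tree
# of 3-nodes holds 3**(h+1) - 1 keys (two keys per node).

for h in range(0, 15):
    n_min = 2 ** (h + 1) - 1        # fewest keys possible at height h
    n_max = 3 ** (h + 1) - 1        # most keys possible at height h
    # The formulas invert exactly: h = log2(n_min+1) - 1 = log3(n_max+1) - 1.
    assert h == round(math.log2(n_min + 1) - 1)
    assert h == round(math.log(n_max + 1, 3) - 1)
    # Both extremes satisfy log3(n+1) - 1 <= h <= log2(n+1) - 1.
    for n in (n_min, n_max):
        assert math.log(n + 1, 3) - 1 <= h <= math.log2(n + 1) - 1 + 1e-9
```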

**Exercises**

**1.** Which of the following binary trees are AVL trees?

**2. a.** For *n* = 1, 2, 3, 4, and 5, draw all the binary trees with *n* nodes that satisfy the balance requirement of AVL trees.

&nbsp;&nbsp;&nbsp;**b.** Draw a binary tree of height 4 that can be an AVL tree and has the smallest number of nodes among all such trees.

**3.** Draw diagrams of the single *L*-rotation and of the double *RL*-rotation in their general form.

**4.** For each of the following lists, construct an AVL tree by inserting their elements successively, starting with the empty tree.

&nbsp;&nbsp;&nbsp;**a.** 1, 2, 3, 4, 5, 6

&nbsp;&nbsp;&nbsp;**b.** 6, 5, 4, 3, 2, 1

&nbsp;&nbsp;&nbsp;**c.** 3, 6, 5, 1, 2, 4

**5. a.** For an AVL tree containing real numbers, design an algorithm for computing the range (i.e., the difference between the largest and smallest numbers in the tree) and determine its worst-case efficiency.

&nbsp;&nbsp;&nbsp;**b.** True or false: the smallest and the largest keys in an AVL tree can always be found on either the last level or the next-to-last level?

**6.** Write a program for constructing an AVL tree for a given list of *n* distinct integers.

**7. a.** Construct a 2-3 tree for the list C, O, M, P, U, T, I, N, G. Use the alphabetical order of the letters and insert them successively starting with the empty tree.

&nbsp;&nbsp;&nbsp;**b.** Assuming that the probabilities of searching for each of the keys (i.e., the letters) are the same, find the largest number and the average number of key comparisons for successful searches in this tree.

**8.** Let *T*_{B} and *T*_{2-3} be, respectively, a classical binary search tree and a 2-3 tree constructed for the same list of keys inserted in the corresponding trees in the same order. True or false: searching for the same key in *T*_{2-3} always takes fewer or the same number of key comparisons as searching in *T*_{B}?

**9.** For a 2-3 tree containing real numbers, design an algorithm for computing the range (i.e., the difference between the largest and smallest numbers in the tree) and determine its worst-case efficiency.

**10.** Write a program for constructing a 2-3 tree for a given list of *n* integers.


Introduction to the Design and Analysis of Algorithms: Transform and Conquer: Balanced Search Trees: AVL Trees and 2-3 Trees

