# Aggregate method

The aggregate method (also aggregation method or holistic method) is a procedure of amortized (runtime) analysis. The aggregate method determines the average cost of a single operation by first determining the total cost of all operations and then dividing it by the number of operations.

## Examples

### Binary counter

The aggregate method is illustrated here with the example of a binary counter, whose only operation is an increment.

The worst case for a binary counter with k bits occurs when all k bits have to be flipped during an increment. Let the cost of a single bit change be 1. According to the worst-case estimate, n operations would then cost nk. However, this estimate is too pessimistic. Amortized analysis aims for a more realistic and less pessimistic estimate of the costs.

Let us consider the number of bit changes for a 4-bit counter:

| Counter | Bit changes |
|---------|-------------|
| 0000    | –           |
| 0001    | 1           |
| 0010    | 2           |
| 0011    | 1           |
| 0100    | 3           |
| 0101    | 1           |
| 0110    | 2           |
| 0111    | 1           |
| 1000    | 4           |
| ...     |             |

Looking at the sequence of bit changes, it is noticeable that the lowest bit changes with every increment, the next higher bit with every second increment, the one after that with every fourth, and so on. This results in the following total number of bit changes for n increments:

$n + \left\lfloor \frac{n}{2} \right\rfloor + \left\lfloor \frac{n}{2^2} \right\rfloor + \left\lfloor \frac{n}{2^3} \right\rfloor + \cdots + \left\lfloor \frac{n}{2^k} \right\rfloor \leq n \sum_{i=0}^{k} \frac{1}{2^i}$

We can bound this sum from above:

$n \sum_{i=0}^{k} \frac{1}{2^i} \leq n \sum_{i=0}^{\infty} \frac{1}{2^i}$

The value of this geometric series is well known to be 2. From this it follows:

$n \sum_{i=0}^{k} \frac{1}{2^i} \leq 2n$

If we now compute the amortized cost $a_i$ of a single operation $Op_i$ out of the total of n operations, by dividing the total cost determined above by the number $n$ of operations, we get:

$a_i \leq \frac{2n}{n} = 2$

This means that the amortized cost of an operation is at most 2 and thus in O(1), regardless of how many bits the counter has in total.
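This bound can be checked empirically. The following sketch (with hypothetical helper names, not from the source) increments a k-bit counter n times, counts the bit flips, and confirms that the average cost per increment stays below 2:

```python
def increment(bits):
    """Increment a binary counter (list of 0/1 bits, least significant first).
    Returns the number of bits flipped, i.e. the cost of this operation."""
    flips = 0
    i = 0
    # Turn the trailing run of 1-bits into 0s (one flip each).
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0
        flips += 1
        i += 1
    # Set the first 0-bit to 1, unless the counter overflowed.
    if i < len(bits):
        bits[i] = 1
        flips += 1
    return flips

def amortized_cost(k, n):
    """Average number of bit flips per increment over n increments
    of a k-bit counter that starts at zero."""
    bits = [0] * k
    total = sum(increment(bits) for _ in range(n))
    return total / n

# At most 2 flips per operation on average, no matter how large k is.
print(amortized_cost(k=16, n=1000))  # 1.994
```

The total of 1994 flips for n = 1000 matches the sum n + ⌊n/2⌋ + ⌊n/4⌋ + ⋯ computed above.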

### Dictionary

Binary search trees are an extremely common type of data structure. They solve, for example, the "dictionary" problem (see binary search tree § Motivation): their balanced variants support the most important operations in logarithmic worst-case time. Statements about their amortized runtime behavior can be found in the corresponding article.

Here, a data structure called the amortized dictionary data structure is presented, whose amortized runtime behavior for searching is in O(log² n) and for insertion is in O(log n).

Let the number n of entries be written in binary representation:

$n =: \sum_{i=0}^{k} \lambda_i 2^i \quad \text{with} \quad \lambda_i \in \{0, 1\}$

The data structure then consists of k + 1 sorted sequences $\Lambda_0, \dots, \Lambda_k$, where $\Lambda_i$ has length $2^i$ and is either completely empty ($\lambda_i = 0$) or completely full ($\lambda_i = 1$). The individual elements of the data structure are distributed arbitrarily among these sequences.

Example: Let n = 11 (so 11 = 1 + 2 + 8 and k = 3). The elements are C, D, E, F, H, J, M, P, S, W and Y, which are distributed over the data structure as follows:

    Λ0: [E]                        λ0 = 1
    Λ1: [D, H]                     λ1 = 1
    Λ2: empty                      λ2 = 0
    Λ3: [C, F, J, M, P, S, W, Y]   λ3 = 1

A search operation is carried out by a binary search in each non-empty sequence $\Lambda_i$, plus a logical combination of the results, so that in the worst case the runtime is

$\sum_{i=0}^{k} \lambda_i \lceil \log(2^i + 1) \rceil + k + 1 = \sum_{i=0}^{k} \lambda_i (i + 1) + k + 1 = O(\log^2 n).$
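This search can be sketched as follows, assuming the structure is represented as a plain list of sorted Python lists (a representation chosen for illustration; `search` is a hypothetical helper):

```python
from bisect import bisect_left

def search(sequences, key):
    """Search the amortized dictionary: sequences[i] is either an empty
    list (lambda_i = 0) or a sorted list of length 2**i (lambda_i = 1).
    Binary-search each non-empty sequence and combine the results."""
    for seq in sequences:
        if seq:  # lambda_i == 1
            pos = bisect_left(seq, key)  # O(log 2**i) comparisons
            if pos < len(seq) and seq[pos] == key:
                return True
    return False

# The example structure from the article (n = 11):
structure = [["E"], ["D", "H"], [], ["C", "F", "J", "M", "P", "S", "W", "Y"]]
print(search(structure, "H"))  # True
print(search(structure, "K"))  # False
```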

An insertion uses merging (as in mergesort), whose cost is the sum of the lengths of the two sequences involved. To insert the letter K, a sequence Λ of length 1 with content K is formed. If Λ0 is empty (frequency 1/2), Λ becomes the new Λ0 and we are done. If not (as in the example above; frequency 1/2), we merge Λ with Λ0 at cost 1 + 1 and call the result Λ again. If Λ1 is then empty (frequency 1/4), Λ becomes the new Λ1 and we are done. If not (frequency 1/4), we merge Λ with Λ1 at cost 2 + 2 and again call the result Λ. If Λ2 is then empty (as in the example above; frequency 1/8), Λ becomes the new Λ2 and we are done. If not (frequency 1/8), the process continues in the same way. In the example above, inserting K results in:

    Λ0: empty                      λ0 = 0
    Λ1: empty                      λ1 = 0
    Λ2: [D, E, H, K]               λ2 = 1
    Λ3: [C, F, J, M, P, S, W, Y]   λ3 = 1

Averaged over the insertions, the total cost is therefore at most

$\frac{1}{2}(1 + 1) + \frac{1}{4}(2 + 2) + \cdots + 2^{-(k+1)}(2^k + 2^k) = k + 1 \in O(\log n).$
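The cascading merges mirror a binary increment: each full slot Λi is merged into the carry and emptied, until an empty slot absorbs the carry. A minimal sketch, again assuming the list-of-sorted-lists representation (`insert` is a hypothetical helper, not from the source):

```python
from heapq import merge  # lazily merges already-sorted iterables

def insert(sequences, key):
    """Insert key into the amortized dictionary. sequences[i] is empty or
    a sorted list of length 2**i. Works like incrementing a binary counter:
    the carry is a sorted run that doubles at each occupied slot."""
    carry = [key]  # a sorted run of length 2**0
    i = 0
    while i < len(sequences) and sequences[i]:
        # Slot i is full: merge it into the carry (cost = sum of lengths).
        carry = list(merge(carry, sequences[i]))
        sequences[i] = []
        i += 1
    if i == len(sequences):
        sequences.append([])  # the counter grows by one bit (k + 1 slots)
    sequences[i] = carry

# Inserting K into the example structure empties slots 0 and 1
# and fills slot 2, exactly as in the article.
structure = [["E"], ["D", "H"], [], ["C", "F", "J", "M", "P", "S", "W", "Y"]]
insert(structure, "K")
print(structure[2])  # ['D', 'E', 'H', 'K']
```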
#### Result

With the presented data structure, the amortized cost of an insertion is in O(log n).

#### Comment

The insertion costs are therefore no better than with AVL trees or red-black trees, where pure insertions (the tree modifications alone) take amortized constant time, although the search for the insertion position, at O(log n), must still be added.
Remarkably, however, the insertion cost here is lower than the corresponding pure search cost.

## Distinction from the accounting method

In contrast to the accounting method, the aggregate method assigns the same amortized cost to all types of operations; the accounting method allows different amortized costs to be assigned to different types of operations. In addition, the accounting method books the difference between amortized and actual costs to an account.

## Literature

• Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein: Introduction to Algorithms. 2nd Edition. MIT Press and McGraw-Hill, 2001, ISBN 0-262-03293-7, pp. 406–410.

## References

1. Lecture 7: Amortized Analysis. Retrieved October 4, 2016.