Convergence speed


The speed of convergence (also called the order of convergence) is the speed with which the terms of a convergent sequence approach their limit. In numerical mathematics, the speed of convergence is an important quality criterion of iterative methods, alongside the computational effort per iteration and numerical stability.

Convergence order

Let $(x_n)$ be a sequence with the limit $\bar x$. To avoid tedious case distinctions, terms with $x_n = \bar x$ and other repetitions are omitted.

There is linear convergence if

$$\limsup_{n\to\infty} \frac{\|x_{n+1}-\bar x\|}{\|x_n-\bar x\|} = c < 1 .$$

Some authors refer to $c$ as the convergence rate (English: rate of convergence, French: taux de convergence). The smaller $c$ is, the faster the sequence converges, that is to say: the fewer terms are required, at least asymptotically.
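
For a scalar sequence, the rate $c$ can be read off numerically from the quotients of successive errors. A minimal sketch in Python (the sequence $x_n = 3 + 2^{-n}$ and its limit 3 are made-up illustration values):

```python
# Estimate the linear convergence rate c from the quotients of successive errors.
# Illustrative sequence: x_n = 3 + 2**(-n), which converges to 3 with rate c = 1/2.

def error_quotients(x, limit):
    """Quotients |x_{n+1} - limit| / |x_n - limit| of a sequence given as a list."""
    errors = [abs(v - limit) for v in x]
    return [e1 / e0 for e0, e1 in zip(errors, errors[1:]) if e0 != 0]

x = [3 + 2.0 ** (-n) for n in range(20)]
print(error_quotients(x, 3.0)[-5:])   # the quotients approach c = 0.5
```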

Sublinear (underlinear) convergence is present if $c = 1$. If the sequence converges sublinearly and additionally

$$\lim_{n\to\infty} \frac{\|x_{n+2}-x_{n+1}\|}{\|x_{n+1}-x_n\|} = 1 ,$$

then one speaks of logarithmic convergence.
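
A small numerical check of both criteria for the sequence $x_n = 1/\ln n$, which converges to 0 (a sketch; the index range is an arbitrary choice):

```python
import math

# x_n = 1/ln(n) converges to 0; check the two criteria numerically.
x = [1.0 / math.log(n) for n in range(2, 100000)]

# error quotients |x_{n+1}| / |x_n| tend to 1  ->  sublinear convergence
q = [b / a for a, b in zip(x, x[1:])]
print(q[-1])

# quotients of successive differences tend to 1  ->  logarithmic convergence
d = [abs(b - a) for a, b in zip(x, x[1:])]
r = [b / a for a, b in zip(d, d[1:])]
print(r[-1])
```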

Superlinear convergence is present if there is a sequence $(c_n)$ of numbers converging to zero with

$$\|x_{n+1}-\bar x\| \le c_n \, \|x_n-\bar x\| .$$

A sequence that converges superlinearly will converge faster than linearly.

Convergence of order $q$ (or Q-order of convergence (at least) $q$) with $q > 1$ occurs if $(x_n)$ converges and there exists a constant $C \ge 0$ such that

$$\|x_{n+1}-\bar x\| \le C \, \|x_n-\bar x\|^{q} \quad \text{for all } n .$$
One also finds formulations in the literature such as "converges with Q-order (at least) $q$" for the same fact. The Q comes from quotient, because the Q-order is defined via the quotient of two consecutive terms. If a sequence converges with Q-order $q$, then it also converges with Q-order $q'$ for every $q'$ with $1 < q' < q$.

It is said that the sequence has the exact Q-order $q$ if there are positive constants $c$ and $C$ with

$$c \, \|x_n-\bar x\|^{q} \le \|x_{n+1}-\bar x\| \le C \, \|x_n-\bar x\|^{q} .$$

The exact Q-order is unique if it exists.

For $q = 2$ one speaks of quadratic convergence. Convergence of order $q > 1$ implies superlinear convergence (i.e. convergence rate $c = 0$), and superlinear convergence implies linear convergence.

Convergence of order $q$ means that in each iteration step the number of correct decimal places (or the number of digits in any positional number system) is approximately multiplied by $q$; with quadratic convergence, for example, it is thus doubled.
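
This can be observed directly for a sequence with exact Q-order 2, e.g. $a_{n+1} = a_n^2$ with $a_0 = 1/10$ (an illustrative sketch; the starting value and the working precision are arbitrary choices):

```python
from decimal import Decimal, getcontext

getcontext().prec = 150           # enough digits to see several doublings

a = Decimal("0.1")                # a_{n+1} = a_n^2 converges to 0 with exact Q-order 2
prev_err = None
for n in range(6):
    err = a                                   # error |a_n - 0|
    digits = -err.log10()                     # roughly the number of correct decimal places
    line = f"n={n}: ~{float(digits):5.1f} correct digits"
    if prev_err is not None:
        # empirical order ln(e_n) / ln(e_(n-1)) -> q = 2
        line += f", empirical order {float(err.ln() / prev_err.ln()):.2f}"
    print(line)
    prev_err = err
    a = a * a
```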

Convergence acceleration is mostly limited to power series that converge linearly. As a rule, only the convergence rate (and not the Q-order) improves, which can nevertheless mean a significant reduction in the total effort (possibly at a greater cost per iteration). Methods of order greater than 1 do not exist for every problem class. In the case of iteration methods, stability properties must also be taken into account.
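
One classical acceleration technique for linearly convergent sequences is Aitken's Δ² process. A minimal sketch, applied to the linearly convergent fixed-point iteration $x_{n+1} = \cos(x_n)$ (an illustrative choice, not taken from the text above):

```python
import math

def aitken(x):
    """Aitken's delta-squared transform of a sequence given as a list of floats."""
    out = []
    for a, b, c in zip(x, x[1:], x[2:]):
        denom = c - 2 * b + a
        out.append(c - (c - b) ** 2 / denom if denom != 0 else c)
    return out

# Fixed-point iteration x_{n+1} = cos(x_n): converges linearly to 0.739085...
x = [1.0]
for _ in range(15):
    x.append(math.cos(x[-1]))

limit = 0.7390851332151607          # fixed point of cos, for reference
print("plain   error:", abs(x[10] - limit))
print("Aitken  error:", abs(aitken(x)[10] - limit))
```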

Examples

  • The sequence $\left(\tfrac{1}{\ln n}\right)_{n \ge 2}$ converges logarithmically.
  • The sequence $\left(\tfrac{1}{n}\right)$ converges sublinearly.
  • The sequence $\left(c^n\right)$ with $0 < c < 1$ converges linearly with convergence rate $c$.
  • The sequence $\left(2^{-n!}\right)$ has Q-order of convergence $q$ for every $q > 1$.
  • The zero sequence $(a_n)$ with $a_n = 2^{-2^n}$, i.e. $\tfrac12, \tfrac14, \tfrac1{16}, \tfrac1{256}, \ldots$, converges quadratically (each term is the square of the previous one).
  • Newton's method converges quadratically at a simple zero. Simplified variants of Newton's method converge more slowly, partly superlinearly, partly only with first order; in return, an iteration step may be significantly cheaper than one of Newton's method.
  • Fixed-point methods whose convergence is proven with Banach's fixed point theorem (for example splitting methods) have at least a linear speed of convergence.
  • The secant method has a fractional order of convergence $q = \tfrac{1+\sqrt5}{2} \approx 1.618$ (the golden ratio); in particular, it converges superlinearly (see the sketch after this list for an empirical comparison with Newton's method).
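
The orders of Newton's method and the secant method can be estimated empirically from the quotients $\ln\|x_{n+1}-\bar x\| / \ln\|x_n-\bar x\|$. A sketch for the model problem $f(x) = x^2 - 2$ (the starting values and the working precision are arbitrary choices):

```python
from decimal import Decimal, getcontext

getcontext().prec = 200
root = Decimal(2).sqrt()                      # reference value sqrt(2)

def f(x):
    return x * x - 2

# Newton's method: x_{k+1} = x_k - f(x_k)/f'(x_k), with f'(x) = 2x.
newton = [Decimal(2)]
for _ in range(6):
    x = newton[-1]
    newton.append(x - f(x) / (2 * x))

# Secant method: replaces the derivative by a difference quotient of the last two iterates.
secant = [Decimal(2), Decimal("1.5")]
for _ in range(10):
    a, b = secant[-2], secant[-1]
    secant.append(b - f(b) * (b - a) / (f(b) - f(a)))

def empirical_orders(xs):
    errs = [abs(x - root) for x in xs if x != root]
    # ln(e_{n+1}) / ln(e_n) tends to the order of convergence q
    return [round(float(e1.ln() / e0.ln()), 2) for e0, e1 in zip(errs, errs[1:])]

print("Newton:", empirical_orders(newton))    # later quotients approach 2
print("Secant:", empirical_orders(secant))    # later quotients approach (1+sqrt(5))/2 ~ 1.618
```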

Comparative convergence speed

In order to describe the speed of convergence with which a sequence $(x_n)$ converges to the limit $\bar x$, one compares the speed of convergence of the zero sequence $(x_n - \bar x)$ with other zero sequences whose speed of convergence is known, e.g. $(q^n)$ for $|q| < 1$, $(n^{-\alpha})$ for $\alpha > 0$, or $(1/n!)$.

Definition

A null sequence $(x_n)$ is said to converge at least as fast as a null sequence $(y_n)$ if $x_n = \mathcal{O}(y_n)$ holds for $n \to \infty$.

A sequence $(x_n)$ is called rapidly decreasing if it falls faster than any polynomial sequence $(n^{-k})$ with natural $k$, i.e. if $\lim_{n\to\infty} n^k x_n = 0$ for each $k$ (an example is the sequence $(e^{-n})$).
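
A short numerical illustration that $(e^{-n})$ is rapidly decreasing, i.e. that $n^k e^{-n} \to 0$ for every fixed $k$ (the values of $k$ and $n$ shown are arbitrary):

```python
import math

# The sequence (e^{-n}) is rapidly decreasing: n**k * e**(-n) -> 0 for every fixed k.
for k in (1, 5, 10):
    values = [n ** k * math.exp(-n) for n in (10, 50, 100)]
    print(f"k={k:2d}:", ["%.1e" % v for v in values])
```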

Of particular interest is the description of the order of convergence of numerical methods in normed spaces, i.e. where the sequence terms have the form $\|x_n - \bar x\|$.

In the sense of this definition, an iteration method is called linearly convergent if it converges as fast as the sequence $(q^n)$ for some $q \in (0,1)$. It is called quadratically convergent if it converges as fast as the sequence $(q^{2^n})$. Higher orders of convergence (cubic, superlinear) can be defined analogously.
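
A sketch of this comparison for the fixed-point iteration $x_{n+1} = e^{-x_n}$, which converges linearly: its errors are $\mathcal{O}(q^n)$ for any $q$ larger than the derivative's modulus at the fixed point, here e.g. $q = 0.6$ (the iteration and the value of $q$ are illustration choices):

```python
import math

# Fixed-point iteration x_{n+1} = exp(-x_n); its limit solves x = exp(-x).
x = [1.0]
for _ in range(60):
    x.append(math.exp(-x[-1]))
limit = x[-1]                        # accurate enough for this comparison

# Linear convergence in the sense above: the errors are O(q**n) for some q in (0,1).
# Here |d/dx exp(-x)| at the limit is about 0.567, so q = 0.6 works.
q = 0.6
ratios = [abs(xn - limit) / q ** n for n, xn in enumerate(x[:40])]
print(max(ratios))                   # stays bounded, so the errors are O(0.6**n)
```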

Arbitrarily slow convergence

Many important numerical methods converge arbitrarily slowly. Let, for example, $X$ be a Banach space, $(X_n)$ a sequence of $n$-dimensional subspaces, and $P$ a method that, for every solution $x$ of a given equation, delivers an approximate solution $x_n \in X_n$. The method is called arbitrarily slowly convergent if for every positive zero sequence $(a_n)$ there is a solution $x$ such that the zero sequence $(\|x - x_n\|)$ formed with the corresponding approximate solutions $x_n$ converges more slowly than the sequence $(a_n)$.

If, for example, in numerical integration only the continuity of the function to be integrated is required, but not a certain order of differentiability, then the method of numerical integration converges arbitrarily slowly. That is, for every monotone sequence of positive numbers converging to 0, however slowly, there is a continuous function $f$ such that the sequence of quadrature values converges to the integral more slowly than the given zero sequence. Other examples can be found in interpolation or in the solution of ill-posed problems.

Converse results

In several disciplines of analysis, knowledge about the structure of the problem under investigation can be gained from the speed of convergence. As an example, consider the Bernstein theorems from approximation theory: if a continuous function is approximated by polynomials of degree $n$ with the speed of convergence $\mathcal{O}(n^{-(k+\alpha)})$ for some $0 < \alpha < 1$, then it is $k$-times differentiable.


Literature

  • Martin Hanke-Bourgeois: Fundamentals of Numerical Mathematics and Scientific Computing. Teubner, Stuttgart 2002.
  • Arnold Schönhage: Approximation Theory. de Gruyter, Berlin 1971.
  • Eberhard Schock: Arbitrarily Slow Convergence, Uniform Convergence and Superconvergence of Galerkin-like Methods. IMA J. Numer. Analysis 5: 153-160 (1985).
  • Hans R. Schwarz, Norbert Köckler: Numerical Mathematics. 5th edition. Teubner, Stuttgart 2004.

References

  1. F. A. Potra: On Q-order and R-order of convergence. In: J. Optim. Theory Appl. 63, No. 3, 1989, pp. 415-431. doi:10.1007/BF00939805.