Consistency (numerics)

from Wikipedia, the free encyclopedia

In numerical mathematics, consistency (or the order of consistency) is a property of a numerical method which expresses that the algorithm actually solves the given problem, and not a different one.

The three central error-assessment concepts in numerics are condition, stability, and consistency. All three analyze how errors arise, but they differ in the error sources they consider. Conditioning assumes that the algorithm works exactly while the input data are perturbed. Stability compares the result of the numerical method with that of the exact method under perturbed input data.

Consistency, in turn, deals with the question of what happens when the exact solution is inserted into the numerical method. Typical examples are numerical differentiation and the solution of an initial value problem. The resulting error is studied as a function of a chosen grid or step size.

Definition

Given a continuous problem with exact solution $x$ and a numerical solution $x_h$ for a step size $h$. The method is called consistent if there is a function $\sigma(h)$ with $\lim_{h \to 0} \sigma(h) = 0$ such that the local error satisfies (the method is started with exact initial data):

$\|x_h - x\| \le \sigma(h).$

It has consistency order $p$ if $\sigma(h) = O(h^p)$.

This means that one has an error bound at any point in time (or space), depending on the chosen step size. In practice, methods only show this behavior if the step size is chosen sufficiently small (see stability).

Many such consistency estimates are proven with the help of Taylor's theorem, for the simple reason that many methods expand the first terms of the Taylor series (as a function of the step size $h$) in order to represent the solution at the next point in time $t + h$ in terms of the solution at the current point in time $t$:

$x(t + h) = x(t) + h \dot{x}(t) + \tfrac{h^2}{2} \ddot{x}(t) + \dots + \tfrac{h^p}{p!} x^{(p)}(t) + O(h^{p+1}).$

The constant in the consistency estimate then comes from the Taylor remainder term or from a supremum-norm estimate.
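This remainder behavior can be checked numerically. A minimal sketch (my own illustration, not from the source): for $f = \exp$, whose derivatives are all $\exp$ itself, a degree-$p$ Taylor polynomial leaves a remainder that shrinks like $h^{p+1}$:

```python
import math

def taylor_step(t, h, p):
    # Degree-p Taylor polynomial of exp around t, evaluated at t + h;
    # every derivative of exp is exp itself
    return sum(math.exp(t) * h**k / math.factorial(k) for k in range(p + 1))

t, p = 0.5, 2
for h in (1e-1, 1e-2):
    remainder = abs(math.exp(t + h) - taylor_step(t, h, p))
    print(f"h={h:.0e}  remainder={remainder:.2e}")
```

Dividing $h$ by 10 divides the remainder by roughly $10^{p+1} = 1000$, as Taylor's theorem predicts.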

Definition for one-step methods

We start from the initial value problem

$\dot{y}(t) = f(t, y(t)), \quad y(t_0) = y_0,$

where $y$ is the solution of the initial value problem and $y_h(t + h)$ was generated by applying the method to $y(t)$ with step size $h$. In addition, let $f$ be infinitely differentiable and locally Lipschitz continuous.

A one-step method is called consistent of order $p$ if for every such right-hand side $f$ the local error satisfies

$\|y(t + h) - y_h(t + h)\| = O(h^{p+1}),$

i.e. constants $C > 0$ and $h_0 > 0$ exist such that

$\|y(t + h) - y_h(t + h)\| \le C h^{p+1}$

for all $0 < h \le h_0$.

A method of consistency order $p$ thus makes a local error of order $O(h^{p+1})$ in every interval $[t, t + h]$.
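This relation between consistency order and local error order can be measured. The following sketch (an illustration under my own choice of method and test problem) takes one step of Heun's method, which has consistency order 2, on $\dot{y} = y$, $y(0) = 1$, and estimates the local error order from two step sizes:

```python
import math

def heun_step(f, t, y, h):
    # Heun's method (explicit trapezoidal rule), a one-step method
    # of consistency order 2
    k1 = f(t, y)
    k2 = f(t + h, y + h * k1)
    return y + h / 2 * (k1 + k2)

def f(t, y):
    # Test problem y' = y with exact solution exp(t)
    return y

# Local error after one step from exact initial data y(0) = 1
errors = [abs(math.exp(h) - heun_step(f, 0.0, 1.0, h)) for h in (1e-1, 1e-2)]

# Error ratio per factor 10 in h estimates the local error order p + 1
p_plus_1 = math.log10(errors[0] / errors[1])
print(f"estimated local error order: {p_plus_1:.2f}")
```

The estimate comes out near 3, i.e. local error order $p + 1 = 3$ and hence consistency order $p = 2$.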

Examples

Differentiation

One way to calculate the derivative of a function $f$ at a point $x$ is to use difference quotients, provided $f$ can be differentiated sufficiently often. We consider two methods:

the simple (forward) difference quotient

$D_h f(x) = \frac{f(x + h) - f(x)}{h}$

and the central difference quotient

$D_h^c f(x) = \frac{f(x + h) - f(x - h)}{2h}.$

The Taylor expansions

$f(x + h) = f(x) + h f'(x) + \tfrac{h^2}{2} f''(x) + O(h^3)$

and

$f(x - h) = f(x) - h f'(x) + \tfrac{h^2}{2} f''(x) + O(h^3)$

then yield for the simple difference quotient

$D_h f(x) = f'(x) + \tfrac{h}{2} f''(x) + O(h^2)$

and for the central difference quotient

$D_h^c f(x) = f'(x) + O(h^2).$

By rearranging and applying a norm in the image space of $f$, we obtain

$\|D_h f(x) - f'(x)\| \le C h$

and

$\|D_h^c f(x) - f'(x)\| \le C h^2,$

i.e. consistency order one and two, respectively. With a similar computational effort (two function evaluations and essentially one division each), the central difference quotient thus achieves a higher order of consistency.
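A short numerical experiment makes the difference in order visible. The sketch below (my own illustration, using $f = \exp$ at $x = 1$) compares the errors of both quotients as $h$ shrinks:

```python
import math

def forward_diff(f, x, h):
    # Simple (forward) difference quotient: consistency order 1
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # Central difference quotient: consistency order 2
    return (f(x + h) - f(x - h)) / (2 * h)

# Errors at x = 1 for f = exp, whose exact derivative there is e
x, exact = 1.0, math.e
for h in (1e-1, 1e-2, 1e-3):
    e_fwd = abs(forward_diff(math.exp, x, h) - exact)
    e_cen = abs(central_diff(math.exp, x, h) - exact)
    print(f"h={h:.0e}  forward={e_fwd:.2e}  central={e_cen:.2e}")
```

Dividing $h$ by 10 divides the forward error by about 10 but the central error by about 100, matching consistency orders 1 and 2. (For much smaller $h$, floating-point rounding eventually dominates.)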

Ordinary differential equations

For the discrete solution of an initial value problem, one can use one-step methods, e.g. Runge-Kutta methods. Such a method has consistency order $p$ if it has local error order $p + 1$.

The simplest one-step method is the explicit Euler method (Euler's polygon method). Here the exact solution $y$ of a differential equation

$\dot{y}(t) = f(t, y(t))$

with

$y(t_0) = y_0$

is numerically approximated by the piecewise linear function $y_h$ with

$y_h(t_0) = y_0$

and

$y_h(t_{k+1}) = y_h(t_k) + h f(t_k, y_h(t_k)), \quad t_{k+1} = t_k + h.$

Here, too, one can use the Taylor expansion

$y(t + h) = y(t) + h \dot{y}(t) + O(h^2) = y(t) + h f(t, y(t)) + O(h^2)$

to obtain local error order 2 and thus consistency order 1.
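The claimed local error order 2 can be checked directly. A minimal sketch (my own illustration), again on the test problem $\dot{y} = y$, $y(0) = 1$ with exact solution $e^t$:

```python
import math

def euler_step(f, t, y, h):
    # One explicit Euler step: y_h(t + h) = y + h * f(t, y)
    return y + h * f(t, y)

def f(t, y):
    # Right-hand side of y' = y
    return y

# Local error after one step from exact initial data y(0) = 1
for h in (1e-1, 1e-2, 1e-3):
    local_error = abs(math.exp(h) - euler_step(f, 0.0, 1.0, h))
    print(f"h={h:.0e}  local error={local_error:.2e}")
```

Dividing $h$ by 10 divides the local error by roughly 100: the local error behaves like $h^2$, so the explicit Euler method has consistency order 1.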
