Taylor method

The Taylor method is a one-step method in numerical analysis. It is a way of constructing higher-order difference formulas using the Taylor expansion.

Derivation

We consider the scalar case, based on an initial value problem (IVP) of the form

\[ y'(t) = f(t, y(t)), \qquad y(t_0) = y_0, \]

and the Taylor formula

\[ y(t + h) = \sum_{j=0}^{m} \frac{h^j}{j!}\, y^{(j)}(t) + O\!\left(h^{m+1}\right). \]

Since y satisfies the differential equation, the following holds:

\[ y^{(j)}(t) = \frac{d^{\,j-1}}{dt^{\,j-1}} f\bigl(t, y(t)\bigr), \qquad j = 1, \dots, m, \]

where the total derivatives of f are taken along the solution.
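For example, the chain rule gives for the first two of these total derivatives

\[ \frac{d}{dt} f = f_t + f_y\, f, \qquad \frac{d^2}{dt^2} f = f_{tt} + 2 f_{ty}\, f + f_{yy}\, f^2 + f_y \left( f_t + f_y\, f \right), \]

so every derivative of y needed by the method can be computed from partial derivatives of f alone.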

The m-stage Taylor method is then

\[ y_{k+1} = y_k + \sum_{j=1}^{m} \frac{h^j}{j!}\, \frac{d^{\,j-1}}{dt^{\,j-1}} f(t_k, y_k), \qquad k = 0, 1, 2, \dots \]
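As a concrete illustration, the following is a minimal Python sketch of the case m = 2; the model problem y' = -2ty (with exact solution y(t) = exp(-t^2)) and all names are assumptions chosen for this example, not part of the method itself:

    import math

    def taylor2_step(f, df_dt, t, y, h):
        # One step of the second-order Taylor method:
        # y_{k+1} = y_k + h*f(t_k, y_k) + (h**2/2) * (d/dt) f(t_k, y_k)
        return y + h * f(t, y) + 0.5 * h**2 * df_dt(t, y)

    # Model problem (an assumption for this example): y' = -2*t*y, y(0) = 1,
    # with exact solution y(t) = exp(-t**2).
    f = lambda t, y: -2.0 * t * y
    # Total derivative along the solution: f_t + f_y*f = (4*t**2 - 2)*y.
    df_dt = lambda t, y: (4.0 * t**2 - 2.0) * y

    t, y, h = 0.0, 1.0, 0.1
    while t < 1.0 - 1e-12:                  # integrate from t = 0 to t = 1
        y = taylor2_step(f, df_dt, t, y, h)
        t += h
    print(y, math.exp(-1.0))                # numerical vs. exact value at t = 1

Note that, unlike a Runge-Kutta method, the user must supply the total derivative of f by hand (or via symbolic/automatic differentiation), which is the main practical cost of the approach.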

The m-stage Taylor method has consistency order m.
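The order can be checked empirically: halving the step size should reduce the error by a factor of about 2^m. A small self-contained sketch (again using the assumed model problem y' = -2ty from above):

    import math

    def solve(h, t_end=1.0):
        # Second-order Taylor method for y' = -2*t*y, y(0) = 1;
        # the total derivative of f along the solution is (4*t**2 - 2)*y.
        t, y = 0.0, 1.0
        while t < t_end - 1e-12:
            y += h * (-2.0 * t * y) + 0.5 * h**2 * (4.0 * t**2 - 2.0) * y
            t += h
        return y

    for h in (0.1, 0.05, 0.025):
        print(h, abs(solve(h) - math.exp(-1.0)))
    # Each halving of h reduces the error by roughly 2**2 = 4,
    # matching consistency order 2.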

Numerical stability

We apply the test equation \( y' = \lambda y \) to the method; since for \( f(t, y) = \lambda y \) every total derivative satisfies \( \frac{d^{\,j-1}}{dt^{\,j-1}} f = \lambda^j y \), one step gives

\[ y_{k+1} = \left( \sum_{j=0}^{m} \frac{(h\lambda)^j}{j!} \right) y_k. \]

The gain factor is accordingly

\[ R(z) = \sum_{j=0}^{m} \frac{z^j}{j!}, \qquad z = h\lambda, \]

i.e. the exponential series truncated after the m-th power.
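For instance, for m = 1 (which reproduces the explicit Euler method) and m = 2 the gain factor becomes

\[ R(z) = 1 + z \qquad \text{and} \qquad R(z) = 1 + z + \tfrac{z^2}{2}, \]

and the step size must be chosen such that \( |R(h\lambda)| \le 1 \); for m = 1 and real \( \lambda < 0 \) this yields the familiar restriction \( -2 \le h\lambda \le 0 \).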

References

  1. ^ Rolf Rannacher: Numerics 1. Numerics of ordinary differential equations. Heidelberg 2017, p. 46 ff.