SOR method


The "Successive Over-Relaxation" method ( over relaxation method ) or SOR method is an algorithm in numerical mathematics for the approximate solution of linear systems of equations . It is, as the Gauss-Seidel method , and the Jacobi method , a special splitting method of the mold with .

Description of the method

A square linear system of equations with $n$ equations and $n$ unknowns is given:

$A x = b, \qquad A \in \mathbb{R}^{n \times n}, \quad b, x \in \mathbb{R}^n.$

The SOR method solves this system, starting from a start vector $x^{(0)}$, according to the iteration rule

$x_i^{(k+1)} = (1-\omega)\,x_i^{(k)} + \frac{\omega}{a_{ii}}\Bigl(b_i - \sum_{j<i} a_{ij} x_j^{(k+1)} - \sum_{j>i} a_{ij} x_j^{(k)}\Bigr), \qquad i = 1, \dots, n.$

The real over-relaxation parameter $\omega$ ensures that the method converges faster than the Gauss-Seidel method, which is obtained as the special case $\omega = 1$ of this formula.
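As an illustration (a hypothetical example, not part of the original article): for $A = \begin{pmatrix} 4 & 1 \\ 1 & 3 \end{pmatrix}$, $b = (1, 2)^T$, start vector $x^{(0)} = (0, 0)^T$ and $\omega = 1.25$, the first sweep gives

$x_1^{(1)} = (1 - 1.25)\cdot 0 + \tfrac{1.25}{4}\,(1 - 1\cdot 0) = 0.3125,$
$x_2^{(1)} = (1 - 1.25)\cdot 0 + \tfrac{1.25}{3}\,(2 - 1\cdot 0.3125) \approx 0.7031.$

Note that the update of $x_2$ already uses the new value of $x_1$; continuing the iteration, the iterates approach the exact solution $(1/11,\ 7/11)^T \approx (0.0909,\ 0.6364)^T$.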

Algorithm

A possible algorithm sketch with termination condition $\epsilon$ is:

choose $x^{(0)} \in \mathbb{R}^n$ and $\omega$
repeat
    for $i = 1$ to $n$
        $x_i^{(k+1)} := (1-\omega)\,x_i^{(k)} + \dfrac{\omega}{a_{ii}}\Bigl(b_i - \sum_{j<i} a_{ij} x_j^{(k+1)} - \sum_{j>i} a_{ij} x_j^{(k)}\Bigr)$
    next $i$
until $\|x^{(k+1)} - x^{(k)}\| \le \epsilon$

An error tolerance $\epsilon$ is assumed as an input to the algorithm; the approximate solution $x$ is the vector-valued return value. The termination criterion measures the size of the last change in the iterate.

For sparse matrices the computational effort per iteration is reduced significantly, since the sums in the update only run over the non-zero entries of each row.
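The following Python sketch (a minimal illustration with NumPy; the function name sor, the dense-matrix loops and the stopping criterion are choices made here, not part of the original article) implements the component-wise update above:

import numpy as np

def sor(A, b, omega, x0=None, eps=1e-10, max_iter=10_000):
    # Approximate solution of A x = b with the SOR iteration;
    # stops when the change of the iterate falls below eps.
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # already updated components (j < i) and old components (j > i)
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x_old[i] + omega / A[i, i] * (b[i] - sigma)
        if np.linalg.norm(x - x_old) <= eps:
            break
    return x

# Example: the 2x2 system from above
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(sor(A, b, omega=1.25))   # approximately [0.0909, 0.6364]

For a sparse matrix one would replace the inner sums by loops over the non-zero entries of row $i$, for example using a compressed sparse row format.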

Derivation

The SOR method results from relaxation of the Gauss-Seidel method:

$x_i^{(k+1)} = (1-\omega)\,x_i^{(k)} + \omega\, x_{i,\mathrm{GS}}^{(k+1)},$

where $x_{i,\mathrm{GS}}^{(k+1)}$ denotes the Gauss-Seidel update of the $i$-th component;

for $\omega = 1$ one recovers the Gauss-Seidel method. To analyze the method, the formulation as a splitting method is suitable. The matrix $A$ is written as the sum of a diagonal matrix $D$ and two strictly triangular matrices $L$ and $U$:

$A = D + L + U$

with $D = \operatorname{diag}(a_{11}, \dots, a_{nn})$, $L$ the strictly lower triangular part and $U$ the strictly upper triangular part of $A$.

The system of linear equations is then equivalent to

$(D + \omega L)\,x = \omega b - \bigl(\omega U + (\omega - 1) D\bigr)\,x$

with an $\omega \neq 0$. The iteration matrix is then

$H(\omega) = -(D + \omega L)^{-1}\bigl(\omega U + (\omega - 1) D\bigr)$

and the method is consistent and converges if and only if the spectral radius satisfies $\rho(H(\omega)) < 1$.

The above formulation for the component-wise calculation of $x^{(k+1)}$ can be obtained from this matrix representation by exploiting the triangular form of the matrix $D + \omega L$: it can be inverted directly by forward substitution.
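As a sketch of the matrix form (again an illustration; sor_matrix_step is a name chosen here, and the dense splitting is computed with NumPy/SciPy), one SOR step can be written with a triangular solve performing the forward substitution:

import numpy as np
from scipy.linalg import solve_triangular

def sor_matrix_step(A, b, x, omega):
    # One SOR step in matrix form:
    # (D + omega*L) x_new = omega*b - (omega*U + (omega-1)*D) x
    D = np.diag(np.diag(A))
    L = np.tril(A, k=-1)   # strictly lower triangular part of A
    U = np.triu(A, k=1)    # strictly upper triangular part of A
    rhs = omega * b - (omega * U + (omega - 1) * D) @ x
    # D + omega*L is lower triangular, so it is inverted by forward substitution
    return solve_triangular(D + omega * L, rhs, lower=True)

Iterating this step reproduces the component-wise iteration given in the algorithm section.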

Convergence

It can be shown that for a symmetric positive definite matrix $A$ the method converges for every $\omega \in (0, 2)$. In order to accelerate convergence compared with the Gauss-Seidel method, heuristic values of $\omega$ between $1$ and $2$ are used. The optimal choice depends on the coefficient matrix $A$. Values $\omega < 1$ (under-relaxation) can be chosen to stabilize an iteration that is otherwise slightly divergent.

Kahan's theorem (1958) shows that the method does not converge for $\omega$ outside the interval $(0, 2)$.

For certain classes of matrices (for example consistently ordered matrices) it can be shown that the optimal relaxation parameter is given by

$\omega_{\mathrm{opt}} = \frac{2}{1 + \sqrt{1 - \rho(C_J)^2}},$

where $\rho(C_J)$ is the spectral radius of the iteration matrix of the Jacobi method.
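A small numerical sketch (an illustration; the tridiagonal model matrix of the discrete 1-D Laplacian is a hypothetical choice, for which the formula applies) that estimates $\rho(C_J)$ and the resulting $\omega_{\mathrm{opt}}$:

import numpy as np

n = 20
# hypothetical model problem: tridiagonal matrix of the discrete 1-D Laplacian
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

D_inv = np.diag(1.0 / np.diag(A))
C_J = np.eye(n) - D_inv @ A                   # iteration matrix of the Jacobi method
rho_J = max(abs(np.linalg.eigvals(C_J)))      # spectral radius
omega_opt = 2 / (1 + np.sqrt(1 - rho_J ** 2))
print(rho_J, omega_opt)                       # approx. 0.9888 and 1.74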

Literature

  • Andreas Meister: Numerics of Linear Systems of Equations. 3rd edition. Vieweg, 2007, ISBN 3-8348-0431-2
