Orthogonal matrix

from Wikipedia, the free encyclopedia
By multiplying by an orthogonal matrix Q, vectors can be rotated (left) or mirrored (right). The lengths of the vectors and the angles between them are preserved.

In linear algebra, an orthogonal matrix is a square real matrix whose row and column vectors are orthonormal with respect to the standard scalar product. The inverse of an orthogonal matrix is therefore equal to its transpose.

Orthogonal matrices represent congruence mappings in Euclidean space, i.e. rotations, reflections, and combinations thereof. Every orthogonal map between two finite-dimensional scalar product spaces can be represented by an orthogonal matrix after a choice of orthonormal bases. The set of orthogonal matrices of a fixed size, together with matrix multiplication as the operation, forms the orthogonal group.

Orthogonal matrices are used, for example, in the numerical solution of linear systems of equations and of eigenvalue problems. The analogous notion for complex matrices is the unitary matrix.

Definition

A real square matrix $Q \in \mathbb{R}^{n \times n}$ is called orthogonal if the product with its transpose yields the identity matrix, that is, if

$Q^{\mathsf T} Q = I$

holds. If the column vectors of the matrix are denoted $q_1, \ldots, q_n$, this condition is equivalent to requiring that the standard scalar product of two column vectors satisfies

$q_i^{\mathsf T} q_j = \delta_{ij}$,

where $\delta_{ij}$ is the Kronecker delta. The column vectors of an orthogonal matrix thus form an orthonormal basis of the coordinate space $\mathbb{R}^n$. The same applies to the row vectors, because with $Q$ its transpose $Q^{\mathsf T}$ is also orthogonal, that is,

$Q Q^{\mathsf T} = I$.

Although the name might suggest it, a matrix is not orthogonal merely because its row or column vectors are pairwise orthogonal; they must in addition be normalized, i.e. have length one.
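For illustration, the definition can be checked numerically. The following is a minimal sketch using NumPy; the matrix Q is an arbitrarily chosen example, not one fixed by the text:

```python
import numpy as np

# A rotation by 90 degrees, a standard example of an orthogonal matrix.
Q = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Orthogonality: Q^T Q must be the identity matrix.
print(np.allclose(Q.T @ Q, np.eye(2)))            # True

# Equivalently: the columns are pairwise orthogonal AND have length one.
print(np.isclose(Q[:, 0] @ Q[:, 1], 0.0))         # orthogonal columns
print(np.allclose(np.linalg.norm(Q, axis=0), 1))  # unit-length columns
```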

Examples

Concrete examples

  • The matrix
$Q = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$
is orthogonal, because
$Q^{\mathsf T} Q = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.
  • The matrix
$Q = \frac{1}{5} \begin{pmatrix} 3 & 4 \\ 4 & -3 \end{pmatrix}$
is also orthogonal, because
$Q^{\mathsf T} Q = \frac{1}{25} \begin{pmatrix} 3 & 4 \\ 4 & -3 \end{pmatrix} \begin{pmatrix} 3 & 4 \\ 4 & -3 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.

General examples

  • Permutation matrices, i.e. matrices in which exactly one entry per row and column is equal to one and all other entries are zero, are orthogonal. If $P_\sigma$ denotes the permutation matrix belonging to a permutation $\sigma$, then
$P_\sigma^{\mathsf T} P_\sigma = P_{\sigma^{-1}} P_\sigma = I$,
because the transposed permutation matrix is equal to the permutation matrix of the inverse permutation $\sigma^{-1}$, which undoes all the swaps, and the product of permutation matrices corresponds to the composition of the permutations. The signed permutation matrices, in which exactly one entry in each row and column is plus or minus one and all other entries are zero, are exactly the integer orthogonal matrices.
  • If
$Q = \begin{pmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{pmatrix}$
is the rotation matrix of a rotation through an angle $\alpha$ that fixes the origin, then by the "trigonometric Pythagoras" $\sin^2\alpha + \cos^2\alpha = 1$,
$Q^{\mathsf T} Q = \begin{pmatrix} \cos^2\alpha + \sin^2\alpha & 0 \\ 0 & \sin^2\alpha + \cos^2\alpha \end{pmatrix} = I$.
More generally, rotation matrices that describe a rotation in an arbitrary plane through the origin of $n$-dimensional space are also orthogonal.
  • If
$Q = I - 2 n n^{\mathsf T}$
is the reflection matrix of a reflection in a line through the origin with unit normal vector $n$, then
$Q^{\mathsf T} Q = (I - 2 n n^{\mathsf T})(I - 2 n n^{\mathsf T}) = I - 4 n n^{\mathsf T} + 4 n (n^{\mathsf T} n) n^{\mathsf T} = I$,
because reflection matrices are symmetric by definition and $n^{\mathsf T} n = 1$ holds for a unit vector $n$. More generally, matrices that describe reflections in an arbitrary subspace of $n$-dimensional space (for example a hyperplane) are also orthogonal. One instance of each family is verified in the sketch below.
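The following sketch (NumPy; the concrete permutation, angle, and normal vector are illustrative choices) builds one instance of each of the three families and verifies the defining condition:

```python
import numpy as np

# Permutation matrix of the cyclic permutation 0 -> 1 -> 2 -> 0.
P = np.eye(3)[[1, 2, 0]]

# Rotation through the angle a, fixing the origin.
a = 0.7
R = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])

# Reflection I - 2 n n^T for a unit normal vector n (Householder form).
n = np.array([1.0, 2.0, 2.0]) / 3.0   # ||n||_2 = 1
S = np.eye(3) - 2.0 * np.outer(n, n)

for Q in (P, R, S):
    assert np.allclose(Q.T @ Q, np.eye(Q.shape[0]))
```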

Properties

Inverse

An orthogonal matrix $Q$ is always regular (invertible) due to the linear independence of its row and column vectors. The inverse of an orthogonal matrix is equal to its transpose, that is,

$Q^{-1} = Q^{\mathsf T}$.

The inverse of a matrix $A$ is precisely the matrix $A^{-1}$ for which

$A A^{-1} = A^{-1} A = I$

holds. From the second equation it also follows that the transpose of an orthogonal matrix is orthogonal. The converse is also true: every matrix whose transpose equals its inverse is orthogonal, because then

$Q^{\mathsf T} Q = Q^{-1} Q = I$.
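In code, this means the explicit inverse never needs to be computed. A short NumPy check, with an arbitrary rotation matrix as test case:

```python
import numpy as np

a = 1.2
Q = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])

# The (expensive) inverse coincides with the (free) transpose.
print(np.allclose(np.linalg.inv(Q), Q.T))  # True
```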

Preservation of lengths and angles

If a vector $x$ is multiplied by an orthogonal matrix $Q$, the length (Euclidean norm) of the vector does not change, that is,

$\| Q x \|_2 = \sqrt{(Q x)^{\mathsf T} (Q x)} = \sqrt{x^{\mathsf T} Q^{\mathsf T} Q x} = \sqrt{x^{\mathsf T} x} = \| x \|_2$.

Furthermore, the standard scalar product of two vectors $x$ and $y$ is invariant under multiplication by an orthogonal matrix $Q$, i.e.

$(Q x)^{\mathsf T} (Q y) = x^{\mathsf T} Q^{\mathsf T} Q y = x^{\mathsf T} y$.

The angle between the two vectors is therefore preserved as well. Both properties follow directly from the shifting property of the standard scalar product, $\langle Q x, y \rangle = \langle x, Q^{\mathsf T} y \rangle$. Because of this preservation of lengths and angles, the linear mapping

$f \colon \mathbb{R}^n \to \mathbb{R}^n, \; x \mapsto Q x$

represents a congruence mapping (an isometry) of Euclidean space. Conversely, the mapping matrix, with respect to the standard basis, of every length- and angle-preserving linear mapping of Euclidean space is orthogonal. By the polarization formula, every length-preserving mapping is automatically angle-preserving as well.
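Both invariances are easy to observe numerically; a sketch with randomly chosen vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.9
Q = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])
x = rng.standard_normal(2)
y = rng.standard_normal(2)

# Lengths are preserved ...
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # True
# ... and so is the scalar product (and hence the angle between x and y).
print(np.isclose((Q @ x) @ (Q @ y), x @ y))                  # True
```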

Determinant

For the absolute value of the determinant of an orthogonal matrix,

$| \det Q | = 1$,

which follows with the help of the multiplication theorem for determinants via

$1 = \det I = \det( Q^{\mathsf T} Q ) = \det Q^{\mathsf T} \cdot \det Q = ( \det Q )^2$.

Thus the determinant of an orthogonal matrix can only take the value one or minus one. However, there are also non-orthogonal matrices whose determinant is plus or minus one, for example unimodular matrices. Orthogonal matrices whose determinant is one correspond to rotations; such a matrix is called proper orthogonal. Orthogonal matrices whose determinant is minus one represent rotoreflections; such a matrix is called improper orthogonal.
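A numerical illustration of the two cases (rotation versus reflection, both matrices chosen for the example):

```python
import numpy as np

a = 0.4
rotation   = np.array([[np.cos(a), -np.sin(a)],
                       [np.sin(a),  np.cos(a)]])
reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])

print(np.linalg.det(rotation))    # 1.0  -> proper orthogonal
print(np.linalg.det(reflection))  # -1.0 -> improper orthogonal
```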

Eigenvalues

The eigenvalues of an orthogonal matrix $Q$ are not necessarily all real. However, they have complex absolute value one, so they are of the form

$\lambda = e^{i t} = \cos t + i \sin t$

with $t \in \mathbb{R}$. If $x$ is an eigenvector belonging to $\lambda$, then, by the preservation of length and the absolute homogeneity of a norm,

$\| x \|_2 = \| Q x \|_2 = \| \lambda x \|_2 = | \lambda | \, \| x \|_2$

holds, and therefore $| \lambda | = 1$. An orthogonal matrix thus has at most the real eigenvalues $1$ and $-1$. The complex eigenvalues always occur in complex-conjugate pairs, that is, with $\lambda$ its conjugate $\bar{\lambda}$ is also an eigenvalue, because, $Q$ being real,

$Q \bar{x} = \overline{Q x} = \overline{\lambda x} = \bar{\lambda} \bar{x}$.

Accordingly, an orthogonal matrix of odd dimension has at least one real eigenvalue (see also the football theorem).
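A sketch illustrating this eigenvalue structure on a 3x3 rotation about the z-axis (an illustrative choice):

```python
import numpy as np

a = 0.8
Q = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])

lam = np.linalg.eigvals(Q)
print(np.abs(lam))  # all 1: the eigenvalues lie on the complex unit circle
print(lam)          # exp(+ia), exp(-ia) as a conjugate pair, plus the real eigenvalue 1
```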

Diagonalisability

An orthogonal matrix $Q$ is normal, that is, it satisfies

$Q^{\mathsf T} Q = Q Q^{\mathsf T}$,

and is thus unitarily diagonalizable over the complex numbers. According to the spectral theorem there is a unitary matrix $U$ such that

$U^{\mathsf H} Q U = D$

holds, where $D$ is a diagonal matrix containing the eigenvalues of $Q$ and $U^{\mathsf H}$ is the conjugate transpose of $U$. The column vectors of $U$ are then pairwise orthonormal eigenvectors of $Q$. This means that the eigenspaces of an orthogonal matrix are also pairwise orthogonal.

In general, however, an orthogonal matrix is not diagonalizable over the real numbers. However, there exists an orthogonal matrix $S$ such that

$S^{\mathsf T} Q S = \begin{pmatrix} R_1 & & & & \\ & \ddots & & & \\ & & R_k & & \\ & & & \pm 1 & \\ & & & & \ddots \end{pmatrix}$

is a block diagonal matrix in which the individual blocks are either $2 \times 2$ rotation matrices

$R_i = \begin{pmatrix} \cos t_i & -\sin t_i \\ \sin t_i & \cos t_i \end{pmatrix}$

or consist of the single number $1$ or $-1$. This representation is also called the normal form of an orthogonal matrix.
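This normal form can be observed numerically with the real Schur decomposition, assuming SciPy is available; for a normal matrix the quasi-triangular Schur factor is block diagonal up to rounding:

```python
import numpy as np
from scipy.linalg import schur

# A 3x3 rotation about the z-axis (orthogonal, hence normal).
a = 0.8
Q = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])

# Real Schur form Q = S T S^T with S orthogonal; for orthogonal Q the
# factor T consists of 2x2 rotation blocks and diagonal entries +/-1.
T, S = schur(Q, output='real')
print(np.round(T, 10))
```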

Norms

The spectral norm of an orthogonal matrix $Q$ is

$\| Q \|_2 = \max_{\| x \|_2 = 1} \| Q x \|_2 = 1$.

For the Frobenius norm, with the Frobenius scalar product, correspondingly

$\| Q \|_F = \sqrt{ \operatorname{tr}( Q^{\mathsf T} Q ) } = \sqrt{ \operatorname{tr} I } = \sqrt{n}$.

Multiplication by an orthogonal matrix preserves both the spectral norm and the Frobenius norm of a given matrix $A$, because

$\| Q A \|_2 = \max_{\| x \|_2 = 1} \| Q A x \|_2 = \max_{\| x \|_2 = 1} \| A x \|_2 = \| A \|_2$

and

$\| Q A \|_F = \sqrt{ \operatorname{tr}( A^{\mathsf T} Q^{\mathsf T} Q A ) } = \sqrt{ \operatorname{tr}( A^{\mathsf T} A ) } = \| A \|_F$.

This means that the condition number of a matrix with respect to these norms is preserved under multiplication by an orthogonal matrix.
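A numerical check of these norm identities, with random matrices as test data:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # a random orthogonal matrix

print(np.isclose(np.linalg.norm(Q, 2), 1.0))      # spectral norm 1
print(np.isclose(np.linalg.norm(Q, 'fro'), 2.0))  # sqrt(n) for n = 4
# Multiplying by Q changes neither the norms nor the condition number of A.
print(np.isclose(np.linalg.norm(Q @ A, 2), np.linalg.norm(A, 2)))
print(np.isclose(np.linalg.cond(Q @ A), np.linalg.cond(A)))
```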

Orthogonal matrices as a group

The set of regular matrices of fixed size $n$, together with matrix multiplication as the operation, forms a group, the general linear group $\mathrm{GL}(n, \mathbb{R})$, with the identity matrix $I$ as the neutral element. The orthogonal matrices form a subgroup of the general linear group, the orthogonal group $\mathrm{O}(n)$. The product of two orthogonal matrices $Q_1$ and $Q_2$ is again orthogonal, because

$( Q_1 Q_2 )^{\mathsf T} ( Q_1 Q_2 ) = Q_2^{\mathsf T} Q_1^{\mathsf T} Q_1 Q_2 = Q_2^{\mathsf T} Q_2 = I$.

Furthermore, the inverse of an orthogonal matrix $Q$ is also orthogonal, because

$( Q^{-1} )^{\mathsf T} Q^{-1} = ( Q^{\mathsf T} )^{\mathsf T} Q^{\mathsf T} = Q Q^{\mathsf T} = I$.

The orthogonal matrices with determinant one, i.e. the rotation matrices, in turn form a subgroup of the orthogonal group, the rotation group (or special orthogonal group) $\mathrm{SO}(n)$. This is a Lie group, i.e. the group operations are compatible with differentiation, and elements of $\mathrm{SO}(n)$ can be represented as matrix exponentials of elements of the associated Lie algebra. The orthogonal matrices with determinant minus one, i.e. the rotoreflections, do not form a subgroup of the orthogonal group but only a coset, because they lack the neutral element.
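The closure properties can be checked numerically; random orthogonal matrices are conveniently generated via the QR decomposition (a common construction, used here as an illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

def random_orthogonal(n):
    # The Q factor of a random square matrix is orthogonal.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return Q

Q1, Q2 = random_orthogonal(3), random_orthogonal(3)

# Closure: the product of orthogonal matrices is orthogonal ...
assert np.allclose((Q1 @ Q2).T @ (Q1 @ Q2), np.eye(3))
# ... and so is the inverse, which equals the transpose.
Qinv = Q1.T
assert np.allclose(Qinv.T @ Qinv, np.eye(3))
```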

Use

Systems of linear equations

Linear systems of equations of the form

$Q x = b$

with an orthogonal matrix $Q$ and a right-hand side $b$ can be solved numerically efficiently via

$x = Q^{\mathsf T} b$.

Determining the solution therefore requires only a single matrix-vector multiplication, which can be carried out with effort of order $O(n^2)$. In comparison, solving general linear systems, for example by means of Gaussian elimination, requires effort of order $O(n^3)$. This advantage is exploited, for example, in the (real) discrete Fourier transform and in the discrete cosine transform.
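The shortcut is directly visible in code; a sketch comparing the transpose-based solution with a general solver (sizes and data chosen for the example):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # an orthogonal test matrix
b = rng.standard_normal(n)

x_fast = Q.T @ b                 # O(n^2): one matrix-vector product
x_slow = np.linalg.solve(Q, b)   # O(n^3): general Gaussian elimination
print(np.allclose(x_fast, x_slow))  # True
```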

Matrix decompositions

Another application of orthogonal matrices is the QR decomposition of a given matrix $A$ as a product

$A = Q R$

of an orthogonal matrix $Q$ and an upper triangular matrix $R$. The matrix $Q$ can be constructed with Givens rotations, which correspond to rotations, or Householder transformations, which correspond to reflections. QR decompositions are used in numerical linear algebra to solve ill-conditioned, overdetermined, or underdetermined linear systems of equations. Another field of application is the computation of eigenvalue problems with the QR algorithm.
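A sketch of the typical use of the QR decomposition for an overdetermined system, here via NumPy's built-in decomposition (illustrative data):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 3))   # overdetermined: 6 equations, 3 unknowns
b = rng.standard_normal(6)

Q, R = np.linalg.qr(A)            # A = QR with orthonormal columns in Q
x = np.linalg.solve(R, Q.T @ b)   # solve the small triangular system R x = Q^T b

# Agrees with the least-squares solution of A x = b.
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_ref))  # True
```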

With the help of the singular value decomposition, every real matrix $A$ can also be written as a product

$A = U \Sigma V^{\mathsf T}$

of an orthogonal matrix $U$, a diagonal matrix $\Sigma$, and the transpose of another orthogonal matrix $V$. The diagonal entries of the matrix $\Sigma$ are the singular values of $A$. The singular value decomposition is used, for example, in geometry for the principal axis transformation of quadrics and in statistics for the principal component analysis of multivariate data sets.
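A short demonstration of the decomposition and its orthogonal factors, on a random test matrix:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# U and V have orthonormal columns; s holds the singular values of A.
print(np.allclose(U.T @ U, np.eye(3)))
print(np.allclose(Vt @ Vt.T, np.eye(3)))
print(np.allclose(U @ np.diag(s) @ Vt, A))  # A = U Sigma V^T
```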

By means of the polar decomposition, a square matrix $A$ can also be written as a product

$A = Q P$

of an orthogonal matrix $Q$ and a positive semidefinite symmetric matrix $P$.
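The polar decomposition can be computed from the singular value decomposition; the following is a sketch in plain NumPy (SciPy also provides scipy.linalg.polar):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((3, 3))

# From A = U Sigma V^T one obtains A = Q P with
# Q = U V^T orthogonal and P = V Sigma V^T symmetric positive semidefinite.
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt
P = Vt.T @ np.diag(s) @ Vt

assert np.allclose(Q.T @ Q, np.eye(3))          # Q is orthogonal
assert np.allclose(P, P.T)                      # P is symmetric
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)  # P is positive semidefinite
assert np.allclose(Q @ P, A)                    # A = Q P
```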

Orthogonal mappings

If $V$ is an $n$-dimensional real scalar product space, then every linear mapping $f \colon V \to V$ can, after choosing an orthonormal basis $\{ e_1, \ldots, e_n \}$ of $V$, be represented by its mapping matrix

$A_f = ( a_{ij} ) \in \mathbb{R}^{n \times n}$, where $f( e_j ) = \sum_{i=1}^n a_{ij} e_i$ for $j = 1, \ldots, n$.

The mapping matrix $A_f$ is orthogonal if and only if $f$ is an orthogonal mapping. This follows from

$\langle f(v), f(w) \rangle = ( A_f x )^{\mathsf T} ( A_f y ) = x^{\mathsf T} y = \langle v, w \rangle$,

where $v = x_1 e_1 + \cdots + x_n e_n$ and $w = y_1 e_1 + \cdots + y_n e_n$; since the basis is orthonormal, the scalar product in $V$ coincides with the standard scalar product of the coordinate vectors $x$ and $y$.
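The coordinate computation can be mirrored numerically: pick an orthonormal basis, represent an orthogonal mapping in it, and check that the mapping matrix is orthogonal (all choices below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# Columns of B form an orthonormal basis of R^3 (via QR).
B, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# An orthogonal mapping f(x) = Q0 x, here a rotation about the z-axis.
a = 0.5
Q0 = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0,        0.0,       1.0]])

# Mapping matrix of f with respect to the basis B: column j holds the
# coordinates of f(e_j), i.e. A_f = B^T Q0 B (since B^{-1} = B^T).
A_f = B.T @ Q0 @ B
print(np.allclose(A_f.T @ A_f, np.eye(3)))  # True: A_f is orthogonal
```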
