Hermitian matrix

From Wikipedia, the free encyclopedia

In mathematics, a Hermitian matrix is a complex square matrix that is equal to its own adjoint matrix. The entries above the main diagonal are the complex conjugates of the mirrored entries below the diagonal; the entries on the main diagonal itself are all real. Hermitian matrices are named after the mathematician Charles Hermite.

Hermitian matrices have a number of special properties. The sum of two Hermitian matrices is always Hermitian again. Every complex square matrix can be written uniquely as the sum of a Hermitian and a skew-Hermitian matrix. The product of two Hermitian matrices is again Hermitian, provided that the two matrices commute. A Hermitian matrix is always normal and self-adjoint; it has only real eigenvalues, and it is always unitarily diagonalizable. An important class of Hermitian matrices are the positive definite matrices, in which all eigenvalues are positive. A Hermitian matrix with real entries is symmetric.

In linear algebra, Hermitian matrices are used to describe Hermitian sesquilinear forms. The representation matrix of a self-adjoint mapping with respect to an orthonormal basis is always Hermitian. Systems of linear equations with a Hermitian coefficient matrix can be solved efficiently and in a numerically stable way. Furthermore, Hermitian matrices are used in orthogonal projections and in the polar decomposition of matrices. Hermitian matrices have applications in, among other areas, quantum mechanics.

Definition

A complex square matrix $A \in \mathbb{C}^{n \times n}$ is called Hermitian if for its entries

$a_{ij} = \overline{a_{ji}}$

for $i, j = 1, \ldots, n$ holds. A Hermitian matrix therefore agrees with its adjoint matrix $A^H$, that is,

$A = A^H .$

Equivalently, a matrix $A$ is Hermitian if and only if its transposed matrix $A^T$ is equal to its conjugate matrix $\overline{A}$, i.e.

$A^T = \overline{A}$

holds. A Hermitian matrix is therefore mirror-symmetric with respect to its main diagonal, up to complex conjugation of all entries.
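The defining condition $A = A^H$ is straightforward to check numerically. A minimal sketch with NumPy, using an illustrative matrix (the values are hypothetical, chosen only so that the condition holds):

```python
import numpy as np

# An illustrative 3x3 Hermitian matrix: real diagonal, conjugate-mirrored off-diagonal.
A = np.array([[2, 1 - 2j, 3j],
              [1 + 2j, 0, 4],
              [-3j, 4, -5]])

# A matrix is Hermitian when it equals its conjugate transpose A^H.
is_hermitian = np.allclose(A, A.conj().T)
```

The same check written as `np.allclose(A.T, A.conj())` expresses the equivalent condition $A^T = \overline{A}$.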

Examples

Examples of Hermitian matrices are ($i$ denotes the imaginary unit):

$\begin{pmatrix} 2 \end{pmatrix}, \quad \begin{pmatrix} 3 & i \\ -i & 0 \end{pmatrix}, \quad \begin{pmatrix} 1 & -2i & 3 \\ 2i & 0 & 4+i \\ 3 & 4-i & -5 \end{pmatrix} .$

In general, Hermitian matrices of size $n \times n$ have the structure

$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ \overline{a_{12}} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \overline{a_{1n}} & \overline{a_{2n}} & \cdots & a_{nn} \end{pmatrix}$

with real numbers $a_{11}, a_{22}, \ldots, a_{nn}$ on the main diagonal.

Algebraic properties

Entries

The diagonal entries of a Hermitian matrix are, due to

$a_{ii} = \overline{a_{ii}} ,$

always real. The matrix made up of the real parts of a Hermitian matrix is always symmetric, because

$\operatorname{Re}(a_{ij}) = \operatorname{Re}(\overline{a_{ji}}) = \operatorname{Re}(a_{ji}) ,$

and the matrix of the imaginary parts of a Hermitian matrix is always skew-symmetric, because

$\operatorname{Im}(a_{ij}) = \operatorname{Im}(\overline{a_{ji}}) = -\operatorname{Im}(a_{ji}) .$

Hence, a Hermitian matrix is uniquely characterized by

$n + 2 \cdot \tfrac{n(n-1)}{2} = n^2$

real numbers: the $n$ real diagonal entries and the real and imaginary parts of the entries above the diagonal. In comparison, a general complex $n \times n$ matrix is described by $2n^2$ real numbers, i.e. exactly twice as many.

Sum

The sum $A + B$ of two Hermitian matrices $A, B \in \mathbb{C}^{n \times n}$ is always Hermitian again, because

$(A + B)^H = A^H + B^H = A + B .$

In addition, every complex square matrix $M \in \mathbb{C}^{n \times n}$ can be written uniquely as the sum $M = A + B$ of a Hermitian matrix $A$ and a skew-Hermitian matrix $B$ by choosing

$A = \tfrac{1}{2} \left( M + M^H \right) \quad \text{and} \quad B = \tfrac{1}{2} \left( M - M^H \right) .$
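The decomposition above translates directly into code. A sketch with NumPy, using a randomly generated complex matrix as input:

```python
import numpy as np

rng = np.random.default_rng(0)
# An arbitrary complex square matrix.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Unique decomposition M = H + S with H Hermitian and S skew-Hermitian.
H = (M + M.conj().T) / 2
S = (M - M.conj().T) / 2

hermitian_ok = np.allclose(H, H.conj().T)   # H equals its adjoint
skew_ok = np.allclose(S, -S.conj().T)       # S equals minus its adjoint
sum_ok = np.allclose(H + S, M)              # the parts add back to M
```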

Scalar multiplication

The product $cA$ of a Hermitian matrix $A \in \mathbb{C}^{n \times n}$ with a scalar $c \in \mathbb{C}$ is Hermitian again only if $c$ is real, because then

$(cA)^H = \bar{c} A^H = c A$

holds. If $c$ is purely imaginary, then the product $cA$ is skew-Hermitian. The Hermitian matrices therefore do not form a subspace of the $\mathbb{C}$-vector space of complex square matrices, but only a subspace of the $\mathbb{R}$-vector space of complex square matrices. This subspace has dimension $n^2$; the standard matrices $E_{jj}$ for $j = 1, \ldots, n$ together with $E_{jk} + E_{kj}$ and $i (E_{jk} - E_{kj})$ for $1 \le j < k \le n$ form a basis of it. Within the space of Hermitian matrices, the real symmetric matrices in turn form a subspace.

Product

The product of two Hermitian matrices is in general not Hermitian again. The product $AB$ of Hermitian matrices $A$ and $B$ is Hermitian if and only if $A$ and $B$ commute, i.e. if $AB = BA$ holds, because then

$(AB)^H = B^H A^H = BA = AB .$

In particular, for a Hermitian matrix $A$, all of its powers $A^k$ with $k \in \mathbb{N}$ and therefore also its matrix exponential $e^A$ are Hermitian again. For any complex matrix $M \in \mathbb{C}^{m \times n}$, both the matrix $M^H M$ and the matrix $M M^H$ are always Hermitian.
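The last statement, that $M^H M$ and $M M^H$ are Hermitian for any complex $M$, can be verified numerically. A sketch with a random complex matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
# An arbitrary complex square matrix M.
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Both Gram-type products are Hermitian: (M^H M)^H = M^H M and (M M^H)^H = M M^H.
G1 = M.conj().T @ M
G2 = M @ M.conj().T

gram_hermitian = np.allclose(G1, G1.conj().T) and np.allclose(G2, G2.conj().T)
```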

Normality

A Hermitian matrix $A \in \mathbb{C}^{n \times n}$ is always normal, because

$A^H A = A A = A A^H .$

Every Hermitian matrix thus commutes with its adjoint. However, there are also normal matrices that are not Hermitian, for example skew-Hermitian matrices.

Congruence

Any complex matrix $B \in \mathbb{C}^{n \times n}$ that is congruent to a Hermitian matrix $A \in \mathbb{C}^{n \times n}$ is also Hermitian, because

$B^H = \left( S^H A S \right)^H = S^H A^H S = S^H A S = B ,$

where $S \in \mathbb{C}^{n \times n}$ is the associated regular transformation matrix. However, matrices that are merely similar to a Hermitian matrix need not be Hermitian themselves.

Inverse

If a Hermitian matrix $A \in \mathbb{C}^{n \times n}$ is invertible, then its inverse is also Hermitian again, because

$\left( A^{-1} \right)^H = \left( A^H \right)^{-1} = A^{-1} .$

For a regular Hermitian matrix $A$, all powers $A^k$ with $k \in \mathbb{Z}$ are therefore Hermitian again.

Spectral properties

Self-adjointness

A Hermitian matrix $A \in \mathbb{C}^{n \times n}$ is always self-adjoint, because with the complex standard scalar product $\langle x, y \rangle = x^H y$ it holds that

$\langle A x, y \rangle = (A x)^H y = x^H A^H y = x^H A y = \langle x, A y \rangle$

for all vectors $x, y \in \mathbb{C}^n$. The converse is also true: every complex self-adjoint matrix is Hermitian.

Eigenvalues

The eigenvalues of a Hermitian matrix $A \in \mathbb{C}^{n \times n}$, i.e. the solutions $\lambda$ of the eigenvalue equation $A x = \lambda x$, are always real. If namely $\lambda \in \mathbb{C}$ is an eigenvalue of $A$ with associated eigenvector $x \neq 0$, then the self-adjointness of $A$ yields

$\bar{\lambda} \langle x, x \rangle = \langle \lambda x, x \rangle = \langle A x, x \rangle = \langle x, A x \rangle = \langle x, \lambda x \rangle = \lambda \langle x, x \rangle .$

Since $\langle x, x \rangle \neq 0$ for $x \neq 0$, it must hold that $\bar{\lambda} = \lambda$, and the eigenvalue $\lambda$ must therefore be real.
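NumPy's specialized Hermitian eigensolver reflects this fact: it returns a real array of eigenvalues. A small sketch with an illustrative $2 \times 2$ Hermitian matrix whose eigenvalues can be checked by hand (trace $5$, determinant $4$, so $\lambda = 1$ and $\lambda = 4$):

```python
import numpy as np

A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])

# eigvalsh is specialised for Hermitian matrices and returns real eigenvalues
# in ascending order.
eigenvalues = np.linalg.eigvalsh(A)
all_real = np.isrealobj(eigenvalues)
```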

Multiplicities

For every Hermitian matrix, the algebraic and geometric multiplicities of all eigenvalues coincide. If namely $\lambda \in \mathbb{R}$ is an eigenvalue of $A \in \mathbb{C}^{n \times n}$ with geometric multiplicity $k$, then there is an orthonormal basis $\{ x_1, \ldots, x_k \}$ of the eigenspace of $\lambda$, which can be extended by $\{ x_{k+1}, \ldots, x_n \}$ to an orthonormal basis $\{ x_1, \ldots, x_n \}$ of the total space $\mathbb{C}^n$. With the unitary basis transformation matrix $S = ( x_1 \mid \cdots \mid x_n )$, the transformed matrix

$B = S^H A S = \begin{pmatrix} \lambda I_k & 0 \\ 0 & X \end{pmatrix}$

is obtained as a block diagonal matrix with the blocks $\lambda I_k$ and $X \in \mathbb{C}^{(n-k) \times (n-k)}$. For the entries $b_{ij}$ of $B$ with $i \le k$ or $j \le k$, namely, with the self-adjointness of $A$ and the orthonormality of the basis vectors,

$b_{ij} = \langle x_i, A x_j \rangle = \langle A x_i, x_j \rangle = \lambda \langle x_i, x_j \rangle = \lambda \delta_{ij} ,$

where $\delta_{ij}$ represents the Kronecker delta. Since, by assumption, there are no eigenvectors of $A$ for the eigenvalue $\lambda$ outside the chosen eigenspace, $\lambda$ cannot be an eigenvalue of $X$. According to the determinant formula for block matrices, the matrix $B$ therefore has the eigenvalue $\lambda$ with algebraic multiplicity exactly $k$, and, due to the similarity of the two matrices, so does $A$.

Diagonalisability

Since the algebraic and geometric multiplicities of all eigenvalues coincide for a Hermitian matrix $A \in \mathbb{C}^{n \times n}$, and since eigenvectors to different eigenvalues are always linearly independent, a basis of $\mathbb{C}^n$ can be formed from eigenvectors of $A$. Hence a Hermitian matrix is always diagonalizable, that is, there is a regular matrix $S \in \mathbb{C}^{n \times n}$ and a diagonal matrix $D$ (even $D \in \mathbb{R}^{n \times n}$) such that

$S^{-1} A S = D$

holds. The matrix $S$ has the eigenvectors as columns, and the matrix $D$ has the eigenvalues associated with these eigenvectors on its diagonal. By permuting the eigenvectors, the order of the diagonal entries can be chosen arbitrarily. Therefore, two Hermitian matrices are similar to each other if and only if they have the same eigenvalues. Furthermore, two Hermitian matrices can be diagonalized simultaneously if and only if they commute.

Unitary diagonalisability

The eigenvectors for two different eigenvalues $\lambda \neq \mu$ of a Hermitian matrix $A \in \mathbb{C}^{n \times n}$ are always orthogonal. If $x$ and $y$ are eigenvectors for $\lambda$ and $\mu$, then, again with the self-adjointness of $A$ and the realness of the eigenvalues,

$\lambda \langle x, y \rangle = \langle \lambda x, y \rangle = \langle A x, y \rangle = \langle x, A y \rangle = \langle x, \mu y \rangle = \mu \langle x, y \rangle .$

Since $\lambda$ and $\mu$ were assumed to be different, it then follows that $\langle x, y \rangle = 0$. Therefore, an orthonormal basis of $\mathbb{C}^n$ can be formed from eigenvectors of $A$. This means that a Hermitian matrix can even be unitarily diagonalized, that is, there is a unitary matrix $U \in \mathbb{C}^{n \times n}$ such that

$U^H A U = D$

holds. This representation forms the basis for the principal axis transformation and is the simplest version of the spectral theorem.
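Unitary diagonalization is exactly what NumPy's `eigh` computes for a Hermitian input: real eigenvalues together with a unitary matrix of orthonormal eigenvectors. A sketch reusing the illustrative matrix from above:

```python
import numpy as np

A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])

# eigh returns real eigenvalues w (ascending) and a unitary matrix U
# whose columns are orthonormal eigenvectors.
w, U = np.linalg.eigh(A)

unitary_ok = np.allclose(U.conj().T @ U, np.eye(2))
# U^H A U is the diagonal matrix of eigenvalues.
diagonalized_ok = np.allclose(U.conj().T @ A @ U, np.diag(w))
```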

Parameters

Due to the diagonalisability of a Hermitian matrix $A \in \mathbb{C}^{n \times n}$ with eigenvalues $\lambda_1, \ldots, \lambda_n$, the following applies to its trace:

$\operatorname{tr}(A) = \lambda_1 + \cdots + \lambda_n ,$

and for its determinant accordingly:

$\det(A) = \lambda_1 \cdots \lambda_n .$

The trace and determinant of a Hermitian matrix are therefore always real. The rank of a Hermitian matrix is equal to the number of eigenvalues not equal to zero, i.e. with the Kronecker delta

$\operatorname{rank}(A) = n - \sum_{i=1}^{n} \delta_{0, \lambda_i} .$

A Hermitian matrix is invertible if and only if none of its eigenvalues is zero. The spectral norm of a Hermitian matrix is

$\| A \|_2 = \max \{ | \lambda_1 |, \ldots, | \lambda_n | \}$

and thus equal to the spectral radius of the matrix. The Frobenius norm results accordingly from the normality as

$\| A \|_F = \sqrt{ \lambda_1^2 + \cdots + \lambda_n^2 } .$
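These identities between matrix quantities and eigenvalues can all be checked numerically. A sketch using an illustrative Hermitian matrix (the values are arbitrary):

```python
import numpy as np

A = np.array([[1, 2j, 0],
              [-2j, 5, 1],
              [0, 1, -2]])

w = np.linalg.eigvalsh(A)   # real eigenvalues of the Hermitian matrix

trace_ok = np.isclose(np.trace(A).real, w.sum())          # trace = sum of eigenvalues
det_ok = np.isclose(np.linalg.det(A).real, w.prod())      # det = product of eigenvalues
# Spectral norm = largest absolute eigenvalue = spectral radius.
spectral_ok = np.isclose(np.linalg.norm(A, 2), np.abs(w).max())
# Frobenius norm = square root of the sum of squared eigenvalues.
frobenius_ok = np.isclose(np.linalg.norm(A, 'fro'), np.sqrt((w**2).sum()))
```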

Estimates

According to the Courant–Fischer theorem, the Rayleigh quotient provides estimates for the smallest and the largest eigenvalue of a Hermitian matrix $A \in \mathbb{C}^{n \times n}$ of the form

$\lambda_{\min} \le \frac{x^H A x}{x^H x} \le \lambda_{\max}$

for all $x \in \mathbb{C}^n$ with $x \neq 0$. Equality holds precisely when $x$ is an eigenvector for the respective eigenvalue. The smallest and the largest eigenvalue of a Hermitian matrix can accordingly be determined by minimizing or maximizing the Rayleigh quotient. The Gershgorin circles, which for Hermitian matrices have the form of intervals, offer another possibility for estimating eigenvalues.
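The Rayleigh quotient bound is easy to observe empirically. A sketch with a randomly generated Hermitian matrix and a random test vector:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2            # a random Hermitian matrix

w = np.linalg.eigvalsh(A)           # real eigenvalues, ascending
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# The Rayleigh quotient of any nonzero vector lies between the
# smallest and the largest eigenvalue.
rq = (x.conj() @ A @ x).real / (x.conj() @ x).real
bounded = w[0] - 1e-10 <= rq <= w[-1] + 1e-10
```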

Definiteness

For a Hermitian matrix $A \in \mathbb{C}^{n \times n}$, the expression

$Q_A(x) = x^H A x$

is called the quadratic form of $A$. Depending on whether $Q_A(x)$ is greater than, greater than or equal to, less than, or less than or equal to zero for all $x \neq 0$, the matrix $A$ is called positive definite, positive semidefinite, negative definite or negative semidefinite. If $Q_A(x)$ can take both positive and negative signs, $A$ is called indefinite. The definiteness of a Hermitian matrix can be determined from the signs of its eigenvalues: if all eigenvalues are positive, the matrix is positive definite; if they are all negative, the matrix is negative definite; and so on. The triple consisting of the numbers of positive, negative and zero eigenvalues of a Hermitian matrix is called the signature of the matrix. According to Sylvester's law of inertia, the signature of a Hermitian matrix is preserved under congruence transformations.
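The eigenvalue-sign criterion suggests a simple classifier. A sketch, where the helper `definiteness` and its tolerance parameter are our own illustrative choices:

```python
import numpy as np

def definiteness(A, tol=1e-12):
    """Classify a Hermitian matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(A)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

d1 = definiteness(np.array([[2, 0], [0, 3]]))   # eigenvalues 2, 3
d2 = definiteness(np.array([[1, 0], [0, -1]]))  # eigenvalues -1, 1
```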

Applications

Hermitian sesquilinear forms

If $V$ is an $n$-dimensional complex vector space, then every sesquilinear form $s \colon V \times V \to \mathbb{C}$ can, after choosing a basis $\{ b_1, \ldots, b_n \}$ for $V$, be described by its representation matrix

$A_s = \left( s(b_i, b_j) \right) \in \mathbb{C}^{n \times n} .$

If the sesquilinear form is Hermitian, i.e. if $s(v, w) = \overline{s(w, v)}$ for all $v, w \in V$, then the representation matrix $A_s$ is also Hermitian. Conversely, every Hermitian matrix $A$ defines by means of

$s(x, y) = x^H A y$

a Hermitian sesquilinear form $s \colon \mathbb{C}^n \times \mathbb{C}^n \to \mathbb{C}$. If a Hermitian matrix is moreover positive definite, then $s$ is a scalar product on the unitary space $\mathbb{C}^n$.

Self-adjoint mappings

If $V$ is an $n$-dimensional complex scalar product space, then every linear mapping $f \colon V \to V$ can, after choosing an orthonormal basis $\{ e_1, \ldots, e_n \}$ for $V$, be represented by its mapping matrix

$A_f = ( a_{ij} ) \in \mathbb{C}^{n \times n} ,$

where $f(e_j) = a_{1j} e_1 + \cdots + a_{nj} e_n$ for $j = 1, \ldots, n$. The mapping matrix $A_f$ is Hermitian if and only if the mapping $f$ is self-adjoint. This follows from

$\langle f(v), w \rangle = \langle A_f x, y \rangle = \langle x, A_f y \rangle = \langle v, f(w) \rangle ,$

where $x$ and $y$ are the coordinate vectors of $v$ and $w$.

Projections and reflections

If again $V$ is an $n$-dimensional complex scalar product space and $U$ is a $k$-dimensional subspace of $V$ whose orthonormal basis has the coordinate vectors $x_1, \ldots, x_k$, then the orthogonal projection matrix onto this subspace,

$P = x_1 x_1^H + \cdots + x_k x_k^H ,$

is, as a sum of Hermitian rank-one matrices, also Hermitian. The orthogonal projection matrix onto the complementary space $U^\perp$ is likewise always Hermitian, due to the representation $I - P$. Using the projections $P$ and $I - P$, each vector $v \in V$ can be decomposed into mutually orthogonal vectors $u \in U$ and $u^\perp \in U^\perp$. The reflection matrix about a subspace is also always Hermitian.
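The rank-one sum formula for the projection matrix can be sketched directly. The two basis vectors below are a hypothetical orthonormal basis of a 2-dimensional subspace of $\mathbb{C}^3$:

```python
import numpy as np

# Orthonormal basis vectors of an illustrative 2-dimensional subspace of C^3.
x1 = np.array([1, 0, 0], dtype=complex)
x2 = np.array([0, 1, 1], dtype=complex) / np.sqrt(2)

# Orthogonal projection as a sum of Hermitian rank-one matrices x x^H.
P = np.outer(x1, x1.conj()) + np.outer(x2, x2.conj())

projector_hermitian = np.allclose(P, P.conj().T)
projector_idempotent = np.allclose(P @ P, P)        # P is a projection: P^2 = P
# The complementary projection I - P is Hermitian as well.
Q = np.eye(3) - P
complement_hermitian = np.allclose(Q, Q.conj().T)
```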

Systems of linear equations

Finding the solution of a linear system of equations $A x = b$ with a Hermitian coefficient matrix $A$ is simplified if the hermiticity of the coefficient matrix is exploited. Due to the hermiticity, the coefficient matrix can be written as a product

$A = L D L^H$

with a lower triangular matrix $L$ with all ones on the diagonal and a diagonal matrix $D$. This decomposition is used, for example, in the Cholesky decomposition of positive definite Hermitian matrices in order to compute the solution of the system of equations. Examples of modern methods for the numerical solution of large linear systems with sparse Hermitian coefficient matrices are the CG method and the MINRES method.
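For a positive definite Hermitian matrix, the Cholesky factorization $A = L L^H$ reduces the solve to two triangular systems. A sketch with an illustrative $2 \times 2$ system (for brevity, the triangular systems are solved here with the general `np.linalg.solve`; a production code would use a dedicated triangular solver):

```python
import numpy as np

# An illustrative positive definite Hermitian coefficient matrix and right-hand side.
A = np.array([[4, 1 - 1j],
              [1 + 1j, 3]])
b = np.array([1, 2], dtype=complex)

# Cholesky factorisation A = L L^H exists since A is Hermitian positive definite.
L = np.linalg.cholesky(A)

# Forward substitution L y = b, then back substitution L^H x = y.
y = np.linalg.solve(L, b)
x = np.linalg.solve(L.conj().T, y)

residual_ok = np.allclose(A @ x, b)
```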

Polar decomposition

By means of the polar decomposition, every square matrix $A \in \mathbb{C}^{n \times n}$ can also be factored as a product

$A = U P$

of a unitary matrix $U$ and a positive semidefinite Hermitian matrix $P$. The matrix $P$ is the square root of $A^H A$. If $A$ is regular, then $P$ is positive definite and the polar decomposition is unique, with $U = A P^{-1}$.
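One standard way to compute the polar decomposition is via the singular value decomposition $A = W \Sigma V^H$, which gives $U = W V^H$ and $P = V \Sigma V^H$. A sketch under this construction:

```python
import numpy as np

rng = np.random.default_rng(3)
# An arbitrary complex square matrix.
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Polar decomposition A = U P from the SVD A = W diag(s) V^H:
# U = W V^H is unitary, P = V diag(s) V^H is positive semidefinite Hermitian.
W, s, Vh = np.linalg.svd(A)
U = W @ Vh
P = Vh.conj().T @ np.diag(s) @ Vh

unitary_ok = np.allclose(U.conj().T @ U, np.eye(3))
hermitian_ok = np.allclose(P, P.conj().T)
product_ok = np.allclose(U @ P, A)
```

SciPy offers this directly as `scipy.linalg.polar`.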

Quantum mechanics

The Pauli matrices used in quantum mechanics,

$\sigma_1 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad \sigma_2 = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \quad \sigma_3 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} ,$

are Hermitian and traceless. The Pauli matrices are used, among other things, to describe isospin symmetry. The Gell-Mann matrices are Hermitian $3 \times 3$ matrices that are used in quantum chromodynamics.
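Both stated properties of the Pauli matrices, hermiticity and vanishing trace, can be checked in a few lines:

```python
import numpy as np

# The three Pauli matrices.
sigma_1 = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_2 = np.array([[0, -1j], [1j, 0]])
sigma_3 = np.array([[1, 0], [0, -1]], dtype=complex)

paulis = [sigma_1, sigma_2, sigma_3]
all_hermitian = all(np.allclose(s, s.conj().T) for s in paulis)
all_traceless = all(np.isclose(np.trace(s), 0) for s in paulis)
```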

Literature

  • Gerd Fischer : Linear Algebra. (An introduction for first-year students). 13th revised edition. Vieweg, Braunschweig et al. 2002, ISBN 3-528-97217-3 .
  • Roger A. Horn, Charles R. Johnson: Matrix Analysis . Cambridge University Press, 2012, ISBN 0-521-46713-6 .
  • Hans-Rudolf Schwarz, Norbert Köckler: Numerical Mathematics. 5th revised edition. Teubner, Stuttgart et al. 2004, ISBN 3-519-42960-8 .

References

  1. Howard Anton, Chris Rorres: Elementary Linear Algebra: Applications Version . John Wiley & Sons, 2010, pp. 404-405 .
