In the mathematical field of linear algebra, a diagonalizable matrix is a square matrix that is similar to a diagonal matrix. It can be transformed into a diagonal matrix by a change of basis (i.e. by conjugation with an invertible matrix). The concept carries over to endomorphisms.
Definition
Diagonal matrix
A square matrix $D$ over a field $K$ whose entries outside the main diagonal are all zero is called a diagonal matrix. One often writes

$D = \operatorname{diag}(d_1, d_2, \dots, d_n)$.
Diagonalizable matrix
A square $n \times n$ matrix $A$ is called diagonalizable if it is similar to a diagonal matrix $D_A$, that is, if there is an invertible (regular) matrix $S$ such that $D_A = S^{-1} A S$, or equivalently $A = S D_A S^{-1}$.
An endomorphism $f$ of a finite-dimensional vector space $V$ is called diagonalizable if there is a basis of $V$ with respect to which the transformation matrix of $f$ is a diagonal matrix.
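As an illustration, the defining relation can be checked numerically. The following sketch assumes NumPy and an arbitrarily chosen example matrix (not one from this article); it computes an eigendecomposition and verifies that conjugating $A$ with the eigenvector matrix $S$ yields a diagonal matrix.

```python
import numpy as np

# Hypothetical example matrix (not from the article), chosen to be diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix S whose columns are eigenvectors.
eigenvalues, S = np.linalg.eig(A)

# D = S^{-1} A S is (numerically) diagonal, with the eigenvalues on the diagonal.
D = np.linalg.inv(S) @ A @ S
print(np.round(D, 10))

# Conversely, A = S D S^{-1} reconstructs the original matrix.
print(np.allclose(A, S @ np.diag(eigenvalues) @ np.linalg.inv(S)))  # True
```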


Unitarily diagonalizable matrix
A matrix $A \in \mathbb{C}^{n \times n}$ is unitarily diagonalizable if and only if there exists a unitary transformation matrix $U$ such that $U^{\ast} A U$ is a diagonal matrix, where $U^{\ast}$ denotes the adjoint (conjugate transpose) of $U$.




For a real-valued matrix $A$ this reduces to orthogonal diagonalizability: $A$ is unitarily diagonalizable if there exists an orthogonal transformation matrix $Q$ such that $Q^{\mathsf T} A Q$ is a diagonal matrix, where $Q^{\mathsf T}$ denotes the transpose of $Q$.




In a finite-dimensional inner product space (pre-Hilbert space) $V$, an endomorphism $f$ can be unitarily diagonalized if and only if there is an orthonormal basis of $V$ with respect to which the transformation matrix of $f$ is a diagonal matrix.
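For real symmetric matrices this can be illustrated with NumPy's `np.linalg.eigh`; the matrix below is a hypothetical example, not one defined in this article.

```python
import numpy as np

# Hypothetical real symmetric example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For symmetric (Hermitian) matrices, eigh returns real eigenvalues and an
# orthogonal (unitary) matrix Q whose columns are orthonormal eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q is orthogonal
print(np.round(Q.T @ A @ Q, 10))        # diag(1, 3): Q^T A Q is diagonal
```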




Further characterizations of diagonalizability
Let $A$ be an $n \times n$ matrix with entries from a field $K$. Each of the following six conditions holds if and only if $A$ is diagonalizable.




- The minimal polynomial $\mu_A$ splits completely into pairwise distinct linear factors: $\mu_A(\lambda) = (\lambda - \lambda_1) \cdots (\lambda - \lambda_k)$ with $\lambda_i \neq \lambda_j$ for $i \neq j$.


- The characteristic polynomial $\chi_A$ splits completely into linear factors, and for each eigenvalue the geometric multiplicity equals the algebraic multiplicity.


- There is a basis of $K^n$ that consists of eigenvectors of $A$.


- The sum of the dimensions of the eigenspaces equals $n$: $\sum_{\lambda \in \sigma(A)} \dim \operatorname{Eig}(A, \lambda) = n$, where $\sigma(A)$ denotes the spectrum, i.e. the set of eigenvalues of $A$.



- $K^n$ is the direct sum of the eigenspaces: $K^n = \bigoplus_{\lambda \in \sigma(A)} \operatorname{Eig}(A, \lambda)$.
- All Jordan blocks of the Jordan normal form of $A$ have dimension 1.
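As a sketch (not part of the original list), the criterion that geometric and algebraic multiplicities coincide can be tested numerically over $\mathbb{C}$, where the characteristic polynomial always splits; the matrices used below are hypothetical examples.

```python
import numpy as np

def is_diagonalizable(A, tol=1e-9):
    """Numerically compare geometric and algebraic multiplicities over the complex numbers."""
    n = A.shape[0]
    eigenvalues = np.linalg.eigvals(A)
    for lam in np.unique(np.round(eigenvalues, 8)):
        # Algebraic multiplicity: number of (numerically) coinciding eigenvalues.
        algebraic = int(np.sum(np.isclose(eigenvalues, lam, atol=1e-8)))
        # Geometric multiplicity: dim ker(A - lam*E) = n - rank(A - lam*E).
        geometric = n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)
        if geometric != algebraic:
            return False
    return True

print(is_diagonalizable(np.array([[2.0, 0.0], [0.0, 3.0]])))  # True
print(is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]])))  # False: a 2x2 Jordan block
```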

If $D$ and $S$ with the desired properties have been found, then the diagonal entries of $D$, namely $\lambda_1, \dots, \lambda_n$, are eigenvalues of $D$ with respect to the standard unit vectors $e_i$, i.e. $D e_i = \lambda_i e_i$. From $A = S D S^{-1}$ it follows that $A S e_i = S D e_i = \lambda_i S e_i$; the columns $S e_i$ of $S$ are therefore eigenvectors of $A$, each to the eigenvalue $\lambda_i$.










Since $S$ is required to be invertible, these columns are moreover linearly independent.


In summary, this yields the necessary condition that an $n$-dimensional diagonalizable matrix must have $n$ linearly independent eigenvectors; the space on which it operates has a basis consisting of eigenvectors of the matrix. This condition is also sufficient, because from $n$ linearly independent eigenvectors $v_i$ of $A$ with associated eigenvalues $\lambda_i$, a suitable $S$ (with the $v_i$ as columns) and $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ can be constructed quite directly.






The problem is thus reduced to finding $n$ linearly independent eigenvectors of $A$.


A necessary but not sufficient condition for diagonalizability is that the characteristic polynomial splits completely into linear factors. A sufficient but not necessary condition is that it splits completely into pairwise distinct linear factors (see the illustrative examples below).
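For illustration, two standard $2 \times 2$ examples (chosen here, not quoted from the text above): the shear matrix $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ has characteristic polynomial $(\lambda - 1)^2$, which splits completely, yet the matrix is not diagonalizable because the eigenspace for $\lambda = 1$ is only one-dimensional. Conversely, the identity matrix $\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ is diagonal, hence diagonalizable, even though its characteristic polynomial $(\lambda - 1)^2$ does not split into pairwise distinct linear factors.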






Properties of a diagonalizable matrix
If a matrix is diagonalizable, the geometric multiplicity of each of its eigenvalues equals the corresponding algebraic multiplicity. This means that the dimension of each eigenspace equals the algebraic multiplicity of the corresponding eigenvalue in the characteristic polynomial of the matrix.
The $k$-th power of a diagonalizable matrix can be calculated via

$A^k = \left( S D_A S^{-1} \right)^k = S D_A^{\,k} S^{-1}$.

The power of a diagonal matrix is obtained by raising each diagonal entry to that power.
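A minimal NumPy sketch of this computation, with a hypothetical matrix and exponent:

```python
import numpy as np

# Hypothetical diagonalizable example matrix and exponent.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
k = 5

eigenvalues, S = np.linalg.eig(A)

# A^k = S D^k S^{-1}; the power of the diagonal matrix is taken entry by entry.
A_power = S @ np.diag(eigenvalues ** k) @ np.linalg.inv(S)

print(np.allclose(A_power, np.linalg.matrix_power(A, k)))  # True
```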
Diagonalization
If a matrix $A$ can be diagonalized, there is a diagonal matrix $D_A$ for which the similarity condition holds: $D_A = S^{-1} A S$.



To diagonalize the matrix, one determines the diagonal matrix $D_A$ and a corresponding basis of eigenvectors. This is done in three steps (a sketch of the procedure follows the list):

- The eigenvalues of the matrix are determined. (Individual eigenvalues can occur more than once.)


- The eigenspaces for all eigenvalues $\lambda$ are calculated, i.e. linear systems of the following form are solved: $(A - \lambda E)\, x = 0$.
- Because the geometric multiplicity equals the algebraic multiplicity for every eigenvalue, the bases of the eigenspaces can be combined into a basis of the whole space.



- Now the diagonal form of the matrix with respect to this basis of eigenvectors is $D_A = S^{-1} A S = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, where the columns of $S$ are the basis eigenvectors.
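The following sketch runs these three steps in SymPy; the example matrix is a hypothetical choice, not one defined in this article.

```python
import sympy as sp

# Hypothetical example matrix over the rationals.
A = sp.Matrix([[4, 1],
               [2, 3]])

# Step 1: eigenvalues with their algebraic multiplicities.
print(A.eigenvals())  # eigenvalues 2 and 5, each with multiplicity 1

# Step 2: eigenspaces, i.e. the solutions of (A - lambda*E) x = 0 for each eigenvalue.
for lam, multiplicity, basis in A.eigenvects():
    print(lam, multiplicity, basis)

# Step 3: assemble S from the eigenvectors; the diagonal form is D = S^{-1} A S.
S, D = A.diagonalize()
print(D)                     # diagonal matrix with the eigenvalues (order may vary)
print(S.inv() * A * S == D)  # True
```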





Simultaneous diagonalization
Occasionally one wants to diagonalize two matrices $A$ and $B$ with the same transformation $S$. If this succeeds, then $S^{-1} A S = D_A$ and $S^{-1} B S = D_B$, and since $D_A$ and $D_B$ are diagonal matrices, they commute: $D_A D_B = D_B D_A$. It follows that

$A B = S D_A D_B S^{-1} = S D_B D_A S^{-1} = B A$.
So the endomorphisms must commute with one another. In fact, the converse also holds: if two diagonalizable endomorphisms commute, they can be diagonalized simultaneously. In quantum mechanics this provides a basis of common eigenstates for two such operators.
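A minimal numerical sketch with hypothetical matrices: if $A$ is symmetric with pairwise distinct eigenvalues and $B$ commutes with $A$, the orthonormal eigenbasis of $A$ also diagonalizes $B$.

```python
import numpy as np

# Hypothetical commuting pair: B is a polynomial in A, hence A @ B == B @ A.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = A @ A + 4.0 * np.eye(2)

print(np.allclose(A @ B, B @ A))  # True: the matrices commute

# A is symmetric with distinct eigenvalues, so its orthonormal eigenbasis is
# essentially unique; the same orthogonal Q therefore diagonalizes both A and B.
_, Q = np.linalg.eigh(A)
print(np.round(Q.T @ A @ Q, 10))  # diagonal
print(np.round(Q.T @ B @ Q, 10))  # also diagonal
```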
Example
Let $A$ be the symmetric matrix to be diagonalized. $A$ is (unitarily) diagonalizable, since it is symmetric, i.e. $A^{\mathsf T} = A$.



The eigenvalues of $A$ can be determined as the zeros of its characteristic polynomial $\chi_A(\lambda) = \det(A - \lambda E)$.



The eigenvalue $2$ has algebraic multiplicity $2$, because it is a double zero of the characteristic polynomial.

To determine the eigenspaces, the eigenvalues are inserted into $(A - \lambda E)\, x = 0$.

We seek all $x$ with $(A - \lambda E)\, x = 0$; for this we interpret the augmented coefficient matrix as a system of linear equations, which has infinitely many solutions.



For the first eigenvalue, Gaussian elimination applied to $(A - \lambda E)\, x = 0$ yields the solution set and thus the eigenspace $\operatorname{Eig}(A, \lambda) = \operatorname{span}(\dots)$, where $\operatorname{span}$ denotes the linear span (linear hull).

For the second eigenvalue we likewise obtain a linear system; from it follow the solution set and thus the eigenspace $\operatorname{Eig}(A, \lambda') = \operatorname{span}(\dots)$.
The eigenvectors are obtained from the bases of the eigenspaces; together they form a basis of the whole space.


If we normalize the eigenvectors, we obtain an orthonormal basis, since $A$ is symmetric and eigenvectors belonging to distinct eigenvalues are orthogonal to each other (and a basis of each eigenspace can itself be chosen orthonormal).



Assembling the normalized eigenvectors as the columns of $S$, it follows that $S^{-1} A S$ is diagonal. The inverse of $S$ is obtained from the properties of orthonormal bases: $S^{-1} = S^{\mathsf T}$.

In this way we obtain $S$ and thus the diagonalization $D = S^{-1} A S = S^{\mathsf T} A S$.
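The workflow of this example can be reproduced numerically. The symmetric matrix below is a hypothetical stand-in with a double eigenvalue $2$; it is not the matrix used above.

```python
import numpy as np

# Hypothetical symmetric stand-in with eigenvalues 4, 2, 2 (the eigenvalue 2 is double).
A = np.array([[3.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 2.0]])

# For symmetric A, eigh returns an orthonormal eigenbasis as the columns of S.
eigenvalues, S = np.linalg.eigh(A)
print(eigenvalues)  # [2. 2. 4.]

# Because S is orthogonal, its inverse is its transpose: S^{-1} = S^T.
D = S.T @ A @ S
print(np.round(D, 10))                     # diag(2, 2, 4)
print(np.allclose(S.T, np.linalg.inv(S)))  # True
```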