Gram determinant

From Wikipedia, the free encyclopedia

In matrix calculus, the determinant, as a measure of the change in volume under the associated linear map, is defined only for square matrices. For non-square matrices there are minors and Gram determinants (after Jørgen Pedersen Gram) that serve a similar purpose.


For all matrices A ∈ ℝ^(m×n) with m ≥ n, one calls det(AᵀA) the Gram determinant of A. The following holds: det(AᵀA) is never negative, and det(AᵀA) = 0 if and only if rank(A) < n, that is, if and only if the columns of A are linearly dependent. By the Cauchy–Binet theorem, the Gram determinant can also be written as the sum of the squares of all maximal (n×n) minors of A.
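The Cauchy–Binet identity above can be checked numerically. The following sketch (the matrix A is an arbitrary illustrative example) computes det(AᵀA) directly and compares it with the sum of the squares of all 2×2 minors:

```python
import numpy as np
from itertools import combinations

# Illustrative example: a 4x2 real matrix A (m = 4 rows, n = 2 columns).
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0],
              [1.0, 1.0]])

# Gram determinant: det(A^T A).
gram_det = np.linalg.det(A.T @ A)

# Cauchy-Binet: det(A^T A) equals the sum of the squares of all
# maximal (here 2x2) minors of A, one for each choice of 2 rows.
m, n = A.shape
minor_sum = sum(np.linalg.det(A[list(rows), :]) ** 2
                for rows in combinations(range(m), n))

print(np.isclose(gram_det, minor_sum))  # True: the two values agree
```

Note that the Gram determinant here is 57, which is manifestly nonnegative, as the identity guarantees: it is a sum of squares.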

Gram matrix

For A ∈ ℝ^(m×n), the entries of the matrix AᵀA are the canonical scalar products of the columns of A: (AᵀA)ᵢⱼ = ⟨aᵢ, aⱼ⟩, where aᵢ denotes the i-th column of A. This suggests the following generalization:

Let β be a bilinear form on an n-dimensional K-vector space V with basis B = (b₁, …, bₙ). Then the matrix

G = (β(bᵢ, bⱼ))ᵢ,ⱼ=1,…,n

is called the Gram matrix belonging to the bilinear form β, or the matrix representing the bilinear form. The bilinear form is completely determined by the entries of its Gram matrix. For K = ℝ, the bilinear form is a scalar product if and only if its Gram matrix is symmetric and positive definite.
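As a small numerical sketch (the bilinear form and basis are illustrative choices, not from the article), one can build the Gram matrix of a bilinear form β(x, y) = xᵀMy on ℝ² with respect to a basis and test the scalar-product criterion:

```python
import numpy as np

# Illustrative bilinear form on R^2: beta(x, y) = x^T M y.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def beta(x, y):
    return x @ M @ y

# An illustrative basis B = (b1, b2) of R^2.
basis = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]

# Gram matrix: G[i, j] = beta(b_i, b_j).
G = np.array([[beta(bi, bj) for bj in basis] for bi in basis])

# beta is a scalar product iff G is symmetric and positive definite
# (equivalently: all eigenvalues of the symmetric G are positive).
symmetric = np.allclose(G, G.T)
pos_def = bool(np.all(np.linalg.eigvalsh(G) > 0))
print(G)
print(symmetric and pos_def)  # True for this choice of M
```

Changing M to a non-symmetric or indefinite matrix makes the test fail, since the Gram matrix inherits these properties from β.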

If ⟨·,·⟩ is a scalar product on V and x₁, …, xₙ is an arbitrary set of vectors in V, then one denotes G(x₁, …, xₙ) = (⟨xᵢ, xⱼ⟩)ᵢ,ⱼ as the Gram matrix of x₁, …, xₙ. An important application in this case is the criterion of linear independence: the vectors are linearly independent if and only if their Gram determinant (the determinant of the Gram matrix) is nonzero. Since the Gram determinant is nonnegative in this case, one can take its square root and define

√(det G(x₁, …, xₙ))

as the n-dimensional volume of the parallelepiped spanned by x₁, …, xₙ.
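This volume formula is easy to verify in a low-dimensional case. The sketch below (with arbitrarily chosen vectors) computes the area of the parallelogram spanned by two vectors in ℝ³ as the square root of their Gram determinant, with the standard scalar product:

```python
import numpy as np

# Illustrative vectors spanning a parallelogram in R^3.
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([1.0, 2.0, 2.0])

# Gram matrix with respect to the standard scalar product:
# G[i, j] = <x_i, x_j>.
X = np.column_stack([x1, x2])
G = X.T @ X

# Linear independence criterion: Gram determinant nonzero.
gram_det = np.linalg.det(G)

# 2-dimensional volume (area) of the spanned parallelogram.
volume = np.sqrt(gram_det)
print(volume)  # equals |x1 x x2| = sqrt(8) for these vectors
```

For two vectors in ℝ³ the result can be cross-checked against the norm of the cross product, which gives the same area, sqrt(8) here.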