A skew-symmetric matrix (also antisymmetric matrix) is a square matrix that is equal to the negative of its transpose. Over a field of characteristic not equal to two, the skew-symmetric matrices are exactly the alternating matrices and are therefore often identified with them. Skew-symmetric matrices are used in linear algebra to characterize antisymmetric bilinear forms.
Definition
A matrix $A$ is called skew-symmetric if

$$A^T = -A$$

holds. In other words: the matrix $A$ is skew-symmetric if its entries satisfy

$$a_{ij} = -a_{ji} \quad \text{for all } i, j.$$
Example
The matrix

$$A = \begin{pmatrix} 0 & 1 & 4 \\ -1 & 0 & -2 \\ -4 & 2 & 0 \end{pmatrix}$$

is skew-symmetric, since

$$A^T = \begin{pmatrix} 0 & -1 & -4 \\ 1 & 0 & 2 \\ 4 & -2 & 0 \end{pmatrix} = -A.$$
Properties
Real skew-symmetric matrices
If $A \in \mathbb{R}^{n \times n}$ is skew-symmetric with real entries, then all diagonal entries are necessarily equal to 0. Furthermore, every eigenvalue of $A$ is purely imaginary or equal to 0.
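These two properties can be checked numerically; the following is a minimal sketch using NumPy, with an arbitrary example matrix chosen for illustration (it is not taken from the text):

```python
import numpy as np

# An arbitrary real skew-symmetric matrix: A^T = -A
A = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  4.0],
              [ 1.0, -4.0,  0.0]])

print(np.allclose(A.T, -A))      # True: skew-symmetry
print(np.diag(A))                # the diagonal entries are all zero
print(np.linalg.eigvals(A))      # eigenvalues have (numerically) zero real part
```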

Field of characteristic not equal to 2
Skew-symmetric matrices over a field of characteristic different from 2 have the following properties:

- The entries on the main diagonal are zero.
- The determinant of a skew-symmetric matrix of odd dimension $n$ is zero, since
  $$\det(A) = \det(A^T) = \det(-A) = (-1)^n \det(A) = -\det(A)$$
  and therefore $\det(A) = 0$.
- In general, this does not hold for matrices of even dimension, as the counterexample
  $$\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$$
  with determinant 1 shows. The matrix is obviously skew-symmetric, and in this case the determinant can in general be computed as the square of the Pfaffian (see the sketch after this list).

- Over a field of characteristic not equal to two, the skew-symmetric matrices are precisely the alternating matrices. Over a field of characteristic two, however, there are skew-symmetric matrices that are not alternating.
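As a numerical illustration of the determinant statements (a sketch, not part of the original text; the 4×4 Pfaffian is written out explicitly as $\operatorname{pf}(A) = a_{12}a_{34} - a_{13}a_{24} + a_{14}a_{23}$):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_skew(n):
    """Return a random real skew-symmetric n x n matrix."""
    M = rng.standard_normal((n, n))
    return M - M.T

# Odd dimension: the determinant vanishes
A3 = random_skew(3)
print(np.linalg.det(A3))          # ~ 0 up to rounding error

# Even dimension: the determinant is the square of the Pfaffian
A4 = random_skew(4)
pf = (A4[0, 1] * A4[2, 3]
      - A4[0, 2] * A4[1, 3]
      + A4[0, 3] * A4[1, 2])
print(np.linalg.det(A4), pf**2)   # the two values agree
```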
Vector space
The skew-symmetric $n \times n$ matrices form a vector space of dimension $\tfrac{n(n-1)}{2}$. If the underlying field is $\mathbb{R}$, this vector space is denoted $\mathfrak{so}(n)$. The name comes from the fact that this vector space is the Lie algebra of the Lie group $\mathrm{SO}(n)$ (the special orthogonal group).


With respect to the Frobenius scalar product
$$\langle A, B \rangle = \operatorname{tr}(A^T B),$$
the orthogonal projection from the space of all matrices onto the space of skew-symmetric matrices is precisely
$$A \mapsto \tfrac{1}{2}\left(A - A^T\right).$$
The orthogonal complement is the symmetric matrix $\tfrac{1}{2}\left(A + A^T\right)$.

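A small numerical sketch of this decomposition (assuming the Frobenius inner product above; the example matrix is random and purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

skew = 0.5 * (A - A.T)   # projection onto the skew-symmetric matrices
sym  = 0.5 * (A + A.T)   # orthogonal complement: the symmetric part

print(np.allclose(skew + sym, A))   # the two parts recombine to A
print(np.trace(skew.T @ sym))       # Frobenius inner product of the parts: ~ 0
```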
Bilinear forms
The bilinear form associated with a skew-symmetric matrix $A$,
$$\langle x, y \rangle_A = x^T A y,$$
is antisymmetric, that is,
$$\langle x, y \rangle_A = -\langle y, x \rangle_A$$
for all $x, y$. If the main diagonal entries of a skew-symmetric matrix are all zero (that is, if the matrix is alternating), then the associated bilinear form is alternating, that is,
$$\langle x, x \rangle_A = 0$$
for all $x$. Conversely, in a finite-dimensional vector space the representation matrix of an antisymmetric or alternating bilinear form is always skew-symmetric with respect to any basis, that is,
$$a_{ij} = \langle e_i, e_j \rangle = -\langle e_j, e_i \rangle = -a_{ji},$$
where in the alternating case the main diagonal entries $a_{ii} = \langle e_i, e_i \rangle$ are all zero.
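A brief numerical check of both properties over the reals (a sketch with a random matrix; not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
A = M - M.T                       # skew-symmetric (and alternating over the reals)

x, y = rng.standard_normal(3), rng.standard_normal(3)
print(x @ A @ y, -(y @ A @ x))    # antisymmetric: the two values agree
print(x @ A @ x)                  # alternating: ~ 0
```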

Exponential mapping
The mapping
$$\exp \colon \mathfrak{so}(n) \to \mathrm{SO}(n), \quad A \mapsto e^A = \sum_{k=0}^{\infty} \frac{A^k}{k!},$$
defined by the matrix exponential is surjective and describes precisely the exponential map of the Lie group at the identity matrix (see also special orthogonal group).
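The following sketch illustrates that the exponential of a skew-symmetric matrix is orthogonal with determinant 1; it assumes SciPy's `expm` for the matrix exponential and uses a random element of $\mathfrak{so}(3)$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3))
A = M - M.T                               # an element of so(3)

Q = expm(A)                               # matrix exponential
print(np.allclose(Q.T @ Q, np.eye(3)))    # Q is orthogonal
print(np.linalg.det(Q))                   # determinant 1, so Q lies in SO(3)
```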

Cross product
For the special case $n = 3$, skew-symmetric matrices can be used to express the cross product as a matrix multiplication. The cross product of two vectors $a = (a_1, a_2, a_3)^T$ and $b$ can be expressed as the multiplication of the skew-symmetric cross-product matrix

$$[a]_\times = \begin{pmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{pmatrix}$$

with the vector $b$:

$$a \times b = [a]_\times \cdot b.$$

In this way, a formula containing a cross product can be differentiated with respect to the vector:

$$\frac{\partial}{\partial b}(a \times b) = [a]_\times.$$
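A minimal sketch of this construction (the helper name `cross_matrix` is my own label, not from the text), confirming $a \times b = [a]_\times b$ against NumPy's built-in cross product:

```python
import numpy as np

def cross_matrix(a):
    """Skew-symmetric matrix [a]_x with cross_matrix(a) @ b == np.cross(a, b)."""
    return np.array([[ 0.0,  -a[2],  a[1]],
                     [ a[2],  0.0,  -a[0]],
                     [-a[1],  a[0],  0.0]])

a = np.array([1.0, 2.0, 3.0])
b = np.array([-4.0, 0.5, 2.0])
print(cross_matrix(a) @ b)
print(np.cross(a, b))             # same result
```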
Using Rodrigues' formula, the exponential of the matrix $[a]_\times$ can be represented as follows:

$$\begin{aligned} \exp(t [a]_\times)\, v &= \frac{\langle a, v \rangle}{\|a\|^2} a + \left(v - \frac{\langle a, v \rangle}{\|a\|^2} a\right) \cos(\|a\|\, t) + \left(\frac{1}{\|a\|}\, a \times v\right) \sin(\|a\|\, t) \\ &= v_a + v_0 \cdot \cos(\|a\|\, t) + v_1 \cdot \sin(\|a\|\, t). \end{aligned}$$

Here

- $v_a = \frac{\langle a, v \rangle}{\|a\|^2} a$ is the orthogonal projection of $v$ onto the line spanned by $a$,
- $v_0 = v - v_a$ is the component of $v$ perpendicular to the axis $a$,
- $v_1 = \frac{1}{\|a\|}\, a \times v$ is the vector obtained from $v_0$ by a rotation of 90° around the axis $a$.

Overall, the formula shows that the exponential of the cross-product matrix rotates the vector $v$ around the axis defined by $a$, with the norm of $a$ as the angular velocity.
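Rodrigues' formula can be checked numerically against the matrix exponential; this sketch reuses the illustrative `cross_matrix` helper from above and assumes SciPy's `expm`:

```python
import numpy as np
from scipy.linalg import expm

def cross_matrix(a):
    return np.array([[ 0.0,  -a[2],  a[1]],
                     [ a[2],  0.0,  -a[0]],
                     [-a[1],  a[0],  0.0]])

a = np.array([1.0, -2.0, 0.5])
v = np.array([0.3,  1.0, 2.0])
t = 0.7
na = np.linalg.norm(a)

v_a = (a @ v) / na**2 * a                  # projection of v onto the axis a
v_0 = v - v_a                              # component of v perpendicular to a
v_1 = np.cross(a, v) / na                  # v_0 rotated by 90 degrees about a

rodrigues = v_a + v_0 * np.cos(na * t) + v_1 * np.sin(na * t)
print(rodrigues)
print(expm(t * cross_matrix(a)) @ v)       # same vector
```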


