A skew-symmetric matrix (also antisymmetric matrix) is a matrix that is equal to the negative of its transpose. Over a field of characteristic other than two, the skew-symmetric matrices are exactly the alternating matrices and are therefore often identified with them. In linear algebra, skew-symmetric matrices are used to characterize antisymmetric bilinear forms.
Definition
A matrix $A$ is called skew-symmetric if

$$A^T = -A$$

holds. In other words: the matrix $A$ is skew-symmetric if its entries satisfy

$$a_{ij} = -a_{ji} \qquad \forall\, i, j \in \{1, \ldots, n\}.$$
Example
The matrix

$$A = \begin{pmatrix} 0 & 7 & 23 \\ -7 & 0 & -4 \\ -23 & 4 & 0 \end{pmatrix}$$

is skew-symmetric, since $A^T = -A$.
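The defining condition is straightforward to verify numerically; a minimal sketch using NumPy with the example matrix from the text:

```python
import numpy as np

# The example matrix from the text.
A = np.array([[0, 7, 23],
              [-7, 0, -4],
              [-23, 4, 0]])

# A matrix is skew-symmetric iff its transpose equals its negative.
assert np.array_equal(A.T, -A)
```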
Properties
Real skew-symmetric matrices
If $A \in \mathbb{R}^{n \times n}$ is skew-symmetric with real entries, then all diagonal entries are necessarily equal to 0. Furthermore, every eigenvalue is purely imaginary or equal to 0.
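Both properties can be observed on a randomly generated real skew-symmetric matrix; a sketch using NumPy (the construction $M - M^T$ is one standard way to produce such a matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M - M.T                      # real skew-symmetric by construction

# Diagonal entries vanish...
assert np.allclose(np.diag(A), 0)

# ...and all eigenvalues are purely imaginary (or 0).
eigvals = np.linalg.eigvals(A)
assert np.allclose(eigvals.real, 0, atol=1e-10)
```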
Field of characteristic not equal to 2

Properties over a field $K$ of characteristic different from 2:

- The entries on the main diagonal are zero.
- The determinant of a skew-symmetric matrix of odd dimension $n$ is zero, since $A^T = -A$ implies

$$\det(A) = \det(A^T) = \det(-A) = (-1)^n \det(A) = -\det(A).$$

- In general, this does not hold for matrices of even dimension, as the counterexample

$$A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$$

shows. This matrix is obviously skew-symmetric, but $\det(A) = 1$. In the even-dimensional case, the determinant can in general be computed as the square of the Pfaffian.
- Over a field of characteristic other than two, the skew-symmetric matrices are precisely the alternating matrices. Over a field of characteristic two, however, there are skew-symmetric matrices that are not alternating.
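The odd/even distinction for the determinant can be demonstrated numerically; a sketch using NumPy with a random odd-dimensional matrix and the 2×2 counterexample from the text:

```python
import numpy as np

rng = np.random.default_rng(1)

# Odd dimension: the determinant of a real skew-symmetric matrix is zero
# (up to floating-point noise).
M = rng.standard_normal((5, 5))
A_odd = M - M.T
assert abs(np.linalg.det(A_odd)) < 1e-10

# Even dimension: the counterexample from the text has determinant 1,
# the square of its Pfaffian.
A_even = np.array([[0, 1], [-1, 0]])
assert np.isclose(np.linalg.det(A_even), 1.0)
```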
Vector space
The skew-symmetrical matrices form a vector space of dimension . If the body is, this vector space is called . The name comes from the fact that this vector space is the Lie algebra of the Lie group ( special orthogonal group ).
![K = \ mathbb {R}](https://wikimedia.org/api/rest_v1/media/math/render/svg/a6419d3aa99701ca996737b17a5e1174d53e6c9e)
![\ operatorname {SO} (n)](https://wikimedia.org/api/rest_v1/media/math/render/svg/79c49648527ab4aacf6c03c15633727606cc7d22)
The orthogonal projection from the space of the matrices into the space of the skew-symmetrical matrices is straight
with respect to the Frobenius scalar product
![{\ begin {matrix} \ operatorname {Pr}: & \ mathbb {R} ^ {{n \ times n}} & \ to & {\ mathfrak s} {\ mathfrak o} (n) \\ & A & \ mapsto & {\ frac 12} (AA ^ {T}) \ end {matrix}}](https://wikimedia.org/api/rest_v1/media/math/render/svg/c7a7cb014d7d7b7f463d3ea198514ea2ec03b9c6)
The orthogonal complement is the symmetric matrix
![A- \ operatorname {Pr} (A) = {\ frac 12} (A + A ^ {T}).](https://wikimedia.org/api/rest_v1/media/math/render/svg/9e3eb5fa2fb8bbfb5096e36ed53a0cae2f49198e)
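The decomposition into skew-symmetric and symmetric parts, and their orthogonality under the Frobenius inner product $\langle X, Y \rangle = \operatorname{tr}(X^T Y)$, can be checked directly; a sketch using NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

skew = 0.5 * (A - A.T)   # Pr(A), the skew-symmetric part
sym  = 0.5 * (A + A.T)   # A - Pr(A), the symmetric part

# The two parts recombine to A...
assert np.allclose(skew + sym, A)

# ...and are orthogonal in the Frobenius inner product tr(X^T Y).
assert np.isclose(np.trace(skew.T @ sym), 0)
```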
Bilinear forms
The bilinear form

$$B_A(x, y) = x^T A y$$

associated with a skew-symmetric matrix $A \in K^{n \times n}$ is antisymmetric, that is,

$$B_A(x, y) = -B_A(y, x)$$

for all $x, y \in K^n$. If the main diagonal entries of a skew-symmetric matrix $A$ are all zero (i.e. if the matrix is alternating), then the associated bilinear form is alternating, that is,

$$B_A(x, x) = 0$$

for all $x \in K^n$. Conversely, in a finite-dimensional vector space, the representation matrix

$$A_B = (B(b_i, b_j))$$

of an antisymmetric or alternating bilinear form $B$ is skew-symmetric with respect to any basis $\{b_1, \ldots, b_n\}$; in the alternating case, the main diagonal entries of $A_B$ are all zero.
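Antisymmetry and the alternating property of $B_A$ can be checked on the example matrix; a sketch using NumPy with random test vectors:

```python
import numpy as np

# The skew-symmetric (and alternating) example matrix from the text.
A = np.array([[0, 7, 23],
              [-7, 0, -4],
              [-23, 4, 0]])

def B(x, y):
    # The bilinear form B_A(x, y) = x^T A y.
    return x @ A @ y

rng = np.random.default_rng(3)
x, y = rng.standard_normal(3), rng.standard_normal(3)

assert np.isclose(B(x, y), -B(y, x))  # antisymmetric
assert np.isclose(B(x, x), 0)         # alternating (diagonal entries are zero)
```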
Exponential mapping
The mapping

$$\exp\colon \mathfrak{so}(n) \to \operatorname{SO}(n), \quad A \mapsto \sum_{k=0}^{\infty} \frac{1}{k!}\, A^k,$$

defined by the matrix exponential, is surjective and is precisely the exponential map at the identity matrix $I_n$ (see also special orthogonal group).
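That $\exp$ maps $\mathfrak{so}(n)$ into $\operatorname{SO}(n)$ can be observed numerically; a sketch using NumPy, where the truncated Taylor series is a stand-in for a library routine such as `scipy.linalg.expm`:

```python
import numpy as np

def expm_series(A, terms=60):
    # Truncated Taylor series sum_{k=0}^{terms-1} A^k / k!
    # -- adequate here since ||A|| is small.
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3))
A = 0.5 * (M - M.T)              # an element of so(3)
R = expm_series(A)

# R lies in SO(3): orthogonal with determinant +1.
assert np.allclose(R.T @ R, np.eye(3), atol=1e-10)
assert np.isclose(np.linalg.det(R), 1.0)
```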
Cross product
For the special case $n = 3$, skew-symmetric matrices can be used to express the cross product as a matrix multiplication. The cross product of two vectors $a, b \in \mathbb{R}^3$ can be expressed as the product of the skew-symmetric cross-product matrix

$$[a]_\times = \begin{pmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{pmatrix}$$

with the vector $b$:

$$a \times b = [a]_\times \, b.$$

In this way, a formula involving a cross product can be differentiated:

$$\frac{\partial}{\partial b}\,(a \times b) = [a]_\times.$$
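The identity $a \times b = [a]_\times b$ can be checked against NumPy's built-in cross product; a minimal sketch:

```python
import numpy as np

def cross_matrix(a):
    # The skew-symmetric cross-product matrix [a]_x from the text.
    return np.array([[0, -a[2], a[1]],
                     [a[2], 0, -a[0]],
                     [-a[1], a[0], 0]])

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# a x b equals [a]_x applied to b.
assert np.allclose(cross_matrix(a) @ b, np.cross(a, b))
```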
Using Rodrigues' formula, the exponential of the matrix $[a]_\times$ can be represented as

$$\begin{aligned} \exp(t [a]_\times)\, v &= \frac{\langle a, v \rangle}{\|a\|^2}\, a + \left(v - \frac{\langle a, v \rangle}{\|a\|^2}\, a\right) \cos(\|a\|\, t) + \left(\frac{1}{\|a\|}\, a \times v\right) \sin(\|a\|\, t) \\ &= v_a + v_0 \cdot \cos(\|a\|\, t) + v_1 \cdot \sin(\|a\|\, t). \end{aligned}$$

Here,

$$v_a := \frac{\langle a, v \rangle}{\|a\|^2}\, a$$

is the orthogonal projection of $v$ onto the line $L_a$ spanned by $a$,

$$v_0 := v - v_a$$

is the component of $v$ perpendicular to the axis $L_a$, and

$$v_1 := \frac{1}{\|a\|}\, a \times v$$

is the vector that arises from $v_0$ by rotation by 90° around the axis $L_a$.

Overall, the formula shows that the exponential of the cross-product matrix rotates the vector $v$ around the axis defined by $a$, with the norm of $a$ as the angular velocity.
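Rodrigues' formula can be compared against the matrix exponential computed directly; a sketch using NumPy, where the truncated-series helper is a stand-in for a library routine such as `scipy.linalg.expm`:

```python
import numpy as np

def cross_matrix(a):
    # The skew-symmetric cross-product matrix [a]_x.
    return np.array([[0, -a[2], a[1]],
                     [a[2], 0, -a[0]],
                     [-a[1], a[0], 0]])

def rodrigues(a, v, t):
    # exp(t [a]_x) v via Rodrigues' formula:
    # v_a + v_0 cos(||a|| t) + v_1 sin(||a|| t).
    norm_a = np.linalg.norm(a)
    v_a = (a @ v) / norm_a**2 * a      # projection of v onto the axis
    v_0 = v - v_a                      # component perpendicular to the axis
    v_1 = np.cross(a, v) / norm_a      # v_0 rotated by 90 degrees about the axis
    return v_a + v_0 * np.cos(norm_a * t) + v_1 * np.sin(norm_a * t)

def expm_series(A, terms=60):
    # Truncated Taylor series of the matrix exponential.
    result, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

a = np.array([1.0, 2.0, 2.0])   # rotation axis, ||a|| = 3
v = np.array([0.5, -1.0, 0.7])
t = 0.4

# Rodrigues' formula agrees with the matrix exponential applied to v.
assert np.allclose(rodrigues(a, v, t), expm_series(t * cross_matrix(a)) @ v)
```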