# Skew-symmetric matrix

A skew-symmetric matrix (also antisymmetric matrix) is a square matrix that is equal to the negative of its transpose. Over a field of characteristic different from 2, the skew-symmetric matrices are exactly the alternating matrices and are therefore often identified with them. Skew-symmetric matrices are used in linear algebra to characterize antisymmetric bilinear forms.

## Definition

A square matrix $A$ is called skew-symmetric if

$$A^T = -A$$

holds. In other words: the matrix $A$ is skew-symmetric if its entries satisfy

$$a_{ij} = -a_{ji} \qquad \forall\, i, j \in \{1, \ldots, n\}.$$

## Example

The matrix

$$A = \begin{pmatrix} 0 & 7 & 23 \\ -7 & 0 & -4 \\ -23 & 4 & 0 \end{pmatrix}$$

is skew-symmetric, since

$$A^T = \begin{pmatrix} 0 & -7 & -23 \\ 7 & 0 & 4 \\ 23 & -4 & 0 \end{pmatrix} = -A.$$
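As a quick numerical illustration (numpy; not part of the article itself), the defining condition $A^T = -A$ can be checked directly for the example matrix:

```python
import numpy as np

# The example matrix from above; verify the defining condition A^T = -A.
A = np.array([
    [  0,   7,  23],
    [ -7,   0,  -4],
    [-23,   4,   0],
])

print(np.array_equal(A.T, -A))  # True: A is skew-symmetric
```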

## Properties

### Real skew-symmetric matrices

If $A \in \mathbb{R}^{n \times n}$ is skew-symmetric with real entries, then all diagonal entries are necessarily equal to 0. Furthermore, every eigenvalue of $A$ is purely imaginary or equal to 0.
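Both facts can be observed numerically. The following sketch (numpy; the random matrix is illustrative, not from the article) builds a real skew-symmetric matrix and checks that its eigenvalues have vanishing real part and that its diagonal is zero:

```python
import numpy as np

# Build a real skew-symmetric matrix from a random matrix M via (M - M^T)/2.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M - M.T) / 2

eigvals = np.linalg.eigvals(A)
print(np.allclose(eigvals.real, 0))  # True: eigenvalues purely imaginary or 0
print(np.allclose(np.diag(A), 0))    # True: diagonal entries are zero
```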

### Fields of characteristic not equal to 2

Properties over a field $K$ of characteristic different from 2:

• The entries on the main diagonal are zero.
• The determinant of a skew-symmetric matrix of odd dimension $n$ is zero, because $A^T = -A$ implies
$$\det(A) = \det(A^T) = \det(-A) = (-1)^n \det(A) = -\det(A).$$
In general this does not hold for matrices of even dimension, as the counterexample
$$A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$$
shows: this matrix is obviously skew-symmetric, but $\det(A) = 1$. In the even-dimensional case the determinant can in general be computed as the square of the Pfaffian.
• Over a field of characteristic different from 2, the skew-symmetric matrices are precisely the alternating matrices. Over a field of characteristic 2, however, there are skew-symmetric matrices that are not alternating.
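The determinant statements above can be illustrated numerically (numpy sketch; the odd-dimensional random example is an assumption for illustration, the 2×2 matrix is the counterexample from the text):

```python
import numpy as np

# Odd dimension: the determinant of a skew-symmetric matrix vanishes.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A_odd = (M - M.T) / 2                       # skew-symmetric, dimension 5
print(abs(np.linalg.det(A_odd)) < 1e-10)    # True (zero up to rounding)

# Even dimension: the 2x2 counterexample has determinant 1.
A_even = np.array([[0.0, 1.0], [-1.0, 0.0]])
print(np.isclose(np.linalg.det(A_even), 1.0))  # True
```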

### Vector space

The skew-symmetric $n \times n$ matrices form a vector space of dimension $\tfrac{n(n-1)}{2}$. If the field is $K = \mathbb{R}$, this vector space is denoted $\mathfrak{so}(n)$. The name comes from the fact that this vector space is the Lie algebra of the Lie group $\operatorname{SO}(n)$ (the special orthogonal group).

With respect to the Frobenius inner product, the orthogonal projection from the space of all matrices onto the space of skew-symmetric matrices is precisely

$$\begin{matrix} \operatorname{Pr}\colon & \mathbb{R}^{n \times n} & \to & \mathfrak{so}(n) \\ & A & \mapsto & \frac{1}{2}(A - A^T), \end{matrix}$$

and the remainder

$$A - \operatorname{Pr}(A) = \frac{1}{2}(A + A^T)$$

is the symmetric part of $A$.
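This decomposition into a skew-symmetric and a symmetric part can be checked numerically (numpy sketch; the random matrix is illustrative):

```python
import numpy as np

# Decompose an arbitrary matrix into its skew-symmetric part Pr(A)
# and the symmetric remainder A - Pr(A).
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

skew = (A - A.T) / 2   # Pr(A), the skew-symmetric part
sym  = (A + A.T) / 2   # A - Pr(A), the symmetric part

print(np.allclose(skew.T, -skew))  # True: skew is skew-symmetric
print(np.allclose(sym.T, sym))     # True: sym is symmetric
print(np.allclose(skew + sym, A))  # True: the parts recompose A
# Orthogonality w.r.t. the Frobenius inner product <X, Y> = trace(X^T Y):
print(np.isclose(np.trace(skew.T @ sym), 0))  # True
```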

### Bilinear forms

The bilinear form $B_A(x, y) = x^T A y$ associated with a skew-symmetric matrix $A \in K^{n \times n}$ is antisymmetric, that is,

$$B_A(x, y) = -B_A(y, x)$$

for all $x, y \in K^n$. If the main diagonal entries of a skew-symmetric matrix $A$ are all zero (i.e. if the matrix is alternating), then the associated bilinear form $B_A$ is alternating, that is,

$$B_A(x, x) = 0$$

for all $x \in K^n$. Conversely, in a finite-dimensional vector space $V$, the representation matrix $A_B = (B(b_i, b_j))$ of an antisymmetric or alternating bilinear form $B \colon V \times V \to K$ is skew-symmetric with respect to any basis $\{b_1, \ldots, b_n\}$, that is,

$$(A_B)^T = -A_B,$$

where the main diagonal entries of $A_B$ are all zero.
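Both properties of $B_A$ can be verified numerically for the example matrix from above (numpy sketch; the random test vectors are an assumption for illustration):

```python
import numpy as np

# The example skew-symmetric matrix from earlier in the article.
A = np.array([
    [  0,   7,  23],
    [ -7,   0,  -4],
    [-23,   4,   0],
])

def B(x, y):
    """Bilinear form B_A(x, y) = x^T A y."""
    return x @ A @ y

rng = np.random.default_rng(3)
x, y = rng.standard_normal(3), rng.standard_normal(3)

print(np.isclose(B(x, y), -B(y, x)))  # True: B_A is antisymmetric
print(np.isclose(B(x, x), 0))         # True: B_A is alternating
```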

### Exponential mapping

The mapping defined by the matrix exponential

$$\begin{matrix} \exp\colon & \mathfrak{so}(n) & \to & \operatorname{SO}(n) \\ & A & \mapsto & \displaystyle\sum_{k=0}^{\infty} \frac{1}{k!} A^k \end{matrix}$$

is surjective and is precisely the exponential map of the Lie group at the identity matrix $I_n$ (see also special orthogonal group).
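That the exponential of a skew-symmetric matrix lands in $\operatorname{SO}(n)$ can be checked numerically. The sketch below (numpy only; the truncated power series is a simple illustration, not a production algorithm — libraries such as scipy provide `expm` for serious use) verifies orthogonality and determinant 1:

```python
import numpy as np

def expm_series(A, terms=30):
    """Matrix exponential via its truncated power series (illustrative;
    adequate for small matrices with moderate norm)."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3))
A = (M - M.T) / 2              # an element of so(3)
Q = expm_series(A)

print(np.allclose(Q.T @ Q, np.eye(3)))    # True: Q is orthogonal
print(np.isclose(np.linalg.det(Q), 1.0))  # True: det 1, so Q lies in SO(3)
```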

### Cross product

For the special case $n = 3$, skew-symmetric matrices can be used to express the cross product as a matrix multiplication. The cross product of two vectors $a \in \mathbb{R}^3$ and $b \in \mathbb{R}^3$ can be expressed as the product of the skew-symmetric cross-product matrix

$$[a]_\times = \begin{pmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{pmatrix}$$

with the vector $b$:

$$a \times b = [a]_\times \cdot b.$$

In this way, an expression containing a cross product can be differentiated:

$$\frac{\partial}{\partial b}(a \times b) = [a]_\times.$$
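The identity $a \times b = [a]_\times b$ translates directly into code (numpy sketch; the helper name `cross_matrix` is ours, not from the article):

```python
import numpy as np

def cross_matrix(a):
    """Skew-symmetric cross-product matrix [a]_x with [a]_x @ b = a x b."""
    return np.array([
        [    0, -a[2],  a[1]],
        [ a[2],     0, -a[0]],
        [-a[1],  a[0],     0],
    ])

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

print(np.allclose(cross_matrix(a) @ b, np.cross(a, b)))  # True
```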

Using Rodrigues' formula, the exponential of the matrix $[a]_\times$ can be represented as

$$\begin{aligned} \exp(t[a]_\times)\, v &= \frac{\langle a, v \rangle}{\|a\|^2}\, a + \left(v - \frac{\langle a, v \rangle}{\|a\|^2}\, a\right) \cos(\|a\|\, t) + \left(\frac{1}{\|a\|}\, a \times v\right) \sin(\|a\|\, t) \\ &= v_a + v_0 \cdot \cos(\|a\|\, t) + v_1 \cdot \sin(\|a\|\, t). \end{aligned}$$

Here,

• $v_a := \frac{\langle a, v \rangle}{\|a\|^2}\, a$ is the orthogonal projection of $v$ onto the line $L_a$ spanned by $a$,
• $v_0 := v - v_a$ is the component of $v$ perpendicular to the axis $L_a$,
• $v_1 := \frac{1}{\|a\|}\, a \times v_0$ is the vector obtained from $v_0$ by a rotation of 90° around the axis $L_a$.

Overall, the formula shows that the exponential of the cross-product matrix rotates the vector $v$ around the axis defined by $a$, with the norm of $a$ as the angular velocity.
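The geometric decomposition $v_a + v_0 \cos + v_1 \sin$ can be compared against the matrix exponential directly. The following sketch (numpy only; the helper names and the series-based exponential are ours, chosen for illustration) confirms that both sides agree:

```python
import numpy as np

def cross_matrix(a):
    """Skew-symmetric cross-product matrix [a]_x."""
    return np.array([
        [    0, -a[2],  a[1]],
        [ a[2],     0, -a[0]],
        [-a[1],  a[0],     0],
    ])

def rodrigues(a, v, t):
    """Rotate v around axis a by angle ||a||*t, following the
    decomposition v_a + v_0 cos + v_1 sin from the text."""
    norm_a = np.linalg.norm(a)
    v_a = (a @ v) / norm_a**2 * a    # projection of v onto the axis
    v_0 = v - v_a                     # component perpendicular to the axis
    v_1 = np.cross(a, v_0) / norm_a   # v_0 rotated by 90 degrees about the axis
    return v_a + v_0 * np.cos(norm_a * t) + v_1 * np.sin(norm_a * t)

def expm_series(A, terms=40):
    """Matrix exponential via truncated power series (illustrative only)."""
    result, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

a = np.array([1.0, 2.0, 2.0])
v = np.array([0.5, -1.0, 3.0])
t = 0.7

print(np.allclose(rodrigues(a, v, t),
                  expm_series(t * cross_matrix(a)) @ v))  # True
```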