# Graßmann algebra

The Graßmann algebra or exterior algebra of a vector space $V$ is an associative, skew-symmetrically graded algebra with unit element. Depending on the definition, it is a subalgebra or a quotient algebra of the antisymmetrized tensor algebra of $V$ and is denoted by $\Lambda V$. The multiplication is called the exterior product, wedge product, or roof product. A special case of this product is related to the cross product. This calculus is used not only in elementary linear algebra (for example in the theory of determinants), but above all in algebraic geometry and differential geometry as the algebra of differential forms. In this form, the theory of alternating differential forms goes back to Élie Cartan, who thereby unified the existing concepts from the theory of surfaces. Anticommutative products of vectors as well as abstract vector spaces in general were first considered by Hermann Graßmann in 1846.

## Definition

### Exterior power

Let $V$ be a vector space over a field $K$. Further, let

$T^{k}(V)=\underbrace{V\otimes\cdots\otimes V}_{k\text{ times}}$

(with the conventions $T^{0}(V)=K$ and $T^{1}(V)=V$). Let $J^{k}(V)\subseteq T^{k}(V)$ be the subspace generated by the elementary tensors in which two factors are equal:

$J^{k}(V):=\mathrm{span}\left\{v_{1}\otimes\cdots\otimes v_{k}\,\Big|\,\exists\, i,j\in\{1,\dots,k\},\ i\neq j\colon v_{i}=v_{j}\right\}$

The $k$-th exterior power is then defined as the quotient space

$\Lambda^{k}(V)=T^{k}(V)/J^{k}(V)$.

### Exterior algebra

The direct sum

$J(V)=\bigoplus_{k=0}^{\infty}J^{k}(V)$

is a two-sided, homogeneous ideal in the tensor algebra

$T(V)=\bigoplus_{k=0}^{\infty}T^{k}(V).$

The exterior algebra is the quotient algebra

$\Lambda(V):=T(V)/J(V).$

Regarded as a vector space, it is isomorphic to

$\bigoplus_{k=0}^{\infty}\Lambda^{k}(V)=\bigoplus_{k=0}^{\infty}T^{k}(V)/J^{k}(V).$

For $k>\dim V$ one has $\Lambda^{k}(V)=\{0\}$.

The product in the exterior algebra is traditionally written as $a\wedge b$.

Analogously, one can define the exterior algebra of modules over commutative rings.

### Alternating tensors

In addition to the definition of the exterior algebra given above, there are other equivalent ways to define it. For example, one can regard the elements of the exterior algebra as alternating tensors. In the following, let the characteristic of the field $K$ be 0.

The symmetric group $S_{k}$ acts on the homogeneous components $T^{k}(V)$. A tensor $t\in T^{k}(V)$ is called alternating if

$\sigma(t)=\operatorname{sgn}(\sigma)\cdot t$

holds for all permutations $\sigma\in S_{k}$ (here $\operatorname{sgn}(\sigma)$ is the sign of the permutation). Let $A^{k}(V)\subseteq T^{k}(V)$ denote the vector space of alternating tensors of degree $k$.

One can canonically assign an alternating tensor to every tensor by means of the antisymmetrization map (also called the "alternator") $\operatorname{Alt}_{k}\colon T^{k}(V)\rightarrow A^{k}(V)$. It is defined by

$e_{1}\otimes\dotsb\otimes e_{k}\mapsto\frac{1}{k!}\sum_{\sigma\in S_{k}}\operatorname{sgn}(\sigma)\,(e_{\sigma(1)}\otimes\dotsb\otimes e_{\sigma(k)}).$

It is a projection onto $A^{k}(V)$. The factor $1/k!$ ensures that the map restricts to the identity on $A^{k}(V)$, i.e. that it maps alternating tensors to themselves.
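For concreteness, the alternator can be sketched in a few lines of Python, with a degree-$k$ tensor stored as a dictionary mapping index tuples to coefficients (a representation chosen here purely for illustration, not a standard library API):

```python
from itertools import permutations
from math import factorial

def sign(perm):
    """Sign of a permutation given as a tuple of 0-based positions."""
    s = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s

def alternate(tensor, k):
    """Alt_k: antisymmetrize a degree-k tensor {index tuple: coefficient}."""
    out = {}
    for idx, c in tensor.items():
        for p in permutations(range(k)):
            pidx = tuple(idx[p[i]] for i in range(k))
            out[pidx] = out.get(pidx, 0) + sign(p) * c / factorial(k)
    return out
```

Applied to $e_{1}\otimes e_{2}$ this returns the alternating tensor $\tfrac{1}{2}(e_{1}\otimes e_{2}-e_{2}\otimes e_{1})$, and applying it a second time changes nothing, illustrating that $\operatorname{Alt}_{k}$ is a projection.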

With the product

$a\wedge b=\frac{(k+l)!}{k!\,l!}\,\operatorname{Alt}_{k+l}(a\otimes b)$

for $a\in A^{k}(V)$, $b\in A^{l}(V)$ and bilinear extension, the space of alternating tensors $\textstyle A(V)=\bigoplus_{k=0}^{\infty}A^{k}(V)$ becomes an associative, anticommutatively graded algebra. The canonical map $A(V)\to\Lambda(V)$ is an algebra isomorphism.

## Properties

This section deals with the essential properties of the exterior algebra, such as its grading, its universal property, and its product. Throughout, $V$ is assumed to be an $n$-dimensional vector space.

### Exterior product

The product $\wedge$ of the exterior algebra is associative. It is moreover graded-commutative, that is,

$a\wedge b=(-1)^{kl}\,b\wedge a$

for $a\in\Lambda^{k}(V)$ and $b\in\Lambda^{l}(V)$. In particular $v\wedge v=0$ for all $v\in V$, but in general $a\wedge a\neq 0$ for $a\in\Lambda^{k}(V)$ with $k$ even.

In the terminology of supergeometry, the equivalent term supercommutative is used instead of graded-commutative; using the supercommutator $[{\cdot},{\cdot}]$, the condition of supercommutativity can be expressed as

$[a,b]=0$

for $a\in\Lambda^{k}(V)$ and $b\in\Lambda^{l}(V)$.

If $f$ is a $p$-form and $g$ a $q$-form, then for arbitrary finite-dimensional vector spaces (and infinite-dimensional Banach spaces) the exterior product of $f$ and $g$ is given by the explicit formula

$(f\wedge g)(v_{1},\ldots,v_{p},v_{p+1},\ldots,v_{p+q})=\frac{1}{p!\,q!}\sum_{\sigma\in S_{p+q}}\operatorname{sgn}(\sigma)\,f(v_{\sigma(1)},\ldots,v_{\sigma(p)})\,g(v_{\sigma(p+1)},\ldots,v_{\sigma(p+q)}),$

where $S_{p+q}$ denotes the symmetric group on $p+q$ elements and $\operatorname{sgn}(\sigma)$ the sign of the permutation $\sigma$.
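This formula can be checked directly in Python, treating a form as a function of its vector arguments (the helper names below are illustrative, not a library API):

```python
from itertools import permutations
from math import factorial

def sign(perm):
    """Sign of a permutation of 0..len-1, counted via inversions."""
    s = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s

def wedge(f, p, g, q):
    """Exterior product of a p-form f and a q-form g via the explicit formula."""
    def fg(*v):
        total = 0.0
        for perm in permutations(range(p + q)):
            w = [v[i] for i in perm]
            total += sign(perm) * f(*w[:p]) * g(*w[p:])
        return total / (factorial(p) * factorial(q))
    return fg

# dx ∧ dy evaluated on vectors in R^2
dx = lambda v: v[0]
dy = lambda v: v[1]
area = wedge(dx, 1, dy, 1)
```

Here `area((1, 0), (0, 1))` evaluates to `1.0`, and swapping the two arguments flips the sign, as expected of a 2-form.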

The exterior algebra

$\Lambda(V)=\bigoplus_{m=0}^{n}\Lambda^{m}(V)$

is a graded algebra: it can be represented as a direct sum of subspaces such that the product of elements of degrees $k$ and $l$ has degree $k+l$. For the exterior algebra this follows directly from the definition; the exterior powers $\Lambda^{m}(V)$ are the corresponding homogeneous components.

Now let $e_{1},\dotsc,e_{n}$ be a basis of the $n$-dimensional vector space $V$. Then

$\{\,e_{i_{1}}\wedge\dotsb\wedge e_{i_{k}}\mid 1\leq i_{1}<\dotsb<i_{k}\leq n\,\}$

is a basis of $\Lambda^{k}(V)$. The dimension is $\dim(\Lambda^{k}(V))=\tbinom{n}{k}$. In particular $\dim(\Lambda^{k}(V))=0$ if $k>n$.

A basis of the exterior algebra is then obtained by combining the bases of all degrees. For the dimension this gives

$\dim(\Lambda(V))=\sum_{i=0}^{n}\binom{n}{i}=2^{n},$

where $\binom{n}{i}$ denotes the binomial coefficient. It follows that every element of the Graßmann algebra can be represented as

$\sum_{I\subseteq\{1,\dotsc,n\}}f_{I}\,e_{I},$

where the $2^{n}$ coefficients $f_{I}$ characterize the element with respect to the basis $e_{1},\dotsc,e_{n}$, and $e_{I}:=e_{m_{1}}\wedge\dotsb\wedge e_{m_{k}}$ with $I=\{m_{1},\dotsc,m_{k}\}$ and $m_{1}<\dotsb<m_{k}$.

As an example one can take the vector space $\mathbb{R}^{4}$ with the canonical basis. The degree-3 component of the exterior algebra $\Lambda(\mathbb{R}^{4})$ is spanned by

$\Lambda^{3}(\mathbb{R}^{4})=\operatorname{span}\{\,e_{1}\wedge e_{2}\wedge e_{3},\;e_{1}\wedge e_{2}\wedge e_{4},\;e_{1}\wedge e_{3}\wedge e_{4},\;e_{2}\wedge e_{3}\wedge e_{4}\,\}$

By counting one sees that $\dim(\Lambda^{3}(\mathbb{R}^{4}))=4$.
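The basis and dimension counts above are easy to reproduce programmatically, encoding the basis element $e_{i_{1}}\wedge\dotsb\wedge e_{i_{k}}$ as a strictly increasing index tuple (a minimal sketch, not a library API):

```python
from itertools import combinations
from math import comb

def exterior_basis(n, k):
    """Basis of the k-th exterior power of an n-dimensional space,
    encoded as strictly increasing index tuples (i_1, ..., i_k)."""
    return list(combinations(range(1, n + 1), k))

# Degree-3 component of Lambda(R^4), as in the example above
basis3 = exterior_basis(4, 3)
total_dim = sum(len(exterior_basis(4, k)) for k in range(5))
```

`basis3` lists exactly the four triples `(1,2,3), (1,2,4), (1,3,4), (2,3,4)`, matching $\binom{4}{3}=4$, and `total_dim` comes out as $2^{4}=16$.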

### Universal property

If $V$ is a vector space (or module) and $A$ an associative algebra, then there is a bijection between

• the homomorphisms of vector spaces (or modules) $f\colon V\to A$ such that $f(v)^{2}=0$ for all $v\in V$

and

• the algebra homomorphisms $\Lambda(V)\to A$.

## Scalar product

If the vector space $V$ has a scalar product, the exterior algebra can also be equipped with one. Subspaces of different degrees are declared orthogonal. Within one subspace it suffices to define the scalar product on pure products. Let $a_{1}\wedge\dots\wedge a_{m}$ and $b_{1}\wedge\dots\wedge b_{m}$ be pure products in $\Lambda^{m}V$. To them one can assign the Gram matrix of the scalar products $\langle a_{i},b_{j}\rangle$. The scalar product is then defined as the determinant of the Gram matrix:

$\langle a_{1}\wedge\dots\wedge a_{m},\,b_{1}\wedge\dots\wedge b_{m}\rangle:=\det\begin{pmatrix}\langle a_{1},b_{1}\rangle&\dots&\langle a_{1},b_{m}\rangle\\\vdots&&\vdots\\\langle a_{m},b_{1}\rangle&\dots&\langle a_{m},b_{m}\rangle\end{pmatrix}$
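A short numerical sketch of this definition, using NumPy (the function name is illustrative):

```python
import numpy as np

def gram_inner(avecs, bvecs):
    """<a_1∧...∧a_m, b_1∧...∧b_m> as the determinant of the Gram matrix."""
    G = np.array([[np.dot(a, b) for b in bvecs] for a in avecs])
    return np.linalg.det(G)

e1, e2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
```

For an orthonormal pair, $\langle e_{1}\wedge e_{2},e_{1}\wedge e_{2}\rangle=1$, while swapping the factors in one argument flips the sign, consistent with $e_{2}\wedge e_{1}=-e_{1}\wedge e_{2}$.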

If $V$ is the $n$-dimensional column vector space, then to $a_{1}\wedge\dots\wedge a_{m}$ one can assign the matrix $A=(a_{1},\dots,a_{m})$. From it one considers the maximal square submatrices $A_{\alpha}$. Here $\alpha$ is a multi-index from

$I_{m}:=\{\alpha\in\mathbb{N}^{m}:\;1\leq\alpha(1)<\dots<\alpha(m)\leq n\}$

and $A_{\alpha}$ consists of exactly the rows $\alpha(1),\dots,\alpha(m)$ of $A$.

By the Cauchy-Binet theorem the following identity holds; in the case $m=2$ and $A=B$ it is also called the "Pythagorean theorem for areas":

$\det\left((\langle a_{i},b_{k}\rangle)\right)=\det(A^{t}B)=\sum_{\alpha\in I_{m}}\det A_{\alpha}\cdot\det B_{\alpha}$
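The identity is easy to verify numerically; here a random $4\times 2$ example (names are illustrative):

```python
import numpy as np
from itertools import combinations

def minor_sum(A, B, m):
    """Right-hand side of Cauchy-Binet: sum over all m-row selections alpha
    of det(A_alpha) * det(B_alpha)."""
    n = A.shape[0]
    return sum(np.linalg.det(A[list(a), :]) * np.linalg.det(B[list(a), :])
               for a in combinations(range(n), m))

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))
B = rng.standard_normal((4, 2))
lhs = np.linalg.det(A.T @ B)   # Gram-type determinant
rhs = minor_sum(A, B, 2)       # sum of products of maximal minors
```

With $A=B$ this is the "Pythagorean theorem for areas": the squared area of the parallelogram spanned by the columns equals the sum of the squares of its coordinate-plane projections.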

## Differential forms

The main field of application of the exterior algebra is differential geometry. Let $M$ be an $n$-dimensional differentiable manifold. One chooses the cotangent space of this manifold as the underlying vector space and forms the exterior algebra. A differential form is a section of the bundle of these algebras, that is, a mapping that assigns to each point of the manifold an element of the exterior algebra over the cotangent space at that point. These forms have the great advantage that they allow integration over a manifold independently of the choice of chart.

## Hodge operator

Let (as above) $V$ be an $n$-dimensional vector space and $\Lambda V$ the exterior algebra of $V$. Furthermore, let $V$ be oriented and equipped with a scalar product. The Hodge operator or Hodge star operator is a natural isomorphism $*\colon\Lambda^{k}V\rightarrow\Lambda^{n-k}V$. It assigns to each $\omega\in\Lambda^{k}V$ in a unique way the so-called "dual element" $*\omega\in\Lambda^{n-k}V$. If $(e_{1},\dots,e_{n})$ is an oriented basis of $V$, then $*\omega$ is uniquely determined by the formula

$\forall\eta\in\Lambda^{k}V:\;\eta\wedge*\omega=\langle\eta,\omega\rangle\cdot e_{1}\wedge\dots\wedge e_{n}.$

If, for example, $(e_{1},\dots,e_{n})$ is in addition an orthonormal basis, then

$*(e_{1}\wedge\dots\wedge e_{k})=e_{k+1}\wedge\dots\wedge e_{n}$

for $k=0,\dots,n$ (where the empty product, for $k=0$ or $k=n$, is to be read as 1). The Hodge operator can thus be understood as an algebraic generalization of the geometric concept of the orthogonal complement of subspaces of $V$.
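For an orthonormal, positively oriented basis, the Hodge star of a basis $k$-vector is just the complementary basis $(n-k)$-vector, times the sign of the permutation that sorts the combined index list; a minimal sketch:

```python
def hodge_star(indices, n):
    """Hodge star of e_{i1}∧...∧e_{ik} (orthonormal basis, standard orientation).
    Returns (sign, complementary index tuple)."""
    comp = tuple(i for i in range(1, n + 1) if i not in indices)
    perm = list(indices) + list(comp)
    s = 1  # sign of the permutation taking (1, ..., n) to perm
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s, comp
```

In $\mathbb{R}^{3}$ this gives $*(e_{1}\wedge e_{2})=e_{3}$ but $*(e_{1}\wedge e_{3})=-e_{2}$, showing how the sign keeps the defining formula consistent with the chosen orientation.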

### Relationship to the cross product and the triple product (Hodge duality of vectors), and concepts from physics

Let $\mathbf{e}_{1},\mathbf{e}_{2},\mathbf{e}_{3}$ be the canonical basis of $\mathbb{R}^{3}$ and let $\alpha=a_{1}\mathbf{e}_{1}+a_{2}\mathbf{e}_{2}+a_{3}\mathbf{e}_{3}$ and $\beta=b_{1}\mathbf{e}_{1}+b_{2}\mathbf{e}_{2}+b_{3}\mathbf{e}_{3}$ be two elements of $\Lambda^{1}(\mathbb{R}^{3})$, the first exterior power of this real vector space. Here $*$ denotes the Hodge operator with respect to the standard (Euclidean) scalar product and the standard orientation. For the exterior product of $\alpha$ and $\beta$, the distributive law yields

${\begin{array}{rl}*(\alpha\wedge\beta)=&*((a_{1}\mathbf{e}_{1}+a_{2}\mathbf{e}_{2}+a_{3}\mathbf{e}_{3})\wedge(b_{1}\mathbf{e}_{1}+b_{2}\mathbf{e}_{2}+b_{3}\mathbf{e}_{3}))\\[0.5em]=&*((a_{2}\mathbf{e}_{2}\wedge b_{1}\mathbf{e}_{1})+(a_{3}\mathbf{e}_{3}\wedge b_{1}\mathbf{e}_{1})+(a_{1}\mathbf{e}_{1}\wedge b_{2}\mathbf{e}_{2})\\&+(a_{3}\mathbf{e}_{3}\wedge b_{2}\mathbf{e}_{2})+(a_{1}\mathbf{e}_{1}\wedge b_{3}\mathbf{e}_{3})+(a_{2}\mathbf{e}_{2}\wedge b_{3}\mathbf{e}_{3}))\\[0.5em]=&*((a_{1}b_{2}-a_{2}b_{1})(\mathbf{e}_{1}\wedge\mathbf{e}_{2})+(a_{2}b_{3}-a_{3}b_{2})(\mathbf{e}_{2}\wedge\mathbf{e}_{3})+(a_{3}b_{1}-a_{1}b_{3})(\mathbf{e}_{3}\wedge\mathbf{e}_{1}))\,.\end{array}}$

In three-dimensional space the Hodge operator assigns to the product $\mathbf{e}_{1}\wedge\mathbf{e}_{2}$ of basis vectors the vector $\mathbf{e}_{3}$; the other basis vectors are obtained by cyclically permuting the indices. Thus $*(\alpha\wedge\beta)$ yields the cross product in three-dimensional real space, and one can understand the exterior algebra as a generalization of the cross product. With the help of this generalization, the differential operator curl ($\operatorname{rot}$), known from vector analysis, can also be generalized to the $n$-dimensional case.
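The last line of the computation above is exactly the component formula of the cross product, which can be confirmed against NumPy:

```python
import numpy as np

def star_wedge(a, b):
    """Components of *(alpha ∧ beta) in R^3, read off the computation above."""
    a1, a2, a3 = a
    b1, b2, b3 = b
    # coefficients of e1, e2, e3, i.e. *(e2∧e3), *(e3∧e1), *(e1∧e2)
    return np.array([a2 * b3 - a3 * b2, a3 * b1 - a1 * b3, a1 * b2 - a2 * b1])
```

For any two vectors this agrees with `np.cross`, and $\alpha\wedge\alpha=0$ corresponds to the vanishing cross product of parallel vectors.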

The triple product of three vectors $a,b,c$ in $\mathbb{R}^{3}$ can be understood as the element $a\wedge b\wedge c$ of the third exterior power. Note that the Hodge star operator is defined only with respect to a scalar product and an orientation. The exterior product, on the other hand, can be defined independently of such choices.

Quantities from classical physics that are called pseudovectors, such as the magnetic field strength or the angular momentum, can be understood as elements of $\Lambda^{2}(\mathbb{R}^{3})$. A pseudoscalar is, in many cases, a quantity that can be understood as an element of $\Lambda^{3}(\mathbb{R}^{3})$.

### Relation to determinant theory; expansion of m-vectors

The concept of duality associated with the Hodge operator is even simpler for scalars: these are dual to the determinant of an $n\times n$ matrix. In detail:

The same assumptions apply as in the previous section; only now $m\geq 3$ is admitted, and $n\geq m$. If now, with $1\leq i_{\nu}\leq n$, an $m$-leg $\gamma:=a^{(1)}\wedge\dotsb\wedge a^{(m)}$ is given and expanded into a sum of elementary $m$-legs $e_{i_{1}}\wedge\dotsb\wedge e_{i_{m}}$, then, as above, the coefficients are the $\binom{n}{m}$ antisymmetrized products $(a_{i_{1}}^{(1)}a_{i_{2}}^{(2)}\ldots a_{i_{m}}^{(m)})_{\,asy}$, up to an alternating sign that depends on the respective orientation ("right-handedness" versus "left-handedness"). Such a coefficient is the $m$-dimensional "volume" of the projection of the $m$-leg onto the hypersurface dual to the respective "basis direction", in $\mathbb{R}^{n}$ or $\mathbb{C}^{n}$; at the same time it represents a maximal subdeterminant of the matrix with $m$ columns and $n$ rows formed from the vectors $a^{(\nu)}$. Because of the multilinearity and multi-associativity of the given expression, one obtains in this way the well-known determinant expansion theorems. In particular, the volume measure generated in this way (= base area times height) of the respective parallelepiped is invariant under displacements parallel to the base area, because determinants of linearly dependent vectors vanish.

## Relationship to Clifford Algebra

Let $q\colon V\times V\to K$ be a symmetric bilinear form on $V$.

Now let the binary, bilinear operation

$\circ\colon\Lambda(V)\times\Lambda(V)\to\Lambda(V)$

be defined by

${\begin{aligned}&(v_{1}\wedge\cdots\wedge v_{i})\circ(w_{1}\wedge\cdots\wedge w_{j})\\=\;&v_{1}\wedge\cdots\wedge v_{i}\wedge w_{1}\wedge\cdots\wedge w_{j}\\+\;&\sum_{k=1}^{\min\{i,j\}}\;\sum_{\substack{1\leq m_{1}<\cdots<m_{k}\leq i\\1\leq n_{1}<\cdots<n_{k}\leq j}}\pm\,q(v_{m_{1}},w_{n_{1}})\cdots q(v_{m_{k}},w_{n_{k}})\;v_{1}\wedge\cdots\wedge{\hat{v}}_{m_{1}}\wedge\cdots\wedge v_{i}\wedge w_{1}\wedge\cdots\wedge{\hat{w}}_{n_{1}}\wedge\cdots\wedge w_{j}\end{aligned}}$

for $v_{m},w_{n}\in V$. The hats over factors mean that these are omitted from the product; the sign of each term is fixed by the number of transpositions needed to bring the contracted factors together. By introducing this new operation as the multiplication, one obtains the Clifford algebra $\mathrm{Cl}(V,q)$. In particular, for the zero bilinear form one recovers the Graßmann algebra, $\mathrm{Cl}(V,0)=\Lambda(V)$: the additional sum in the above equation vanishes, and therefore $\circ=\wedge$.

For single vectors $v,w\in V$, the above definition reduces to the elementary relation

$v\circ w:=v\wedge w+v\cdot w$,

according to which the "geometric" product $\circ$ of two vectors decomposes into an antisymmetric wedge-product component and a symmetric scalar-product component $v\cdot w:=-q(v,w)$, the sign being a matter of convention. The sum here is formed in the Graßmann algebra.
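This relation can be exercised on basis blades. The sketch below multiplies two basis blades of $\mathrm{Cl}(V,q)$ for a diagonal form $q$, with indices sorted in ascending order; the representation and names are illustrative, and the convention $e_{i}\circ e_{i}=q(e_{i},e_{i})$ is assumed here (signs vary between texts, as noted above):

```python
def blade_mul(A, B, metric):
    """Geometric product of basis blades A, B (sorted index tuples) in Cl(V, q)
    with diagonal metric q(e_i, e_i) = metric[i].  Returns (coefficient, blade)."""
    coef, result = 1, list(A)
    for b in B:
        i = len(result)
        while i > 0 and result[i - 1] > b:  # slide b leftwards past larger indices
            i -= 1
            coef = -coef                    # each transposition flips the sign
        if i > 0 and result[i - 1] == b:    # repeated index: contract via the metric
            coef *= metric[b]
            del result[i - 1]
        else:
            result.insert(i, b)
    return coef, tuple(result)
```

With the Euclidean metric, `blade_mul((1,), (1,), {1: 1})` yields the scalar $1$ (the metric part of $v\circ w$), while with the zero form every contraction term vanishes and only the wedge part survives, illustrating $\mathrm{Cl}(V,0)=\Lambda(V)$.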

2. $p=m$ and $p=n-m$ thus yield dual $p$-legs.
4. These are so-called "shears", e.g. transformations of the form $a_{n}\to a_{n}+\lambda a_{i}$ with $i\leq n-1$.
5. More precisely, for the measure of extension $V(\gamma)$ of an $m$-leg $\gamma$ the following holds: $V(\gamma)=\sqrt{\sum_{i_{1}<\ldots<i_{m}}|\gamma_{i_{1},\dots,i_{m}}|^{2}}$. This is again a "generalized Pythagorean theorem".