Tensor product

The tensor product is a very versatile concept in mathematics: in linear algebra and differential geometry it is used to describe multilinear maps; in commutative algebra and algebraic geometry it corresponds on the one hand to the restriction of geometric structures to subsets and on the other hand to the Cartesian product of geometric objects.

In physics, elements of the tensor product

$$\underbrace{V \otimes \dotsb \otimes V}_{r\text{ factors}} \otimes \underbrace{V^{*} \otimes \dotsb \otimes V^{*}}_{s\text{ factors}}$$

(for a vector space $V$ with dual space $V^{*}$, often $V = \mathbb{R}^{3}$) are called tensors, contravariant of order $r$ and covariant of order $s$; in short, tensors of type $(r, s)$.

This article describes the mathematical, coordinate-free aspects of the tensor product. For individual tensors and coordinate representations, see Tensor.

Tensor product of vector spaces

Motivation

In quantum mechanics, the state space of an object is a Hilbert space. If one has $n$ particles with states $z_{1}, \dotsc, z_{n}$ in Hilbert spaces $H_{1}, \dotsc, H_{n}$, and one considers the states of the system $S$ formed from these particles, there are first of all the states that combine the information contained in the individual particle states $z_{i} \in H_{i}$; these are called product states $z_{1} \cdot \dotsb \cdot z_{n}$. Since quantum mechanics requires that every superposition of states of an object (here $S$) is again a possible state of that object, the mathematical model must contain, in addition to the products just mentioned, all of their linear combinations; together they form the Hilbert space of the system $S$. This new vector space is denoted by $H_{1} \otimes \dotsb \otimes H_{n}$ and called the tensor product. The needs of physics and, on the mathematical side, the desire to keep the construction as simple as possible lead to the definition given below. The scalar product of the Hilbert space is initially not taken into account as an additional structure.

Definition

If $V$ and $W$ are two vector spaces over a common scalar field $K$, their tensor product

$$V \otimes W$$

is a vector space that can be constructed as follows: if $E = \{e_{i} \mid i \in I\}$ is a basis of $V$ and $F = \{f_{j} \mid j \in J\}$ a basis of $W$, then $V \otimes W$ is a vector space, called the tensor product space, with a basis that can be identified in a one-to-one fashion with the ordered pairs of the Cartesian product

$$E \times F = \{(e_{i}, f_{j}) \mid i \in I, j \in J\}$$

of the bases of the original spaces. The dimension of $V \otimes W$ is therefore the product of the dimensions of $V$ and $W$. The element of this basis corresponding to the ordered pair $(e_{i}, f_{j})$ is written $e_{i} \otimes f_{j}$; for now, the symbol $\otimes$ has no deeper meaning. Every element of the tensor product $V \otimes W$ then has the form

$$\sum_{(i,j) \in I_{0} \times J_{0}} c_{ij}\,(e_{i} \otimes f_{j}),$$

where $I_{0} \subset I$ and $J_{0} \subset J$ are finite subsets of the index sets $I$ and $J$, and $c_{ij} \in K$ for each $i \in I_{0}$ and $j \in J_{0}$.

Using this basis, one can now define a product of vectors from $V$ and $W$, written with the same symbol $\otimes$. Naturally, the product of two basis vectors $e_{i} \in E \subset V$ and $f_{j} \in F \subset W$ is precisely the basis vector of $V \otimes W$ denoted $e_{i} \otimes f_{j}$. The product of arbitrary vectors is then obtained by bilinear continuation: to the vectors

$$v = \sum_{i \in I_{0}} a_{i} e_{i} \in V$$

and

$$w = \sum_{j \in J_{0}} b_{j} f_{j} \in W$$

with finite subsets $I_{0} \subset I$, $J_{0} \subset J$, one assigns the product

$$v \otimes w = \sum_{(i,j) \in I_{0} \times J_{0}} a_{i} b_{j}\,(e_{i} \otimes f_{j}).$$

Finite-dimensional case

For finite-dimensional vector spaces $V$ with basis $B = (e_{1}, \dotsc, e_{m})$ and $W$ with basis $C = (f_{1}, \dotsc, f_{n})$, the tensor product can be constructed directly as a space of matrices. The rows are numbered by the basis index set $I = \{1, \dotsc, m\}$ of $V$, the columns by the basis index set $J = \{1, \dotsc, n\}$ of $W$. The tensor product of two vectors $v \in V$, $w \in W$ is the matrix whose entry at position $(i, j)$ is the $i$-th coordinate of $v$ with respect to $B$ multiplied by the $j$-th coordinate of $w$ with respect to $C$. In the language of matrices, this construction is also called the dyadic product of the coordinate vectors.
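In coordinates, this dyadic product is just the outer product of the coordinate vectors. A minimal sketch with NumPy (the choice of library and the concrete numbers are ours, for illustration only):

```python
import numpy as np

# Coordinate vectors of v in V (m = 2) and w in W (n = 3)
v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0, 5.0])

# The tensor v ⊗ w as an m×n matrix: entry (i, j) is v[i] * w[j]
T = np.outer(v, w)

print(T.shape)  # (2, 3): dim(V ⊗ W) = dim V · dim W
print(T)
```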

Properties

For the tensor product of vectors, the following rules hold for all $v, v', v'' \in V$, $w, w', w'' \in W$ and $\lambda \in K$:

$$(v' + v'') \otimes w = v' \otimes w + v'' \otimes w \qquad (1)$$

$$v \otimes (w' + w'') = v \otimes w' + v \otimes w'' \qquad (2)$$

$$(\lambda v) \otimes w = \lambda \cdot (v \otimes w) = v \otimes (\lambda w) \qquad (3)$$

In other words: the map $\otimes \colon V \times W \to V \otimes W$, $(v, w) \mapsto v \otimes w$, is $K$-bilinear. These rules look like distributive and associative laws, which motivates the name tensor product.

A commutative law does not hold in general, because for $v \in V$, $w \in W$ the tensors

$$v \otimes w \in V \otimes W \quad \text{and} \quad w \otimes v \in W \otimes V$$

belong to the same vector space only if the spaces $V$ and $W$ are identical. Even in that case, however, the tensors $v \otimes w$ and $w \otimes v$ are in general different.
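Under the matrix identification from the finite-dimensional case above, rules (1) to (3) and the failure of commutativity can be checked numerically. A small sketch with NumPy, using illustrative values of our own choosing:

```python
import numpy as np

v, v2 = np.array([1.0, 2.0]), np.array([0.0, 1.0])
w, w2 = np.array([3.0, 4.0, 5.0]), np.array([1.0, 0.0, 2.0])
lam = 2.5

# Rules (1)-(3): the outer product is bilinear
assert np.allclose(np.outer(v + v2, w), np.outer(v, w) + np.outer(v2, w))
assert np.allclose(np.outer(v, w + w2), np.outer(v, w) + np.outer(v, w2))
assert np.allclose(np.outer(lam * v, w), lam * np.outer(v, w))
assert np.allclose(np.outer(lam * v, w), np.outer(v, lam * w))

# No commutativity: v ⊗ w is a 2×3 matrix, w ⊗ v a 3×2 matrix;
# even for V = W the two tensors differ in general.
assert np.outer(v, w).shape != np.outer(w, v).shape
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert not np.allclose(np.outer(a, b), np.outer(b, a))
```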

Tensor product of vector spaces with linear mappings

Given two vector spaces $V, W$, each equipped with a linear map $\lambda \colon V \to V'$, $\mu \colon W \to W'$ into a further vector space, there is a uniquely determined linear map $\kappa \colon V \otimes W \to V' \otimes W'$ that continues $v \otimes w \mapsto \lambda(v) \otimes \mu(w)$ $(v \in V,\, w \in W)$. This map is written with the same symbol, $\kappa = \lambda \otimes \mu$, and is called the tensor product of $\lambda$ and $\mu$. In the sense of category theory, the pair of maps

$$\otimes \colon (V, W) \mapsto V \otimes W, \qquad (\lambda, \mu) \mapsto \lambda \otimes \mu$$

forms a functor. It is common to denote the maps combined into a functor by the same symbol (here $\otimes$).

The construction uses bases: from bases $e_{i}$ of $V$ and $f_{j}$ of $W$, the elements $e_{i} \otimes f_{j}$ form (see above) a basis of $V \otimes W$. The requirement $\kappa(e_{i} \otimes f_{j}) = \lambda(e_{i}) \otimes \mu(f_{j})$ on the elements of this basis uniquely determines a linear map $\kappa \colon V \otimes W \to V' \otimes W'$. The relation $\kappa(v \otimes w) = \lambda(v) \otimes \mu(w)$ then holds not only for the pairs $(e_{i}, f_{j})$ but for all elements of $V \times W$: from the representations of $v$ and $w$ as $v = \sum_{i} a_{i} e_{i}$ and $w = \sum_{j} b_{j} f_{j}$ one obtains

$$\kappa(v \otimes w) = \sum_{i} \sum_{j} a_{i} b_{j}\, \kappa(e_{i} \otimes f_{j}) = \sum_{i} \sum_{j} a_{i} b_{j}\, \lambda(e_{i}) \otimes \mu(f_{j}) = \sum_{i} a_{i} \lambda(e_{i}) \otimes \sum_{j} b_{j} \mu(f_{j}) = \lambda(v) \otimes \mu(w).$$
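In matrix terms, $\lambda \otimes \mu$ is the Kronecker product of the matrices of $\lambda$ and $\mu$, and the computation above becomes the mixed-product property: the Kronecker product applied to an elementary tensor acts factor by factor. A sketch with NumPy, on random test data of our choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))   # matrix of λ: V → V'
B = rng.standard_normal((3, 3))   # matrix of μ: W → W'
v = rng.standard_normal(2)
w = rng.standard_normal(3)

# For 1-D arrays, np.kron(v, w) is the flattened elementary tensor v ⊗ w.
lhs = np.kron(A, B) @ np.kron(v, w)   # (λ ⊗ μ)(v ⊗ w)
rhs = np.kron(A @ v, B @ w)           # λ(v) ⊗ μ(w)
assert np.allclose(lhs, rhs)
```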

If one starts the construction with other bases $e_{i}^{\ast}$ of $V$ and $f_{j}^{\ast}$ of $W$, and accordingly defines a linear map $\kappa^{\ast}$ by $\kappa^{\ast}(e_{i}^{\ast} \otimes f_{j}^{\ast}) = \lambda(e_{i}^{\ast}) \otimes \mu(f_{j}^{\ast})$, then $\lambda(e_{i}^{\ast}) \otimes \mu(f_{j}^{\ast}) = \kappa(e_{i}^{\ast} \otimes f_{j}^{\ast})$ shows that the maps $\kappa$ and $\kappa^{\ast}$ agree on the elements of a basis of $V \otimes W$ and are therefore identical. The construction of $\lambda \otimes \mu$ is thus independent of the choice of bases.

Universal definition

So far, the question of what the vector space denoted $V \otimes W$ actually is in the general case has been avoided. The requirements stated so far for this vector space can be expressed concisely yet precisely in the form of a universal definition.

A $K$-vector space $Z$ for which there is a bilinear map $\phi \colon V \times W \to Z$ satisfying the following universal property is called a tensor product of the $K$-vector spaces $V$ and $W$:

Every bilinear map $B \colon V \times W \to X$ into a $K$-vector space $X$ factors uniquely and linearly through $\phi$. More precisely, this means that there is a uniquely determined linear map $\tilde{B} \colon Z \to X$ such that

$$B = \tilde{B} \circ \phi,$$

that is,

$$B(v, w) = \tilde{B}(\phi(v, w))$$

for every pair of vectors $(v, w)$.

If such a vector space exists, it is uniquely determined up to isomorphism, i.e. for every other bilinear map $\phi' \colon V \times W \to Z'$ with the universal property there is an isomorphism $k \colon Z \to Z'$ such that $\phi' = k \circ \phi$. One writes $Z = V \otimes W$ and $\phi(v, w) = v \otimes w$. The universal property can then be written as $B(v, w) = \tilde{B}(v \otimes w)$; one often dispenses with assigning $B$ and $\tilde{B}$ different names, since the domain can be read off from the argument.

To actually exhibit vector spaces satisfying this definition, there are two usual routes: in the finite-dimensional case, via the space of bilinear forms on the dual spaces, as indicated below; and in general, via the construction of a space that is easy to specify but too large, whose quotient by a suitable subspace acquires the properties of the tensor product. The latter construction is carried out in the article Tensor product of modules.

Natural homomorphisms

From the universal definition it follows that the vector space $B(V, W; X)$ of bilinear maps $V \times W \to X$ is canonically isomorphic to the vector space $L(V \otimes W, X)$ of linear maps $V \otimes W \to X$:

Let $B \colon V \times W \to X$ be a bilinear map. Then one can show that

$$V \otimes W \to X, \qquad v \otimes w \mapsto B(v, w)$$

defines a linear map.

Conversely, if

$$\lambda \colon V \otimes W \to X$$

is a linear map, then the map

$$V \times W \to X, \qquad (v, w) \mapsto \lambda(v \otimes w)$$

is bilinear.

Furthermore, there is a natural monomorphism $L(V, X) \otimes L(W, Y) \to L(V \otimes W, X \otimes Y)$ defined by $(f \otimes g)(v \otimes w) := f(v) \otimes g(w)$. It is an isomorphism if and only if $V$ or $W$ is finite-dimensional.

By currying one also obtains an isomorphism $B(V, W; X) \cong L(V, L(W, X))$.

For finite-dimensional vector spaces and $X = Y = K$, it therefore holds that

$$V^{*} \otimes W^{*} \cong (V \otimes W)^{*} \cong B(V, W; K) \cong L(V, W^{*}),$$

where, for example, $V^{*}$ denotes the dual space of $V$ and the isomorphism $K \otimes K \cong K$ is used. In general, $K \otimes V \to V$, defined by $c \otimes v \mapsto cv$, is an isomorphism of vector spaces.

If one replaces $W$ by its dual space and uses the natural identification $W \cong W^{**}$ with the double dual, one obtains an isomorphism $V^{*} \otimes W \to L(V, W)$, defined by $(f \otimes w)(v) := f(v)\,w$. In the case that both vector spaces are infinite-dimensional, this is only a natural monomorphism.

Tensor product and bilinear forms

From the universal definition it follows that $(V \otimes W)^{*} \cong B(V, W; K)$. In the case of finite-dimensional vector spaces, the tensor product of $V$ and $W$ can therefore also be defined as the dual space of the vector space of all bilinear maps $V \times W \to K$.

One reason not to work with the space of bilinear forms instead of the tensor product is the following: multilinear forms, for example maps

$$U \times V \times W \to K$$

for three $K$-vector spaces $U$, $V$, $W$ that are linear in each component, correspond to linear maps

$$U \otimes V \otimes W \to K,$$

but there is no similarly simple way of expressing spaces of multilinear forms through spaces of bilinear forms. Here

$$U \otimes V \otimes W$$

denotes the spaces

$$U \otimes (V \otimes W) \quad \text{and} \quad (U \otimes V) \otimes W,$$

which can be canonically identified by means of

$$u \otimes (v \otimes w) = (u \otimes v) \otimes w.$$

This identification corresponds to the fact that from a multilinear form

$$U \times V \times W \to K$$

one obtains, on the one hand, by fixing the argument from $U$, a bilinear form

$$V \times W \to K,$$

and on the other hand, by fixing the argument from $W$, a bilinear form

$$U \times V \to K.$$

Extension of scalars

If $V$ is a vector space over $K$ and $L$ is an extension field of $K$, one can form the tensor product

$$V_{L} := V \otimes_{K} L$$

by regarding $L$, too, as a $K$-vector space; this is indicated by the symbol $\otimes_{K}$. $V_{L}$ becomes a vector space over $L$ by setting

$$\lambda \cdot (v \otimes \mu) := v \otimes (\lambda\mu) \qquad \text{for } v \in V,\ \lambda, \mu \in L.$$

The dimension of $V_{L}$ as an $L$-vector space equals the dimension of $V$ as a $K$-vector space: if $B$ is a $K$-basis of $V$, then the set

$$\{b \otimes 1 \mid b \in B\}$$

is an $L$-basis of $V_{L}$.

Tensor product of representations

Let

$$\rho_{1} \colon G_{1} \to \text{GL}(V_{\rho_{1}}), \qquad \rho_{2} \colon G_{2} \to \text{GL}(V_{\rho_{2}})$$

be linear representations. We define the linear representation

$$\rho_{1} \otimes \rho_{2} \colon G_{1} \times G_{2} \to \text{GL}(V_{\rho_{1}} \otimes V_{\rho_{2}})$$

into the tensor product of $V_{\rho_{1}}$ and $V_{\rho_{2}}$ by

$$\rho_{1} \otimes \rho_{2}(s_{1}, s_{2}) = \rho_{1}(s_{1}) \otimes \rho_{2}(s_{2})$$

for $s_{1} \in G_{1}$, $s_{2} \in G_{2}$, where the tensor product of matrices is the Kronecker product. This representation is called the outer tensor product of the representations $\rho_{1}$ and $\rho_{2}$. Existence and uniqueness follow from the properties of the tensor product.

Let $\rho_{1} \colon G \to \text{GL}(V_{\rho_{1}})$ and $\rho_{2} \colon G \to \text{GL}(V_{\rho_{2}})$ be two linear representations of the same group. For $s \in G$, one can then define

$$\rho(s) \in \text{GL}(V_{\rho_{1}} \otimes V_{\rho_{2}})$$

by

$$\rho(s)(v_{1} \otimes v_{2}) = \rho_{1}(s)v_{1} \otimes \rho_{2}(s)v_{2}$$

for $v_{1} \in V_{\rho_{1}}$, $v_{2} \in V_{\rho_{2}}$. One writes $\rho(s) = \rho_{1}(s) \otimes \rho_{2}(s)$. The map $s \mapsto \rho(s)$ then defines a linear representation of $G$, which is also called the tensor product of the given representations.

One must, however, strictly distinguish these two cases. The first is a representation of the product of two groups on the tensor product of the corresponding representation spaces. The second is a representation of a single group $G$ on the tensor product of representation spaces of this same group. The second case can nevertheless be viewed as a special case of the first by restricting to the diagonal subgroup $G \subset G \times G$. The definitions can be iterated finitely many times.

Let $V$ and $W$ be representations of the group $G$. Then $\text{Hom}(V, W)$ is a representation as well, via the identification

$$\text{Hom}(V, W) = V^{*} \otimes W.$$

Let $B \in \text{Hom}(V, W)$, let $\rho$ be the representation on $\text{Hom}(V, W)$, $\rho_{V}$ the representation on $V$, and $\rho_{W}$ the representation on $W$. The above identification then yields the equation

$$\rho(s)(B)v = \rho_{W}(s) \circ B \circ \rho_{V}(s^{-1})(v)$$

for all $s \in G$, $v \in V$. (The inverse on the $V$-side comes from the dual representation on $V^{*}$; it is needed for $s \mapsto \rho(s)$ to be a homomorphism.)

The irreducible representations of $G_{1} \times G_{2}$ are, up to isomorphism, exactly the representations $\rho_{1} \otimes \rho_{2}$ for which $\rho_{1}$ and $\rho_{2}$ are irreducible representations of $G_{1}$ and $G_{2}$, respectively.

This result reduces the study of the representations of $G_{1} \times G_{2}$ to the study of the representations of $G_{1}$ and $G_{2}$.

The following example illustrates the difference between direct sum and tensor product. Let

$$\rho_{1} \colon \mathbb{Z}/2\mathbb{Z} \to \text{GL}_{2}(\mathbb{C})$$

be the linear representation given by

$$\rho_{1}({\overline{1}}) = \left(\begin{array}{cc} 0 & -i \\ i & 0 \end{array}\right).$$

Now let

$$\rho_{2} \colon \mathbb{Z}/3\mathbb{Z} \to \text{GL}_{3}(\mathbb{C})$$

be the linear representation given by

$$\rho_{2}({\overline{1}}) = \left(\begin{array}{ccc} 1 & 0 & e^{\frac{2\pi i}{3}} \\ 0 & e^{\frac{2\pi i}{3}} & 0 \\ 0 & 0 & e^{\frac{4\pi i}{3}} \end{array}\right).$$

Then the outer tensor product

$$\rho_{1} \otimes \rho_{2} \colon \mathbb{Z}/2\mathbb{Z} \times \mathbb{Z}/3\mathbb{Z} \to \text{GL}(\mathbb{C}^{2} \otimes \mathbb{C}^{3}) = \text{GL}_{6}(\mathbb{C})$$

is given by $\rho_{1}(k) \otimes \rho_{2}(l)$, where $k \in \mathbb{Z}/2\mathbb{Z}$, $l \in \mathbb{Z}/3\mathbb{Z}$. The linear map $\rho_{1}({\overline{1}}) \otimes \rho_{2}({\overline{1}})$ belonging to the generator $({\overline{1}}, {\overline{1}})$ is then given in the basis of $\mathbb{C}^{2} \otimes \mathbb{C}^{3} \cong \mathbb{C}^{6}$ by:

$$\rho_{1}({\overline{1}}) \otimes \rho_{2}({\overline{1}}) = \left(\begin{array}{cccccc} 0 & 0 & 0 & -i & 0 & -ie^{\frac{2\pi i}{3}} \\ 0 & 0 & 0 & 0 & -ie^{\frac{2\pi i}{3}} & 0 \\ 0 & 0 & 0 & 0 & 0 & -ie^{\frac{4\pi i}{3}} \\ i & 0 & ie^{\frac{2\pi i}{3}} & 0 & 0 & 0 \\ 0 & ie^{\frac{2\pi i}{3}} & 0 & 0 & 0 & 0 \\ 0 & 0 & ie^{\frac{4\pi i}{3}} & 0 & 0 & 0 \end{array}\right)$$
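This matrix can be checked numerically: the outer tensor product evaluated at the generator is the Kronecker product of the two generator matrices, and since $\rho_{1}({\overline{1}})$ has order 2 and $\rho_{2}({\overline{1}})$ has order 3, their Kronecker product has order 6. A sketch with NumPy:

```python
import numpy as np

w3 = np.exp(2j * np.pi / 3)            # primitive third root of unity

rho1 = np.array([[0, -1j],
                 [1j,  0]])
rho2 = np.array([[1, 0,     w3],
                 [0, w3,     0],
                 [0, 0,  w3**2]])

# The generators have orders 2 and 3, respectively
assert np.allclose(np.linalg.matrix_power(rho1, 2), np.eye(2))
assert np.allclose(np.linalg.matrix_power(rho2, 3), np.eye(3))

# The outer tensor product at the generator is the 6×6 Kronecker product,
# which therefore has order lcm(2, 3) = 6
K = np.kron(rho1, rho2)
assert K.shape == (6, 6)
assert np.allclose(np.linalg.matrix_power(K, 6), np.eye(6))
```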

A comparison with the direct sum shows the difference: $\rho_{1} \oplus \rho_{2}$ acts on $\mathbb{C}^{2} \oplus \mathbb{C}^{3}$, a space of dimension $2 + 3 = 5$, while the tensor product has degree $2 \cdot 3 = 6$; the two representations obtained are not even of the same degree.

Symmetric and alternating square

Let $\rho \colon G \to \text{GL}(V)$ be a linear representation of $G$ and let $(e_{k})$ be a basis of $V$. Define

$$\vartheta \colon V \otimes V \to V \otimes V$$

by setting

$$\vartheta(e_{k} \otimes e_{j}) = e_{j} \otimes e_{k}$$

and continuing linearly. Then

$$\forall u, v \in V \colon \quad \vartheta(v \otimes u) = u \otimes v$$

holds, and hence $\vartheta^{2} = 1$. As a result, $V \otimes V$ decomposes as

$$V \otimes V = \text{Sym}^{2}(V) \oplus \text{Alt}^{2}(V),$$

where

$$\text{Sym}^{2}(V) = \{z \in V \otimes V \mid \vartheta(z) = z\}$$

and

$$\text{Alt}^{2}(V) = \textstyle\bigwedge^{2}(V) = \{z \in V \otimes V \mid \vartheta(z) = -z\}.$$

These subspaces are $G$-invariant and thus define subrepresentations, called the symmetric and the alternating square, respectively. Analogous subrepresentations exist in $V^{\otimes m}$ as well, where they are called the exterior power $\textstyle\bigwedge^{m}(V)$ and the symmetric power $\text{Sym}^{m}(V)$; for $m > 2$, however, $V^{\otimes m}$ is in general no longer the direct sum of these two products.
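For $V = K^{n}$, one can identify $V \otimes V$ with $n \times n$ matrices; $\vartheta$ then becomes transposition, $\text{Sym}^{2}(V)$ the symmetric and $\text{Alt}^{2}(V)$ the antisymmetric matrices, and the decomposition is the familiar splitting of a matrix into its symmetric and antisymmetric parts. A sketch with NumPy ($n = 3$ and random data are our choices):

```python
import numpy as np

# Identify V ⊗ V (dim V = 3) with 3×3 matrices; ϑ is then transposition.
rng = np.random.default_rng(1)
z = rng.standard_normal((3, 3))   # an arbitrary tensor z in V ⊗ V

sym = (z + z.T) / 2               # component in Sym²(V):  ϑ(sym) = sym
alt = (z - z.T) / 2               # component in Alt²(V):  ϑ(alt) = -alt

assert np.allclose(sym.T, sym)
assert np.allclose(alt.T, -alt)
assert np.allclose(sym + alt, z)  # V ⊗ V = Sym²(V) ⊕ Alt²(V)
```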

Tensor product of modules

The notion of tensor product can be generalized from the tensor product of vector spaces over a field to the tensor product of modules over an (arbitrary, possibly non-commutative) ring with 1.

In order to obtain interesting objects in the non-commutative case as well, condition (3) has to be weakened slightly.

Structure of the elements

Elementary tensors

An elementary tensor (also called a pure or simple tensor) in the tensor product $M \otimes_{R} N$ is an element of the form $m \otimes n$ with $m \in M$, $n \in N$.

General form

Each element of the tensor product is a finite sum of elementary tensors. In general, not every tensor can be written as an elementary tensor.

For example, the tensor $e_{1} \otimes e_{2} - e_{2} \otimes e_{1}$ in the tensor product $\mathbb{R}^{2} \otimes_{\mathbb{R}} \mathbb{R}^{2}$, where the $e_{i}$ are the standard basis vectors, is not an elementary tensor, whereas $e_{1} \otimes e_{1} - e_{1} \otimes e_{2} - e_{2} \otimes e_{1} + e_{2} \otimes e_{2} = (e_{1} - e_{2}) \otimes (e_{1} - e_{2})$ definitely is.
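Under the identification of $\mathbb{R}^{2} \otimes \mathbb{R}^{2}$ with $2 \times 2$ matrices, elementary tensors correspond exactly to outer products, i.e. to matrices of rank at most 1, so both claims can be checked by computing ranks. A sketch with NumPy:

```python
import numpy as np

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

t1 = np.outer(e1, e2) - np.outer(e2, e1)     # e1 ⊗ e2 - e2 ⊗ e1
t2 = (np.outer(e1, e1) - np.outer(e1, e2)
      - np.outer(e2, e1) + np.outer(e2, e2))

# Rank 2: t1 is not an outer product, hence not an elementary tensor
assert np.linalg.matrix_rank(t1) == 2
# Rank 1: t2 is elementary, namely (e1 - e2) ⊗ (e1 - e2)
assert np.linalg.matrix_rank(t2) == 1
assert np.allclose(t2, np.outer(e1 - e2, e1 - e2))
```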

If $R$ is a commutative ring and $M$ an $R$-module generated by a single element, then every tensor in the tensor product $M \otimes_{R} N$ is an elementary tensor, for any $R$-module $N$.