In mathematics, the symmetric algebra is used to define polynomials over an arbitrary vector space. It plays an important role, for example, in the theory of Lie groups and in the theory of characteristic classes.
Formal definition
Let $V$ be a vector space over a field $K$. Further, let

$T^{k}(V) = \underbrace{V \otimes \cdots \otimes V}_{k\text{-times}}$
be the $k$-fold tensor product of $V$, with the conventions $T^{0}(V) = K$ and $T^{1}(V) = V$. The direct sum

$T(V) = \bigoplus_{k=0}^{\infty} T^{k}(V)$
is the tensor algebra of $V$.
The two-sided, homogeneous ideal $I(V) \subseteq T(V)$ is generated by the differences of elementary tensors with "reversed order":

$I(V) := \mathrm{span}\left\{ v \otimes w - w \otimes v \;\middle|\; v, w \in V \right\}$.
The symmetric algebra is then defined as the quotient

$S(V) = T(V) / I(V)$.
The $k$-th symmetric power of $V$ is defined as the image of $T^{k}(V)$ in $S(V)$; it is denoted by $S^{k}(V)$. One has the decomposition

$S(V) = \bigoplus_{k=0}^{\infty} S^{k}(V)$.
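For a finite-dimensional $V$ with basis $x_{1}, \ldots, x_{n}$, a basis of $S^{k}(V)$ is given by the degree-$k$ monomials in the basis vectors, so $\dim S^{k}(V) = \binom{n+k-1}{k}$. A small Python sketch (the function name is illustrative) counts these monomials and checks the binomial formula:

```python
from itertools import combinations_with_replacement
from math import comb

def sym_power_dim(n: int, k: int) -> int:
    """Dimension of S^k(V) for dim V = n: the number of degree-k monomials
    in n basis vectors, i.e. multisets of size k drawn from n symbols."""
    return sum(1 for _ in combinations_with_replacement(range(n), k))

# The count agrees with the binomial coefficient C(n+k-1, k).
for n in range(1, 5):
    for k in range(0, 5):
        assert sym_power_dim(n, k) == comb(n + k - 1, k)

print(sym_power_dim(3, 2))  # basis x^2, xy, xz, y^2, yz, z^2 -> 6
```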
The product in the symmetric algebra is traditionally written without a product sign, as $ab$.
Analogously, one can define the symmetric algebra of modules over commutative rings.
Examples
For $V = K$, the algebra $S(V)$ is isomorphic to the polynomial ring $K[X]$.
In general, the elements of $S(V)$ can be interpreted as polynomials over $K$ in the elements of a fixed basis of $V$.
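This identification can be illustrated with a short sketch, assuming sympy is available: commutative sympy symbols play the role of the images of basis vectors in $S(V)$, and multiplication in $S(V)$ is the ordinary polynomial product:

```python
# Sketch (assuming sympy): for a basis {x, y} of a 2-dimensional V,
# elements of S(V) are ordinary commutative polynomials in x and y.
from sympy import symbols, expand

x, y = symbols('x y')     # images of the basis vectors of V in S(V)
p = (x + y) * (x - y)     # product in S(V) = K[x, y]
print(expand(p))          # x**2 - y**2
assert x * y == y * x     # v w = w v holds in S(V)
```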
In particular, for $V := \mathfrak{gl}(n, K) = \mathrm{Mat}(n, K)$, the vector space of $n \times n$ matrices over $K$, the elements of $S(V)$ can be interpreted as polynomials in the entries of the matrices:

$S(\mathfrak{gl}(n, K)) \simeq K\left[x_{11}, \ldots, x_{nn}\right]$.
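As an illustrative sketch (again assuming sympy), the trace and the determinant of a generic matrix are such polynomials in the entries:

```python
# Trace and determinant of a generic 2x2 matrix, viewed as elements of
# S(gl(2, K)) ~= K[x11, x12, x21, x22].
from sympy import Matrix, symbols, expand

x11, x12, x21, x22 = symbols('x11 x12 x21 x22')
A = Matrix([[x11, x12], [x21, x22]])

print(A.trace())         # x11 + x22          -- a degree-1 element
print(expand(A.det()))   # x11*x22 - x12*x21  -- a degree-2 element
```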
Polynomials over vector spaces
Polynomials of degree $k$ over a $\mathbb{K}$-vector space $V$ are, by definition, the elements of $S^{k}(V^{*})$, where $V^{*}$ denotes the dual space. These polynomials correspond to linear maps

$P \colon \underbrace{V \otimes \cdots \otimes V}_{k\text{-times}} \rightarrow \mathbb{K}$
which are invariant under the action of the symmetric group $S_{k}$. (Note that such a polynomial is already uniquely determined by its values $P(x, x, \ldots, x)$ for all $x \in V$.)
The product

$S^{k}(V^{*}) \otimes S^{l}(V^{*}) \rightarrow S^{k+l}(V^{*})$
is defined by

$(PQ)(v_{1}, \ldots, v_{k+l}) = \frac{1}{(k+l)!} \sum_{\sigma \in S_{k+l}} P(v_{\sigma(1)}, \ldots, v_{\sigma(k)}) \, Q(v_{\sigma(k+1)}, \ldots, v_{\sigma(k+l)})$.
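A minimal Python sketch of this symmetrization (function names are illustrative) represents $P$ and $Q$ as multilinear functions and averages over all permutations; the result is visibly independent of the order of the factors:

```python
from itertools import permutations
from math import factorial

def sym_product(P, k, Q, l):
    """Symmetrized product of a k-linear form P and an l-linear form Q:
    (PQ)(v_1,...,v_{k+l}) = 1/(k+l)! * sum over sigma in S_{k+l} of
    P(v_{sigma(1)},...,v_{sigma(k)}) * Q(v_{sigma(k+1)},...,v_{sigma(k+l)})."""
    n = k + l
    def PQ(*vs):
        assert len(vs) == n
        total = 0.0
        for sigma in permutations(range(n)):
            total += P(*(vs[i] for i in sigma[:k])) * Q(*(vs[i] for i in sigma[k:]))
        return total / factorial(n)
    return PQ

# Example over K = R with V = R^2: P and Q are linear forms (k = l = 1).
P = lambda v: v[0]          # coordinate functional x
Q = lambda v: v[1]          # coordinate functional y
PQ = sym_product(P, 1, Q, 1)
QP = sym_product(Q, 1, P, 1)
v1, v2 = (1.0, 2.0), (3.0, 4.0)
print(PQ(v1, v2))                  # (1*4 + 3*2) / 2 = 5.0
assert PQ(v1, v2) == QP(v1, v2)    # the product is commutative
```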