# Kronecker product

In mathematics, the Kronecker product is a special product of two matrices of arbitrary size. The result is a large matrix formed from all possible products of entries of the two input matrices. It is named after the German mathematician Leopold Kronecker.

## Definition

If $A$ is an $m\times n$ matrix and $B$ a $p\times r$ matrix, the Kronecker product ${C=A\otimes B}$ is defined as

${\displaystyle C=(a_{ij}\cdot B)={\begin{pmatrix}a_{11}B&\cdots &a_{1n}B\\\vdots &\ddots &\vdots \\a_{m1}B&\cdots &a_{mn}B\end{pmatrix}}}$

Explicitly:

${\displaystyle A\otimes B={\begin{pmatrix}a_{11}b_{11}&a_{11}b_{12}&\cdots &a_{11}b_{1r}&\cdots &\cdots &a_{1n}b_{11}&a_{1n}b_{12}&\cdots &a_{1n}b_{1r}\\a_{11}b_{21}&a_{11}b_{22}&\cdots &a_{11}b_{2r}&\cdots &\cdots &a_{1n}b_{21}&a_{1n}b_{22}&\cdots &a_{1n}b_{2r}\\\vdots &\vdots &\ddots &\vdots &&&\vdots &\vdots &\ddots &\vdots \\a_{11}b_{p1}&a_{11}b_{p2}&\cdots &a_{11}b_{pr}&\cdots &\cdots &a_{1n}b_{p1}&a_{1n}b_{p2}&\cdots &a_{1n}b_{pr}\\\vdots &\vdots &&\vdots &\ddots &&\vdots &\vdots &&\vdots \\\vdots &\vdots &&\vdots &&\ddots &\vdots &\vdots &&\vdots \\a_{m1}b_{11}&a_{m1}b_{12}&\cdots &a_{m1}b_{1r}&\cdots &\cdots &a_{mn}b_{11}&a_{mn}b_{12}&\cdots &a_{mn}b_{1r}\\a_{m1}b_{21}&a_{m1}b_{22}&\cdots &a_{m1}b_{2r}&\cdots &\cdots &a_{mn}b_{21}&a_{mn}b_{22}&\cdots &a_{mn}b_{2r}\\\vdots &\vdots &\ddots &\vdots &&&\vdots &\vdots &\ddots &\vdots \\a_{m1}b_{p1}&a_{m1}b_{p2}&\cdots &a_{m1}b_{pr}&\cdots &\cdots &a_{mn}b_{p1}&a_{mn}b_{p2}&\cdots &a_{mn}b_{pr}\end{pmatrix}}_{(mp\times nr)}}$.

That is, each entry $a_{ij}$ of the matrix $A$ is multiplied by the matrix $B$. The result is therefore a matrix with $m\cdot p$ rows and $n\cdot r$ columns.

## Example

${\displaystyle {\begin{pmatrix}1&2\\3&4\\5&6\end{pmatrix}}\otimes {\begin{pmatrix}7&8\\9&0\end{pmatrix}}={\begin{pmatrix}1\cdot {\begin{pmatrix}7&8\\9&0\end{pmatrix}}&2\cdot {\begin{pmatrix}7&8\\9&0\end{pmatrix}}\\\\3\cdot {\begin{pmatrix}7&8\\9&0\end{pmatrix}}&4\cdot {\begin{pmatrix}7&8\\9&0\end{pmatrix}}\\\\5\cdot {\begin{pmatrix}7&8\\9&0\end{pmatrix}}&6\cdot {\begin{pmatrix}7&8\\9&0\end{pmatrix}}\end{pmatrix}}={\begin{pmatrix}7&8&14&16\\9&0&18&0\\21&24&28&32\\27&0&36&0\\35&40&42&48\\45&0&54&0\end{pmatrix}}}$
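The worked example above can be reproduced numerically; the following sketch assumes NumPy, whose `np.kron` implements the Kronecker product:

```python
import numpy as np

# The worked example above, reproduced with NumPy's np.kron.
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
B = np.array([[7, 8],
              [9, 0]])

C = np.kron(A, B)
assert C.shape == (6, 4)   # (3*2) x (2*2)
assert np.array_equal(C, np.array([
    [ 7,  8, 14, 16],
    [ 9,  0, 18,  0],
    [21, 24, 28, 32],
    [27,  0, 36,  0],
    [35, 40, 42, 48],
    [45,  0, 54,  0],
]))
```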

## Properties

### Calculation rules

The Kronecker product is not commutative; that is, in general

${\displaystyle A\otimes B\neq B\otimes A}$.

However, there are permutation matrices $P,Q$ such that

${\displaystyle A\otimes B=P(B\otimes A)Q}$

holds. If $A$ and $B$ are square, one can choose $P=Q^{T}$.

The Kronecker product is associative; that is,

${\displaystyle A\otimes (B\otimes C)=(A\otimes B)\otimes C}$.
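Both rules are easy to check numerically; this sketch assumes NumPy's `np.kron`, and the matrices are chosen arbitrarily for illustration:

```python
import numpy as np

A = np.array([[1., 2],
              [3, 4]])
B = np.array([[0., 1],
              [1, 0]])
C = np.array([[2., 0],
              [0, 3]])

# Not commutative: A⊗B and B⊗A differ entrywise.
assert not np.allclose(np.kron(A, B), np.kron(B, A))

# Associative:
assert np.allclose(np.kron(A, np.kron(B, C)),
                   np.kron(np.kron(A, B), C))
```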

### Symmetries

For the transpose, the following holds:

${\displaystyle (A\otimes B)^{T}=A^{T}\otimes B^{T}}$.

For the conjugate matrix, the following holds:

${\displaystyle {\overline {A\otimes B}}={\overline {A}}\otimes {\overline {B}}}$.

For the adjoint matrix, the following holds:

${\displaystyle (A\otimes B)^{*}=A^{*}\otimes B^{*}}$.
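All three symmetry rules can be verified numerically; this sketch assumes NumPy, with arbitrarily chosen complex matrices:

```python
import numpy as np

# Numerical check of the three symmetry rules with complex entries
# (matrices chosen arbitrarily).
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
B = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))

K = np.kron(A, B)
assert np.allclose(K.T, np.kron(A.T, B.T))                       # transpose
assert np.allclose(K.conj(), np.kron(A.conj(), B.conj()))        # conjugate
assert np.allclose(K.conj().T, np.kron(A.conj().T, B.conj().T))  # adjoint
```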

### References to other operations

The Kronecker product is bilinear with respect to matrix addition; that is,

${\displaystyle A\otimes (B+C)=A\otimes B+A\otimes C}$,
${\displaystyle (B+C)\otimes A=B\otimes A+C\otimes A}$

and

${\displaystyle \lambda (A\otimes B)=(\lambda A)\otimes B=A\otimes (\lambda B)}$.

If the matrix products $AC$ and $BD$ are defined, then

${\displaystyle AC\otimes BD=(A\otimes B)(C\otimes D)}$.
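Bilinearity and the mixed-product property can be illustrated numerically; this sketch assumes NumPy, with shapes chosen so that $AC$ and $BD$ are defined and the data itself arbitrary:

```python
import numpy as np

# Numerical check of bilinearity and of the mixed-product property.
rng = np.random.default_rng(2)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((4, 2))
C = rng.standard_normal((3, 5))   # AC is defined: (2x3)(3x5)
D = rng.standard_normal((2, 3))   # BD is defined: (4x2)(2x3)
E = rng.standard_normal((4, 2))   # same shape as B, for additivity

assert np.allclose(np.kron(A, B + E), np.kron(A, B) + np.kron(A, E))
assert np.allclose(np.kron(3.0 * A, B), 3.0 * np.kron(A, B))

lhs = np.kron(A @ C, B @ D)                # AC ⊗ BD
rhs = np.kron(A, B) @ np.kron(C, D)        # (A⊗B)(C⊗D)
assert np.allclose(lhs, rhs)
```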

### Characteristic quantities

If $A$ and $B$ are square matrices, then the following holds for the trace:

${\displaystyle \operatorname {tr} (A\otimes B)=\operatorname {tr} (A)\cdot \operatorname {tr} (B)}$.

The following holds for the rank:

${\displaystyle \operatorname {rank} (A\otimes B)=\operatorname {rank} (A)\cdot \operatorname {rank} (B)}$.

If $A$ is an $n\times n$ matrix and $B$ an $m\times m$ matrix, then the following holds for the determinant:

${\displaystyle \det(A\otimes B)={\det }^{m}(A)\,{\det }^{n}(B)}$.

If $(\lambda _{i})_{i=1,\dotsc ,n}$ are the eigenvalues of $A$ and $(\mu _{j})_{j=1,\dotsc ,m}$ the eigenvalues of $B$, then

${\displaystyle (\lambda _{i}\,\mu _{j})_{i=1,\dotsc ,n \atop j=1,\dotsc ,m}}$

are the eigenvalues of $A\otimes B$.

The following holds for the spectral norm:

${\displaystyle \|A\otimes B\|_{2}=\|A\|_{2}\cdot \|B\|_{2}}$.
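These rules can be checked numerically at once; this sketch assumes NumPy and uses triangular example matrices so that the eigenvalues can be read off the diagonals:

```python
import numpy as np

# Numerical check of trace, rank, determinant, eigenvalue and
# spectral-norm rules (example matrices).
A = np.array([[1., 2, 0],
              [0, 3, 1],
              [0, 0, 5]])   # n = 3, eigenvalues 1, 3, 5
B = np.array([[2., 1],
              [0, 7]])      # m = 2, eigenvalues 2, 7
n, m = 3, 2
K = np.kron(A, B)

assert np.isclose(np.trace(K), np.trace(A) * np.trace(B))
assert np.linalg.matrix_rank(K) == np.linalg.matrix_rank(A) * np.linalg.matrix_rank(B)
assert np.isclose(np.linalg.det(K), np.linalg.det(A)**m * np.linalg.det(B)**n)

# Eigenvalues of A⊗B are all products λ_i · μ_j.
ev = np.sort_complex(np.linalg.eigvals(K))
products = np.sort_complex(np.outer(np.linalg.eigvals(A), np.linalg.eigvals(B)).ravel())
assert np.allclose(ev, products)

# The spectral norm is multiplicative.
assert np.isclose(np.linalg.norm(K, 2), np.linalg.norm(A, 2) * np.linalg.norm(B, 2))
```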

### Inverse

If $A$ and $B$ are invertible, then $A\otimes B$ is also invertible, with inverse

${\displaystyle (A\otimes B)^{-1}=A^{-1}\otimes B^{-1}}$.

The following also holds for the Moore–Penrose inverse:

${\displaystyle (A\otimes B)^{+}=A^{+}\otimes B^{+}}$.

More generally, if $A^{-}$ and $B^{-}$ are generalized inverses of $A$ and $B$, then $A^{-}\otimes B^{-}$ is a generalized inverse of $A\otimes B$.
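Both the inverse and the Moore–Penrose rule can be checked numerically; this sketch assumes NumPy, and the matrices are arbitrary examples:

```python
import numpy as np

# Numerical check of the inverse and Moore–Penrose rules.
A = np.array([[2., 1, 0],
              [1, 3, 1],
              [0, 1, 2]])     # invertible (det = 8)
B = np.array([[1., 1],
              [0, 2]])        # invertible (det = 2)
K = np.kron(A, B)
assert np.allclose(np.linalg.inv(K), np.kron(np.linalg.inv(A), np.linalg.inv(B)))

# The pseudoinverse rule also covers rectangular factors:
R = np.array([[1., 0],
              [2, 1],
              [0, 3]])        # 3 x 2
S = np.array([[1., 2, 0, 1],
              [0, 1, 1, 0]])  # 2 x 4
assert np.allclose(np.linalg.pinv(np.kron(R, S)),
                   np.kron(np.linalg.pinv(R), np.linalg.pinv(S)))
```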

## Matrix equation

Let the matrices $A\in \operatorname {Mat} (k\times \ell )$, $B\in \operatorname {Mat} (m\times n)$ and $C\in \operatorname {Mat} (k\times n)$ be given, and let a matrix $X\in \operatorname {Mat} (\ell \times m)$ be sought such that $AXB=C$ holds. Then the following equivalence applies:

${\displaystyle AXB=C\iff (B^{T}\otimes A)\,\operatorname {vec} (X)=\operatorname {vec} (C)}$.

Here $\operatorname {vec}$ stands for the column-wise vectorization of a matrix into a column vector: if the columns of the matrix $X\in \operatorname {Mat} (\ell \times m)$ are ${\vec {x}}_{1},\dotsc ,{\vec {x}}_{m}$, then

${\displaystyle \operatorname {vec} (X)={\begin{pmatrix}{\vec {x}}_{1}\\\vdots \\{\vec {x}}_{m}\end{pmatrix}}}$

is a column vector of length $\ell \cdot m$. Analogously, $\operatorname {vec} (C)$ is a column vector of length $k\cdot n$.

Once the vector $\operatorname {vec} (X)$ has been determined, the associated matrix $X\in \operatorname {Mat} (\ell \times m)$ follows directly from it.
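This vec trick translates directly into code; the following sketch assumes NumPy and arbitrary invertible example matrices. Note that $\operatorname{vec}$ is column-wise stacking, which in NumPy is `ravel(order='F')`:

```python
import numpy as np

# Solve AXB = C for X via vec(AXB) = (B^T ⊗ A) vec(X).
A = np.array([[1., 2, 0],
              [0, 1, 1],
              [1, 0, 1]])          # k x l = 3 x 3, invertible
B = np.array([[1., 1],
              [1, 2]])             # m x n = 2 x 2, invertible
X_true = np.array([[1., 2],
                   [3, 4],
                   [5, 6]])        # l x m = 3 x 2
C = A @ X_true @ B                 # k x n = 3 x 2

vec = lambda M: M.ravel(order='F')   # column-wise vectorization
K = np.kron(B.T, A)                  # (k*n) x (l*m) = 6 x 6
x = np.linalg.solve(K, vec(C))       # K invertible since A and B are
X = x.reshape((3, 2), order='F')     # undo the vectorization
assert np.allclose(X, X_true)
```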

### Proof of equivalence

It holds that ${\displaystyle AXB=C\iff AX\left({\vec {b}}_{1},\dotsc ,{\vec {b}}_{n}\right)=\left({\vec {c}}_{1},\dotsc ,{\vec {c}}_{n}\right)\iff AX{\vec {b}}_{i}={\vec {c}}_{i}\iff {\begin{pmatrix}AX{\vec {b}}_{1}\\\vdots \\AX{\vec {b}}_{n}\end{pmatrix}}=\operatorname {vec} (C)}$.

Furthermore, ${\displaystyle {\begin{pmatrix}A({\vec {x}}_{1},\dotsc ,{\vec {x}}_{m}){\vec {b}}_{1}\\\vdots \\A({\vec {x}}_{1},\dotsc ,{\vec {x}}_{m}){\vec {b}}_{n}\end{pmatrix}}={\begin{pmatrix}A(b_{11}{\vec {x}}_{1}+\dotsb +b_{m1}{\vec {x}}_{m})\\\vdots \\A(b_{1n}{\vec {x}}_{1}+\dotsb +b_{mn}{\vec {x}}_{m})\end{pmatrix}}={\begin{pmatrix}A\,b_{11}&\cdots &A\,b_{m1}\\\vdots &\ddots &\vdots \\A\,b_{1n}&\cdots &A\,b_{mn}\end{pmatrix}}{\begin{pmatrix}{\vec {x}}_{1}\\\vdots \\{\vec {x}}_{m}\end{pmatrix}}=(B^{T}\otimes A)\,\operatorname {vec} (X)}$.

## System of equations with matrix coefficients

For $i=1,\dotsc ,r$ and $j=1,\dotsc ,s$, let the matrices $A_{ij}\in \operatorname {Mat} (k\times \ell )$, $B_{ij}\in \operatorname {Mat} (m\times n)$ and $C_{i}\in \operatorname {Mat} (k\times n)$ be given.

We seek the matrices $X_{j}\in \operatorname {Mat} (\ell \times m)$ that solve the system of equations

${\displaystyle {\begin{aligned}A_{11}X_{1}B_{11}+\dotsb +A_{1s}X_{s}B_{1s}&=C_{1}\\&\;\;\vdots \\A_{r1}X_{1}B_{r1}+\dotsb +A_{rs}X_{s}B_{rs}&=C_{r}\end{aligned}}}$

This task is equivalent to solving the system of equations

${\displaystyle {\begin{pmatrix}B_{11}^{T}\otimes A_{11}&\cdots &B_{1s}^{T}\otimes A_{1s}\\\vdots &\ddots &\vdots \\B_{r1}^{T}\otimes A_{r1}&\cdots &B_{rs}^{T}\otimes A_{rs}\end{pmatrix}}{\begin{pmatrix}\operatorname {vec} \,X_{1}\\\vdots \\\operatorname {vec} \,X_{s}\end{pmatrix}}={\begin{pmatrix}\operatorname {vec} \,C_{1}\\\vdots \\\operatorname {vec} \,C_{r}\end{pmatrix}}}$
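The block system can be assembled and solved mechanically; this sketch assumes NumPy, with small random example data ($r=s=2$, all blocks $2\times 2$) constructed so the system is consistent:

```python
import numpy as np

# Assemble and solve the block system of Kronecker products above.
rng = np.random.default_rng(6)
r = s = 2
k = l = m = n = 2
A = [[rng.standard_normal((k, l)) for _ in range(s)] for _ in range(r)]
B = [[rng.standard_normal((m, n)) for _ in range(s)] for _ in range(r)]
X_true = [rng.standard_normal((l, m)) for _ in range(s)]
C = [sum(A[i][j] @ X_true[j] @ B[i][j] for j in range(s)) for i in range(r)]

vec = lambda M: M.ravel(order='F')  # column-wise vectorization
# Block matrix of Kronecker products, exactly as in the equivalent system.
K = np.block([[np.kron(B[i][j].T, A[i][j]) for j in range(s)] for i in range(r)])
rhs = np.concatenate([vec(C[i]) for i in range(r)])
x = np.linalg.lstsq(K, rhs, rcond=None)[0]
X = [x[j * l * m:(j + 1) * l * m].reshape((l, m), order='F') for j in range(s)]
for i in range(r):
    assert np.allclose(sum(A[i][j] @ X[j] @ B[i][j] for j in range(s)), C[i])
```

Using `lstsq` rather than `solve` keeps the sketch valid even if the assembled block matrix happens to be rank-deficient, since the system is consistent by construction.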

## Other uses

The Kronecker product is used, for example, in generalized linear regression models to construct the covariance matrix of correlated disturbances (e.g. the covariance matrix for seemingly unrelated regression equations). The result is a block diagonal matrix.

In addition, the Kronecker product is needed in quantum mechanics to describe systems of several particles whose spectrum is bounded on both sides. States of several particles are then Kronecker products of the single-particle states. In the case of an unbounded spectrum, only the algebraic structure of a Kronecker product remains, since then no representation by matrices exists.

## Connection with tensor products

Let two linear mappings $\varphi _{1}\colon V_{1}\longrightarrow W_{1}$ and $\varphi _{2}\colon V_{2}\longrightarrow W_{2}$ between finite-dimensional vector spaces be given. Then there is always exactly one linear mapping

${\displaystyle \varphi _{1}\otimes \varphi _{2}\colon V_{1}\otimes V_{2}\longrightarrow W_{1}\otimes W_{2}}$

between the tensor products with

${\displaystyle [\varphi _{1}\otimes \varphi _{2}](v_{1}\otimes v_{2})=\varphi _{1}(v_{1})\otimes \varphi _{2}(v_{2})}$.

If we choose a basis on each of the vector spaces $V_{1},W_{1},V_{2}$ and $W_{2}$, we can assign to each mapping its representation matrix. Let $A$ be the representation matrix of $\varphi _{1}$ and $B$ the representation matrix of $\varphi _{2}$.

The Kronecker product $A\otimes B$ of the representation matrices then corresponds exactly to the representation matrix of the tensored mapping $\varphi _{1}\otimes \varphi _{2}$, provided one takes the bases on $V_{1}\otimes V_{2}$ and $W_{1}\otimes W_{2}$ that result from the lexicographically ordered pairs of basis vectors of the vector spaces involved in the tensor product: if $(e_{1},e_{2},\ldots ,e_{n})$ is the chosen basis of $V_{1}$ and $(f_{1},f_{2},\ldots ,f_{p})$ the basis of $V_{2}$, we take

${\displaystyle (e_{1}\otimes f_{1},e_{1}\otimes f_{2},\ldots ,e_{1}\otimes f_{p},e_{2}\otimes f_{1},\ldots ,e_{n}\otimes f_{p-1},e_{n}\otimes f_{p})}$

as the basis for the tensor product $V_{1}\otimes V_{2}$. Analogously for $W_{1}\otimes W_{2}$.
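This correspondence can be illustrated numerically; the sketch below assumes NumPy, with arbitrary example matrices and coordinate vectors. In the lexicographic product basis, $v_{1}\otimes v_{2}$ has coordinate vector $\operatorname{kron}(u,v)$, and the defining property of $\varphi _{1}\otimes \varphi _{2}$ becomes a matrix identity:

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((2, 3))   # representation matrix of φ1 : V1 -> W1
B = rng.standard_normal((4, 2))   # representation matrix of φ2 : V2 -> W2
u = rng.standard_normal(3)        # coordinates of v1 in the basis of V1
v = rng.standard_normal(2)        # coordinates of v2 in the basis of V2

# (φ1 ⊗ φ2)(v1 ⊗ v2) = φ1(v1) ⊗ φ2(v2), in lexicographic coordinates:
assert np.allclose(np.kron(A, B) @ np.kron(u, v), np.kron(A @ u, B @ v))
```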

## History

The Kronecker product is named after Leopold Kronecker, although Georg Zehfuss had already defined the product in 1858, which is why the Kronecker product is sometimes also called the Zehfuss product.