# Product rule

The product rule or Leibniz rule (after G. W. Leibniz) is a fundamental rule of differential calculus. It reduces the calculation of the derivative of a product of functions to the calculation of the derivatives of the individual functions.

One application of the product rule in integral calculus is the method of integration by parts (partial integration). In the case that one of the two functions is constant, the product rule reduces to the simpler constant factor rule.

## Statement of the product rule

If the functions $u$ and $v$ are differentiable at a point $x_a$ of an interval $D$, mapping into the set of real or complex numbers, then the function $f$ defined by

$$f(x) = u(x) \cdot v(x) \quad \text{for all } x \in D$$

is also differentiable at the point $x = x_a$, and

$$f'(x_a) = u'(x_a) \cdot v(x_a) + u(x_a) \cdot v'(x_a),$$

or in short:

$$(uv)' = u'v + uv'.$$

## Application examples

In the following, let $f(x) = u(x)v(x)$.

• If $u(x) = x$ and $v(x) = x$, then from $u'(x) = 1$ and $v'(x) = 1$ the product rule gives
  $$\frac{\mathrm{d}}{\mathrm{d}x} x^2 = f'(x) = u'(x)v(x) + u(x)v'(x) = 1 \cdot x + x \cdot 1 = 2x.$$
• If $u(x) = x$ and $v(x) = \frac{1}{x}$, then $f(x) = u(x)v(x) = 1$, so
  $$0 = f'(x) = u'(x)v(x) + u(x)v'(x) = 1 \cdot \frac{1}{x} + x \cdot v'(x),$$
  and rearranging yields
  $$v'(x) = -\frac{1}{x^2}.$$

Using the short notation $(uv)' = u'v + uv'$, one obtains, for example, the derivative of the following function:

$$f(x) = (x^2 - 4) \cdot (x^3 + 1)$$
$$f'(x) = 2x \cdot (x^3 + 1) + (x^2 - 4) \cdot 3x^2$$

Multiplying out gives $f'(x) = 5x^4 - 12x^2 + 2x.$
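The worked example above can be checked with a computer algebra system; the following sketch uses sympy (an assumed tool, not part of the article):

```python
# Verify the example f(x) = (x^2 - 4)(x^3 + 1) via the product rule,
# using sympy as an (assumed) computer algebra system.
import sympy as sp

x = sp.symbols('x')
u = x**2 - 4
v = x**3 + 1

# Product rule: f' = u'v + uv'
f_prime = sp.diff(u, x) * v + u * sp.diff(v, x)

# Differentiating the product directly must give the same result
assert sp.simplify(f_prime - sp.diff(u * v, x)) == 0
print(sp.expand(f_prime))  # 5*x**4 - 12*x**2 + 2*x
```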

## Explanation and proof

*Figure: Geometric illustration of the proof of the product rule*

The product $u \cdot v$ of two real functions $u$ and $v$ that are differentiable at a point $x$ has at that point the value $u(x) \cdot v(x)$, which can be interpreted as the area of a rectangle with sides $u(x)$ and $v(x)$. If $x$ changes by $\Delta x$, then $u(x)$ changes by $\Delta u(x)$ and $v(x)$ by $\Delta v(x)$. The change $\Delta(u(x) \cdot v(x))$ of the rectangle's area is then (see figure) composed of

$$\Delta(u(x) \cdot v(x)) = u(x) \cdot \Delta v(x) + v(x) \cdot \Delta u(x) + \Delta u(x) \cdot \Delta v(x).$$

Dividing by $\Delta x$ yields

$$\frac{\Delta(u(x) \cdot v(x))}{\Delta x} = u(x) \cdot \frac{\Delta v(x)}{\Delta x} + v(x) \cdot \frac{\Delta u(x)}{\Delta x} + \frac{\Delta u(x)}{\Delta x} \cdot \Delta v(x),$$

the difference quotient of the product or area function $u \cdot v$ at the point $x$.

As $\Delta x$ tends to $0$, $\Delta v(x)$ (and with it the whole last summand) also tends to $0$, so that at the point $x$ one obtains

$$(u \cdot v)' = u \cdot v' + v \cdot u'$$

as claimed. This is essentially the line of argument found in a first proof of the product rule in a Leibniz manuscript of 1677. The product rule, which he proves there together with the quotient rule, was thus one of the first rules of the infinitesimal calculus that he derived. However, he did not use a limit but differentials, and argued that the summand $\Delta u \cdot \Delta v$ can be dropped because it is infinitesimally small compared to the other summands. Euler used the same argument; a proof using limits is first found in Cauchy:

The function $f$ is given by $f(x) = u(x) \cdot v(x)$. The derivative of $f$ at a point $x$ is then given by the limit of the difference quotient

$$\lim_{\Delta x \to 0} \frac{u(x + \Delta x) \cdot v(x + \Delta x) - u(x) \cdot v(x)}{\Delta x}.$$

Adding and subtracting the term $\frac{u(x) \cdot v(x + \Delta x)}{\Delta x}$ yields

$$\lim_{\Delta x \to 0} \frac{u(x + \Delta x) - u(x)}{\Delta x} \cdot v(x + \Delta x) + \lim_{\Delta x \to 0} u(x) \cdot \frac{v(x + \Delta x) - v(x)}{\Delta x}.$$

Carrying out the two limits yields the product rule

$$f'(x) = u'(x) \cdot v(x) + u(x) \cdot v'(x).$$
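The limit argument can be illustrated numerically: for a concrete pair of functions (chosen here only as an example), the difference quotient of the product approaches $u'v + uv'$ as $\Delta x$ shrinks. A minimal sketch in Python:

```python
# Numerical illustration of the limit of the difference quotient of u*v;
# u = sin and v = exp are arbitrary example choices.
import math

u, v = math.sin, math.exp
x = 1.0
exact = math.cos(x) * math.exp(x) + math.sin(x) * math.exp(x)  # u'v + uv'

errors = []
for dx in (1e-1, 1e-3, 1e-5):
    quotient = (u(x + dx) * v(x + dx) - u(x) * v(x)) / dx
    errors.append(abs(quotient - exact))

print(errors)  # the error shrinks as dx does
```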

## Generalizations

### Vector products and matrix vector products

In the proof of the product rule, linear combinations (sums, differences, products with numbers) are formed from the values of $u$, and likewise from the values of $v$, while the roles of $u$ and $v$ remain clearly separated: $u$ is the left factor, $v$ the right. The proof therefore carries over to all product operations that are linear in both the left and the right factor. In particular, the product rule also applies to vector products and matrix-vector products; vectors and matrices are to be understood here as functions of an independent variable.

### More than two factors

The product rule can also be applied successively to several factors. For example,

$$(uvw)' = u'vw + uv'w + uvw'$$ and
$$(uvwz)' = u'vwz + uv'wz + uvw'z + uvwz'$$ etc.

In general, for a function $f = \prod_{i=1}^{n} f_i$ that can be written as a product of $n$ functions $f_i$, the derivative is

$$f' = \sum_{i=1}^{n} f_i' \prod_{k=1 \atop k \neq i}^{n} f_k.$$

If the functions $f_i$ have no zeros, this rule can also be written in the clear form

$$\frac{(f_1 \cdots f_n)'}{f_1 \cdots f_n} = \frac{f_1'}{f_1} + \cdots + \frac{f_n'}{f_n}$$   (or in short $\frac{f'}{f} = \sum \frac{f_i'}{f_i}$);

such fractions are called logarithmic derivatives.
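The $n$-factor formula in its logarithmic-derivative form can be checked symbolically; the sketch below uses sympy (an assumed tool) with three arbitrary zero-free factors:

```python
# Check (f1...fn)'/(f1...fn) = f1'/f1 + ... + fn'/fn for n = 3.
import sympy as sp

x = sp.symbols('x', positive=True)
factors = [x, sp.sin(x) + 2, sp.exp(x)]  # arbitrary factors without zeros

product = sp.prod(factors)
lhs = sp.diff(product, x) / product
rhs = sum(sp.diff(f, x) / f for f in factors)
assert sp.simplify(lhs - rhs) == 0
```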

### Higher derivatives

The rule for derivatives of $n$-th order of a product of two functions was already known to Leibniz and is accordingly sometimes also called Leibniz's rule. It follows from the product rule by complete induction:

$$(uv)^{(n)} = \sum_{k=0}^{n} \binom{n}{k} u^{(k)} v^{(n-k)}.$$

The expressions of the form $\binom{n}{k}$ appearing here are binomial coefficients. The formula contains the actual product rule as the special case $n = 1$. It is strikingly similar to the binomial theorem

$$(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k}.$$

This similarity is no accident; the usual induction proof runs completely analogously in both cases; but one can also prove Leibniz's rule with the help of the binomial theorem.
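Leibniz's rule for higher derivatives can be verified symbolically for a concrete order; the following sketch (with sympy, an assumed tool, and arbitrary example functions) checks $n = 4$:

```python
# Check (uv)^(n) = sum_k C(n,k) u^(k) v^(n-k) for n = 4.
import sympy as sp

x = sp.symbols('x')
u = sp.sin(x)
v = x**3
n = 4

lhs = sp.diff(u * v, x, n)
rhs = sum(sp.binomial(n, k) * sp.diff(u, x, k) * sp.diff(v, x, n - k)
          for k in range(n + 1))
assert sp.simplify(lhs - rhs) == 0
```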

For higher derivatives of products of more than two factors, the multinomial theorem carries over accordingly. The following applies:

$$(u_1 u_2 \ldots u_k)^{(n)} = \sum_{n_1 + \ldots + n_k = n} \binom{n}{n_1, \ldots, n_k} \, u_1^{(n_1)} u_2^{(n_2)} \ldots u_k^{(n_k)}$$

### Higher-dimensional domain of definition

If one generalizes to functions with a higher-dimensional domain, the product rule can be formulated as follows: Let $U \subseteq \mathbb{R}^n$ be an open subset, $u, v \colon U \to \mathbb{R}$ differentiable functions, and $x \in \mathbb{R}^n$ a direction vector. Then the product rule for the directional derivative reads:

$$\frac{\partial}{\partial x}(uv) = \left(\frac{\partial}{\partial x} u\right) \cdot v + u \cdot \frac{\partial}{\partial x} v.$$

The same applies to the gradient:

$$\nabla(uv) = (\nabla u) \cdot v + u \cdot \nabla v.$$

In the language of differentiable manifolds, these two statements read:

• If $x$ is a tangent vector and $u, v$ are locally differentiable functions, then
  $$x(uv) = xu \cdot v + u \cdot xv.$$
• If $u, v$ are locally differentiable functions, then the following relation holds between the exterior derivatives:
  $$\mathrm{d}(uv) = v \, \mathrm{d}u + u \, \mathrm{d}v.$$
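The gradient form of the rule can be checked symbolically; a minimal sketch with sympy (an assumed tool) and an arbitrary two-variable example:

```python
# Check grad(uv) = (grad u) v + u grad v for a two-variable example.
import sympy as sp

x, y = sp.symbols('x y')
u = x**2 * y
v = sp.sin(x) + y

def grad(f):
    """Gradient of f with respect to (x, y) as a column vector."""
    return sp.Matrix([sp.diff(f, x), sp.diff(f, y)])

lhs = grad(u * v)
rhs = grad(u) * v + u * grad(v)
assert sp.simplify(lhs - rhs) == sp.zeros(2, 1)
```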

#### Higher partial derivatives

Let $\alpha, \beta \in \mathbb{N}_0^n$, let $U \subseteq \mathbb{R}^n$, and let $u, v \in C^{\alpha}(U, \mathbb{R})$. Then:

$$D^{\alpha}(uv) = \sum_{\beta \leq \alpha} \binom{\alpha}{\beta} D^{\beta}u \, D^{\alpha - \beta}v$$
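The multi-index formula can be spot-checked for a concrete $\alpha$; the sketch below (sympy assumed, example functions arbitrary) verifies $\alpha = (2, 1)$:

```python
# Check D^a(uv) = sum_{b <= a} C(a, b) D^b u D^(a-b) v for a = (2, 1),
# where C(a, b) is the product of componentwise binomial coefficients.
import itertools
import sympy as sp

x, y = sp.symbols('x y')
u = sp.exp(x * y)
v = x**2 + sp.sin(y)
alpha = (2, 1)

def D(f, beta):
    """Partial derivative D^beta f, applied component by component."""
    g = f
    for _ in range(beta[0]):
        g = sp.diff(g, x)
    for _ in range(beta[1]):
        g = sp.diff(g, y)
    return g

lhs = D(u * v, alpha)
rhs = sum(sp.binomial(alpha[0], b0) * sp.binomial(alpha[1], b1)
          * D(u, (b0, b1)) * D(v, (alpha[0] - b0, alpha[1] - b1))
          for b0, b1 in itertools.product(range(alpha[0] + 1),
                                          range(alpha[1] + 1)))
assert sp.simplify(lhs - rhs) == 0
```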

### Holomorphic functions

The product rule also applies to complex differentiable functions: Let $U \subseteq \mathbb{C}$ and let $f, g \colon U \to \mathbb{C}$ be holomorphic. Then $fg$ is holomorphic, and

$$(fg)' = f'g + fg'.$$

### General differentiable maps

Let $U \subseteq \mathbb{R}$ be an open interval, $B$ a Banach algebra (for example, the algebra of real or complex $(n \times n)$ matrices), and $u, v \colon U \to B$ differentiable functions. Then:

$$(u \cdot v)' = u' \cdot v + u \cdot v',$$

where "·" denotes the multiplication in the Banach algebra.

More generally, if $B'$ and $B''$ are Banach spaces and $u \colon U \to B'$ and $v \colon U \to B''$ are differentiable functions, then a product rule also holds, where the role of the product is taken over by a bilinear form $A \colon B' \times B'' \to \mathbb{R}$. This is required to be continuous, i.e. bounded:

$$|A(b', b'')| \leq C \cdot \|b'\| \cdot \|b''\| \quad \text{for all } b' \in B', \, b'' \in B''$$

with a fixed constant $C$. Then the product rule applies:

$$\frac{\mathrm{d}}{\mathrm{d}x} A(u(x), v(x)) = A(u'(x), v(x)) + A(u(x), v'(x)).$$

Corresponding statements apply to higher-dimensional domains of definition.

### Leibniz rule for divided differences

The Leibniz rule can be transferred to divided differences:

$$[x_0, \ldots, x_n](f \cdot g) = \sum_{i=0}^{n} ([x_0, \ldots, x_i]f) \cdot ([x_i, \ldots, x_n]g)$$

The special case

$$[x, x](f \cdot g) = [x]f \cdot [x, x]g + [x, x]f \cdot [x]g = f(x)g'(x) + f'(x)g(x)$$

includes the original Leibniz rule.
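The divided-difference form can be checked numerically; the sketch below implements divided differences directly (the helper `divided_diff` and the example functions are assumptions for illustration, with pairwise distinct nodes):

```python
# Check [x0..xn](f g) = sum_i [x0..xi]f * [xi..xn]g for three nodes.
import math

def divided_diff(f, xs):
    """Divided difference [x0, ..., xn] f for pairwise distinct nodes."""
    table = [f(x) for x in xs]
    for level in range(1, len(xs)):
        table = [(table[i + 1] - table[i]) / (xs[i + level] - xs[i])
                 for i in range(len(table) - 1)]
    return table[0]

f = lambda x: x**2
g = math.sin
xs = [0.3, 0.9, 1.4]

lhs = divided_diff(lambda x: f(x) * g(x), xs)
rhs = sum(divided_diff(f, xs[:i + 1]) * divided_diff(g, xs[i:])
          for i in range(len(xs)))
assert math.isclose(lhs, rhs, rel_tol=1e-9)
```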

### Abstraction: derivations

In general, maps $D$ that satisfy the product rule

$$D(uv) = v \cdot D(u) + u \cdot D(v)$$

are called derivations. The order of the factors here is chosen for the case of a derivation $A \to M$ with an algebra $A$ and an $A$-left module $M$.

In connection with $\mathbb{Z}$- or $\mathbb{Z}/2\mathbb{Z}$-graded algebras ("superalgebras"), however, the term derivation must be replaced by that of antiderivation. The corresponding equation then reads

$$D(uv) = D(u) \cdot v + (-1)^{|u|} \cdot u \cdot D(v)$$

for homogeneous elements $u, v$; here $|u|$ denotes the degree of $u$. The most prominent example of an antiderivation is the exterior derivative for differential forms:

$$\mathrm{d}(\omega \wedge \eta) = \mathrm{d}\omega \wedge \eta + (-1)^{|\omega|} \cdot \omega \wedge \mathrm{d}\eta.$$

## Literature

The product rule for functions is explained in any book that deals with differential calculus in a general way.

• Otto Forster: Analysis 1. Differential and integral calculus of one variable. 7th edition. Vieweg, Braunschweig 2004, ISBN 3-528-67224-2.
• Otto Forster: Analysis 2. Differential calculus in $\mathbb{R}^n$. Ordinary differential equations. 6th edition. Vieweg, Braunschweig 2005, ISBN 3-528-47231-6.
• Konrad Königsberger: Analysis. 2 vols. Springer, Berlin 2004, ISBN 3-540-41282-4.
• C. H. Edwards Jr.: The Historical Development of the Calculus. Springer, New York 1979.