The Itō lemma (also Itō formula), named after the Japanese mathematician Kiyoshi Itō, is a central statement of stochastic analysis. In its simplest form it is an integral representation for stochastic processes that are functions of a Wiener process. It thus corresponds to the chain rule (substitution rule) of classical differential and integral calculus.
Version for Wiener processes
Let $(W_t)_{t \geq 0}$ be a (standard) Wiener process and $h\colon \mathbb{R} \to \mathbb{R}$ a twice continuously differentiable function. Then

$$h(W_t) = h(W_0) + \int_0^t h'(W_s)\,\mathrm{d}W_s + \frac{1}{2}\int_0^t h''(W_s)\,\mathrm{d}s\,.$$
The first integral is to be understood as an Itō integral and the second integral as an ordinary Riemann integral (over the continuous paths of the integrand).
For the process $Y_t := h(W_t)$, defined for $t \geq 0$, this representation reads in differential notation

$$\mathrm{d}Y_t = h'(W_t)\,\mathrm{d}W_t + \frac{1}{2}\,h''(W_t)\,\mathrm{d}t\,.$$
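As a quick numerical sanity check of this formula (a sketch, not part of the standard exposition; NumPy, the seed, and the step count are arbitrary choices), one can take $h(x) = x^2$, for which the formula gives $W_t^2 = 2\int_0^t W_s\,\mathrm{d}W_s + t$, and compare both sides along a simulated path:

```python
import numpy as np

# Simulate one Wiener path on [0, 1] and check Itō's formula for h(x) = x^2:
#   h(W_t) - h(W_0) = ∫ h'(W_s) dW_s + (1/2) ∫ h''(W_s) ds = 2 ∫ W_s dW_s + t
rng = np.random.default_rng(0)
n, t = 100_000, 1.0
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), n)        # Wiener increments
W = np.concatenate(([0.0], np.cumsum(dW)))  # path with W_0 = 0

ito_integral = np.sum(W[:-1] * dW)          # left-endpoint (Itō) sums
rhs = 2.0 * ito_integral + t                # 2 ∫ W dW + ∫ 1 ds
lhs = W[-1] ** 2                            # h(W_t) - h(W_0)
print(abs(lhs - rhs))                       # discretization error, small
```

Note that the left endpoint $W_{t_i}$ in the sums is essential: a midpoint or right-endpoint rule would approximate a different (Stratonovich-type) integral, and the identity would fail by a term of order $t$.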
Version for Itō processes
A stochastic process $(X_t)_{t \geq 0}$ is called an Itō process if

$$X_t = X_0 + \int_0^t a_s\,\mathrm{d}s + \int_0^t b_s\,\mathrm{d}W_s$$

holds for two stochastic processes $a_s$ and $b_s$ (more on this under stochastic integration). In differential notation:

$$\mathrm{d}X_t = a_t\,\mathrm{d}t + b_t\,\mathrm{d}W_t\,.$$
If $h\colon \mathbb{R}_+ \times \mathbb{R} \to \mathbb{R}$ is a function that is once continuously differentiable in the first argument and twice continuously differentiable in the second, then the process $Y_t := h(t, X_t)$ is also an Itō process, and

$$\begin{aligned}
\mathrm{d}Y_t &= \frac{\partial h}{\partial t}(t, X_t)\,\mathrm{d}t + \frac{\partial h}{\partial x}(t, X_t)\,\mathrm{d}X_t + \frac{1}{2}\frac{\partial^2 h}{\partial x^2}(t, X_t)\,(\mathrm{d}X_t)^2 \\
&= \left( \frac{\partial h}{\partial x}(t, X_t)\,a_t + \frac{\partial h}{\partial t}(t, X_t) + \frac{1}{2}\frac{\partial^2 h}{\partial x^2}(t, X_t)\,b_t^2 \right) \mathrm{d}t + \frac{\partial h}{\partial x}(t, X_t)\,b_t\,\mathrm{d}W_t\,.
\end{aligned}$$
Here $\tfrac{\partial h}{\partial t}$ and $\tfrac{\partial h}{\partial x}$ denote the partial derivatives of the function $h$ with respect to the first and second variable, respectively. The second representation follows from the first by inserting $(\mathrm{d}X_t)^2 = b_t^2\,\mathrm{d}t$ and collecting the $\mathrm{d}t$- and $\mathrm{d}W_t$-terms.
Version for semimartingales
Let $(X_t)_{t \geq 0} = (X_t^1, \dotsc, X_t^d)_{t \geq 0}$ be an $\mathbb{R}^d$-valued semimartingale and let $F \in C^2(\mathbb{R}^d, \mathbb{R})$. Then $(F(X_t))_{t \geq 0}$ is again a semimartingale, and

$$\begin{aligned}
F(X_t) - F(X_0) = {} & \sum_{j=1}^d \int_0^t \frac{\partial F}{\partial x^j}(X_{s-})\,\mathrm{d}X_s^j + \frac{1}{2}\sum_{j,k=1}^d \int_0^t \frac{\partial^2 F}{\partial x^j \partial x^k}(X_{s-})\,\mathrm{d}[X^j, X^k]_s^c \\
& + \sum_{0 < s \leq t}\left( F(X_s) - F(X_{s-}) - \sum_{j=1}^d \frac{\partial F}{\partial x^j}(X_{s-})\,\Delta X_s^j \right).
\end{aligned}$$
Here $X_{s-} = \lim_{u \uparrow s} X_u$ denotes the left-hand limit and $\Delta X_s^j = X_s^j - X_{s-}^j$ the associated jump process. $[X^j, X^k]^c$ denotes the quadratic covariation of the continuous parts of the components $X^j$ and $X^k$. If $X$ is a continuous semimartingale, the last sum in the formula vanishes and $[X^j, X^k]^c = [X^j, X^k]$ holds.
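To illustrate the jump term, here is a sketch (names, seed, and parameters are all assumptions) for a one-dimensional drift-plus-compound-Poisson path. Its continuous covariation vanishes, so the formula reduces to the Riemann part plus the jump sum: $F(X_t) - F(X_0) = \int_0^t F'(X_{s-})\,c\,\mathrm{d}s + \sum_{0 < s \leq t}\bigl(F(X_s) - F(X_{s-})\bigr)$.

```python
import numpy as np

# Sketch: X_t = c*t + sum of jumps up to t, F(x) = x^3.
# With [X, X]^c = 0, Itō's formula for semimartingales reads
#   F(X_t) - F(X_0) = ∫ F'(X_{s-}) c ds + Σ (F(X_s) - F(X_{s-})).
rng = np.random.default_rng(2)
c, t = 1.0, 1.0
F = lambda x: x**3
dF = lambda x: 3.0 * x**2

jump_times = np.sort(rng.uniform(0.0, t, rng.poisson(5.0)))
jump_sizes = rng.normal(0.0, 1.0, jump_times.size)
cum_jumps = np.concatenate(([0.0], np.cumsum(jump_sizes)))

grid = np.linspace(0.0, t, 200_001)
ds = grid[1] - grid[0]
# path on the grid: drift plus all jumps that have occurred by each grid time
Xg = c * grid + cum_jumps[np.searchsorted(jump_times, grid, side="right")]

riemann = np.sum(dF(Xg[:-1]) * c * ds)            # ∫ F'(X_{s-}) c ds
X_left = c * jump_times + cum_jumps[:-1]          # X_{s-} at each jump time
jump_sum = np.sum(F(X_left + jump_sizes) - F(X_left))

lhs = F(Xg[-1]) - F(0.0)
print(abs(lhs - (riemann + jump_sum)))            # discretization error
```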
Examples
For $Y_t = \sin(W_t)$ one has

$$\mathrm{d}Y_t = \cos(W_t)\,\mathrm{d}W_t - \tfrac{1}{2}\sin(W_t)\,\mathrm{d}t\,.$$
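Discretizing this example gives a pathwise check (a sketch; the seed and grid size are arbitrary):

```python
import numpy as np

# Pathwise check of d(sin W_t) = cos(W_t) dW_t - (1/2) sin(W_t) dt
rng = np.random.default_rng(3)
n, t = 100_000, 1.0
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), n)
W = np.concatenate(([0.0], np.cumsum(dW)))

rhs = np.sum(np.cos(W[:-1]) * dW) - 0.5 * np.sum(np.sin(W[:-1])) * dt
lhs = np.sin(W[-1]) - np.sin(0.0)
print(abs(lhs - rhs))   # small discretization error
```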
The process

$$S_t = S_0\, e^{rt - \frac{1}{2}\sigma^2 t + \sigma W_t}$$

is a solution of the Black–Scholes stochastic differential equation

$$\mathrm{d}S_t = rS_t\,\mathrm{d}t + \sigma S_t\,\mathrm{d}W_t\,.$$
To see this, choose $X_t = W_t$, so $a_t = 0$ and $b_t = 1$, and $h(t, x) = S_0\, e^{rt - \frac{1}{2}\sigma^2 t + \sigma x}$. The lemma then yields

$$\mathrm{d}S_t = \left(r - \frac{\sigma^2}{2} + \frac{\sigma^2}{2}\right) S_0\, e^{rt - \frac{1}{2}\sigma^2 t + \sigma W_t}\,\mathrm{d}t + \sigma S_0\, e^{rt - \frac{1}{2}\sigma^2 t + \sigma W_t}\,\mathrm{d}W_t = rS_t\,\mathrm{d}t + \sigma S_t\,\mathrm{d}W_t\,.$$
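As a numerical illustration (a sketch under assumed parameter values), the closed-form solution can be compared with an Euler–Maruyama discretization of the Black–Scholes SDE driven by the same Wiener increments:

```python
import numpy as np

# Compare the closed-form geometric Brownian motion
#   S_t = S_0 * exp((r - sigma^2/2) t + sigma W_t)
# with an Euler-Maruyama discretization of dS = r S dt + sigma S dW.
rng = np.random.default_rng(4)
S0, r, sigma = 100.0, 0.05, 0.2
n, t = 100_000, 1.0
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), n)

exact = S0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sum(dW))

# Euler-Maruyama: S_{k+1} = S_k (1 + r dt + sigma dW_k), folded into a product
S_euler = S0 * np.prod(1.0 + r * dt + sigma * dW)
print(abs(S_euler - exact) / exact)   # small relative error
```

The exponential correction term $-\frac{1}{2}\sigma^2 t$ is exactly the second-order Itō term; dropping it would make the two paths diverge systematically.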
If $(\mathbf{W}_t)_{t \geq 0}$ is a $d$-dimensional Wiener process and $F\colon \mathbb{R}^d \to \mathbb{R}$ is twice continuously differentiable, then for $Y_t = F(\mathbf{W}_t)$

$$\mathrm{d}Y_t = \nabla F(\mathbf{W}_t)^{\mathsf{T}} \cdot \mathrm{d}\mathbf{W}_t + \frac{1}{2}\,\Delta F(\mathbf{W}_t)\,\mathrm{d}t\,,$$

where $\nabla F$ and $\Delta F$ denote the gradient and the Laplace operator of $F$.
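A sketch of the multidimensional version (dimension, seed, and step count are arbitrary choices), using $F(x) = |x|^2$, for which $\nabla F(x) = 2x$ and $\Delta F = 2d$, so the formula gives $|\mathbf{W}_t|^2 = 2\int_0^t \mathbf{W}_s \cdot \mathrm{d}\mathbf{W}_s + d\,t$:

```python
import numpy as np

# Check |W_t|^2 = 2 ∫ W_s . dW_s + d*t for a d-dimensional Wiener process
# (F(x) = |x|^2, grad F = 2x, Laplacian = 2d).
rng = np.random.default_rng(5)
d, n, t = 3, 100_000, 1.0
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), (n, d))              # increments in R^d
W = np.vstack([np.zeros(d), np.cumsum(dW, axis=0)])    # path with W_0 = 0

ito = np.sum(W[:-1] * dW)        # Σ W_i . ΔW_i over all steps and components
rhs = 2.0 * ito + d * t
lhs = np.sum(W[-1] ** 2)         # |W_t|^2
print(abs(lhs - rhs))            # small discretization error
```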
Literature
Philip E. Protter: Stochastic Integration and Differential Equations (2nd edition), Springer, 2004, ISBN 3-540-00313-4 .
References
Hui-Hsiung Kuo: Introduction to Stochastic Integration. Springer, 2006, ISBN 978-0-387-28720-1, p. 103.