The subdifferential is a generalization of the gradient to non-differentiable convex functions. It plays an important role in convex analysis as well as in convex optimization.
Definition
Let $f\colon \mathbb{R}^n \to \mathbb{R}$ be a convex function. A vector $g \in \mathbb{R}^n$ is called a subgradient of $f$ at the point $x_0$ if

$$f(x) \geq f(x_0) + \langle g, x - x_0 \rangle$$

holds for all $x \in \mathbb{R}^n$, where $\langle \cdot, \cdot \rangle$ denotes the standard scalar product.

The subdifferential $\partial f(x_0)$ is the set of all subgradients of $f$ at the point $x_0$.
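As a quick numerical sketch of this definition (added here for illustration; the helper name `is_subgradient` is hypothetical, and random sampling only gives a necessary check, not a proof), the subgradient inequality can be tested on sampled points:

```python
import numpy as np

def is_subgradient(f, g, x0, n_samples=1000, radius=5.0, tol=1e-9):
    """Test f(x) >= f(x0) + <g, x - x0> on random samples around x0 (necessary check only)."""
    rng = np.random.default_rng(0)
    x0 = np.asarray(x0, dtype=float)
    g = np.asarray(g, dtype=float)
    for _ in range(n_samples):
        x = x0 + rng.uniform(-radius, radius, size=x0.shape)
        if f(x) < f(x0) + g @ (x - x0) - tol:
            return False
    return True

# The Euclidean norm is convex but not differentiable at 0; g = 0 is a subgradient there.
f = lambda x: np.linalg.norm(x)
print(is_subgradient(f, g=[0.0, 0.0], x0=[0.0, 0.0]))   # True
print(is_subgradient(f, g=[2.0, 0.0], x0=[0.0, 0.0]))   # False: the inequality fails, e.g. at x = (1, 0)
```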
Intuition
[Figure: Subgradients of a convex function]
For $n = 1$, this definition intuitively means that the graph of the function $f$ lies everywhere above the straight line $G$ that passes through the point $(x_0, f(x_0))$ and has slope $g$:

$$G = \{(x, y) \in \mathbb{R}^2 \mid y = g \cdot (x - x_0) + f(x_0)\}.$$
Since the normal equation of $G$ is precisely

$$-g \cdot (x - x_0) + 1 \cdot (y - f(x_0)) = 0,$$

the vector $(-g, 1) \in \mathbb{R}^2$ is a normal to $G$.
In the general case $n \geq 1$, the graph of $f$ lies above the hyperplane given by the foot point $(x_0, f(x_0))$ and the normal $(-g, 1) \in \mathbb{R}^{n+1}$.
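A small numerical check of this normal-equation claim (an added illustration, not part of the original text): the vector $(-g, 1)$ is orthogonal to every direction along the line $G$, sketched here for $f(x) = |x|$ with the subgradient $g = 0.5$ at $x_0 = 0$:

```python
import numpy as np

f = lambda x: abs(x)                     # example convex function
x0, g = 0.0, 0.5                         # g = 0.5 is a subgradient of |x| at x0 = 0
line = lambda x: g * (x - x0) + f(x0)    # the line G through (x0, f(x0)) with slope g

normal = np.array([-g, 1.0])
for x in np.linspace(-3.0, 3.0, 7):
    direction = np.array([x - x0, line(x) - f(x0)])  # direction from the foot point along G
    assert abs(np.dot(normal, direction)) < 1e-12    # (-g, 1) is orthogonal to G
    assert f(x) >= line(x)                           # the graph of f lies above G
print("normal-equation and supporting-line checks passed")
```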
By the separation theorem, the subdifferential of a continuous convex function is nonempty at every point.
Example
The subdifferential of the function $f\colon \mathbb{R} \to \mathbb{R}$, $x \mapsto |x|$, is given by

$$\partial f(x_0) = \begin{cases} \{-1\} & x_0 < 0 \\ [-1, 1] & x_0 = 0 \\ \{1\} & x_0 > 0 \end{cases}$$
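A short sketch (added for illustration; the function name `subdifferential_abs` is made up here) that encodes this case distinction and checks that every $g \in [-1, 1]$ indeed satisfies the subgradient inequality at $x_0 = 0$:

```python
import numpy as np

def subdifferential_abs(x0):
    """Subdifferential of f(x) = |x|, returned as the closed interval [lo, hi]."""
    if x0 < 0:
        return (-1.0, -1.0)
    if x0 > 0:
        return (1.0, 1.0)
    return (-1.0, 1.0)        # at 0, every slope in [-1, 1] supports the graph

xs = np.linspace(-5.0, 5.0, 101)
for g in np.linspace(-1.0, 1.0, 11):
    # subgradient inequality at x0 = 0:  |x| >= |0| + g * (x - 0)
    assert np.all(np.abs(xs) >= g * xs - 1e-12)
print(subdifferential_abs(0.0))   # (-1.0, 1.0)
```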
Boundedness
Let $f\colon \mathbb{R}^n \to \mathbb{R}$ be continuous and let $X \subset \mathbb{R}^n$ be bounded. Then the set

$$\bigcup_{x_0 \in X} \partial f(x_0)$$

is bounded.
Proof
Let $f\colon \mathbb{R}^n \to \mathbb{R}$ be continuous and let $X \subset \mathbb{R}^n$ be bounded. Set

$$\varepsilon := \sup |f(\overline{U_1(X)})|,$$

where $\overline{U_1(X)} = \{x \in \mathbb{R}^n \mid \mathrm{dist}(x, X) \leq 1\}$. Suppose that $\bigcup_{x_0 \in X} \partial f(x_0)$ is not bounded. Then for $R := 2\varepsilon$ there exist an $x_0 \in X$ and a $g \in \partial f(x_0)$ with $\|g\|_2 > R = 2\varepsilon$. Let $x := \frac{1}{\|g\|_2} g + x_0$. Then $x_0, x \in \overline{U_1(X)}$. We obtain the estimate

$$g^T (x - x_0) = \frac{1}{\|g\|_2} g^T g = \|g\|_2 > 2\varepsilon \geq |f(x) - f(x_0)| \geq f(x) - f(x_0),$$

which violates the subgradient inequality $f(x) \geq f(x_0) + g^T (x - x_0)$, so $g$ is not a subgradient. This is a contradiction. $\Box$
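As a small illustration of the boundedness statement above (added here, not part of the original article), take the differentiable convex function $f(x) = x^2$, whose only subgradient at $x_0$ is the derivative $2x_0$: over the bounded set $X = [-3, 3]$ the subgradients stay within $[-6, 6]$, while over all of $\mathbb{R}$ they are unbounded.

```python
import numpy as np

f = lambda x: x**2
subgrad = lambda x0: 2.0 * x0          # f is differentiable, so its only subgradient is f'(x0)

X = np.linspace(-3.0, 3.0, 601)        # a bounded set X in R
union_norms = [abs(subgrad(x0)) for x0 in X]
print(max(union_norms))                # 6.0 -- the union of subdifferentials over X is bounded
print(abs(subgrad(1e6)))               # 2000000.0 -- no bound if x0 ranges over all of R
```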
Literature
R. T. Rockafellar: Convex Analysis. 1970, p. 214.
R. T. Rockafellar: Convex Analysis. 1970, p. 215.