Fritz John Conditions
The Fritz John conditions (abbreviated FJ conditions) are, in mathematics, a necessary first-order optimality criterion in nonlinear optimization. They are a generalization of the Karush-Kuhn-Tucker conditions and, in contrast to these, do not require any regularity conditions (constraint qualifications). They are named after the German-born American mathematician Fritz John.
Framework
The Fritz John conditions allow statements about an optimization problem of the form

- min f(x)

under the constraints

- gᵢ(x) ≤ 0 for i = 1, …, m.

All functions f, g₁, …, gₘ : ℝⁿ → ℝ considered are continuously differentiable, and the feasible set S = {x ∈ ℝⁿ : gᵢ(x) ≤ 0 for i = 1, …, m} is a non-empty subset of ℝⁿ.
Statement
A point x̄ ∈ ℝⁿ together with multipliers μ₀, μ₁, …, μₘ ∈ ℝ is called a Fritz John point, or FJ point for short, of the above optimization problem if it meets the following conditions:

- μ₀ ∇f(x̄) + μ₁ ∇g₁(x̄) + … + μₘ ∇gₘ(x̄) = 0
- μᵢ ≥ 0 for i = 0, …, m
- μᵢ gᵢ(x̄) = 0 for i = 1, …, m
- gᵢ(x̄) ≤ 0 for i = 1, …, m

These conditions are called the Fritz John conditions, or FJ conditions for short.
If the point x̄ is a local minimum of the optimization problem, then there exist multipliers μ = (μ₀, …, μₘ) such that (x̄, μ) is an FJ point and μ is not equal to the zero vector. Thus the FJ conditions are a necessary first-order optimality criterion.
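The four conditions above can be checked mechanically at a candidate point. The following sketch does so for a small illustrative problem (minimize x₁ + x₂ over the disk x₁² + x₂² ≤ 2 — an example of ours, not taken from this article), whose minimum at (-1, -1) is easy to verify by hand; all function names are our own.

```python
# Illustrative problem:  min x1 + x2  s.t.  g(x) = x1^2 + x2^2 - 2 <= 0.
# The minimum is x* = (-1, -1), where the constraint is active.

def grad_f(x):      # gradient of f(x) = x1 + x2
    return [1.0, 1.0]

def g(x):           # constraint function g(x) = x1^2 + x2^2 - 2
    return x[0]**2 + x[1]**2 - 2.0

def grad_g(x):      # gradient of g
    return [2.0 * x[0], 2.0 * x[1]]

def is_fj_point(x, mu0, mu1, tol=1e-9):
    """Check the four FJ conditions for a single inequality constraint."""
    stationarity = all(
        abs(mu0 * df + mu1 * dg) <= tol
        for df, dg in zip(grad_f(x), grad_g(x))
    )
    nonneg = mu0 >= 0 and mu1 >= 0        # multipliers non-negative
    complementarity = abs(mu1 * g(x)) <= tol
    feasible = g(x) <= tol
    return stationarity and nonneg and complementarity and feasible

x_star = [-1.0, -1.0]
print(is_fj_point(x_star, mu0=1.0, mu1=0.5))   # True: (x*, 1, 1/2) is an FJ point
print(is_fj_point(x_star, mu0=1.0, mu1=2.0))   # False: stationarity fails
```

Since μ₀ = 1 here, this FJ point is already a KKT point, matching the relationship described in the next section.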
Relationship to the Karush-Kuhn-Tucker Conditions
For μ₀ = 1 the FJ conditions correspond exactly to the Karush-Kuhn-Tucker conditions. If (x̄, μ₀, μ₁, …, μₘ) is an FJ point, then so is (x̄, cμ₀, cμ₁, …, cμₘ) for every c > 0. One can therefore assume that a KKT point is already present whenever μ₀ > 0: it is obtained by rescaling with c = 1/μ₀, and (x̄, μ₁/μ₀, …, μₘ/μ₀) is then the KKT point belonging to the FJ point. Conversely, the constraint qualifications of the KKT conditions can be interpreted as guaranteeing that μ₀ > 0, so that the FJ conditions yield a KKT point.
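Assuming μ₀ > 0, the rescaling step is a one-liner. The helper below (a hypothetical name of ours, not from any library) divides all FJ multipliers by μ₀ so that the leading multiplier becomes 1:

```python
def rescale_to_kkt(mu):
    """Given FJ multipliers (mu0, mu1, ..., mum) with mu0 > 0, return the
    rescaled multipliers (1, mu1/mu0, ..., mum/mu0) of the KKT point."""
    mu0 = mu[0]
    if mu0 <= 0:
        raise ValueError("mu0 must be positive; no KKT point is recoverable")
    return [m / mu0 for m in mu]

print(rescale_to_kkt([2.0, 1.0, 4.0]))  # [1.0, 0.5, 2.0]
```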
Examples
FJ without KKT
As an example, consider the optimization problem
with restriction set
- .
The minimum of the problem is the point . Hence there is an FJ point such that
- .
It follows directly that μ₀ = 0 for any FJ point.
In particular, there is no associated KKT point: if one sets μ₀ = 1, the system of equations for the gradients cannot be solved. In fact, no regularity condition is fulfilled at this point, not even the most general one, the Abadie CQ.
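Since the example's formulas were lost, here is the classical textbook instance of this phenomenon (an assumption of ours, not necessarily the example originally used): min x₁ subject to g₁(x) = x₂ - (1 - x₁)³ ≤ 0 and g₂(x) = -x₂ ≤ 0, with minimum x* = (1, 0). The first component of the stationarity equation forces μ₀ = 0:

```python
# Classical instance where FJ holds only with mu0 = 0 (so no KKT point):
#   min x1  s.t.  g1(x) = x2 - (1 - x1)^3 <= 0,  g2(x) = -x2 <= 0.
x = (1.0, 0.0)                            # the minimum
grad_f  = (1.0, 0.0)
grad_g1 = (3.0 * (1.0 - x[0])**2, 1.0)    # = (0, 1) at x*
grad_g2 = (0.0, -1.0)

# Stationarity: mu0*grad_f + mu1*grad_g1 + mu2*grad_g2 = 0.
# The first component reads mu0 * 1 = 0, so mu0 = 0; the second gives mu1 = mu2.
mu0, mu1, mu2 = 0.0, 1.0, 1.0
residual = tuple(
    mu0 * df + mu1 * dg1 + mu2 * dg2
    for df, dg1, dg2 in zip(grad_f, grad_g1, grad_g2)
)
print(residual)  # (0.0, 0.0): an FJ point, but mu0 = 0 rules out a KKT point
```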
FJ and KKT
As an example, consider the optimization problem
with restriction set
- .
The restriction set is the unit circle with the circular arc in the first quadrant removed. The minimum of the problem is the point . Hence there is an FJ point such that
applies. One solution would be , which leads to the FJ point . Rescaling with 1/μ₀ then leads to the KKT point . In fact, the LICQ is also fulfilled at this point, which is why the KKT conditions apply here as well.
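The LICQ mentioned here requires the gradients of the active constraints to be linearly independent. For two constraints in ℝ² this amounts to a 2×2 determinant test; the sketch below uses illustrative gradients of our own, not the article's lost example:

```python
def licq_holds(active_gradients, tol=1e-9):
    """LICQ check for exactly two active constraint gradients in R^2:
    they are linearly independent iff the 2x2 determinant is nonzero."""
    (a1, a2), (b1, b2) = active_gradients
    return abs(a1 * b2 - a2 * b1) > tol

# Two active constraints at an illustrative corner point:
print(licq_holds([(1.0, 0.0), (0.0, 1.0)]))  # True: LICQ holds
print(licq_holds([(1.0, 1.0), (2.0, 2.0)]))  # False: parallel gradients
```

When LICQ holds at a local minimum, the KKT conditions are guaranteed, which matches the rescaling argument above.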
Related concepts
For convex optimization problems in which the functions are not continuously differentiable, there are the saddle point criteria of the Lagrange function. If all functions involved are continuously differentiable, these criteria are structurally similar to the Fritz John conditions and equivalent to the KKT conditions.
Literature
- Florian Jarre, Josef Stoer: Optimization. Springer, Berlin 2004, ISBN 3-540-43575-1 .
- C. Geiger, C. Kanzow: Theorie und Numerik restringierter Optimierungsaufgaben. Springer, 2002, ISBN 3-540-42790-2.
Web links
- Stephen Boyd, Lieven Vandenberghe: Convex Optimization . (PDF; English)
References
- ↑ F. John: Extremum problems with inequalities as subsidiary conditions. In: Kurt Friedrichs, Otto Neugebauer, J. J. Stoker (eds.): Studies and Essays, Courant Anniversary Volume. Wiley, 1948, pp. 187–204; reprinted in: Fritz John: Collected Papers. Birkhäuser, 1985, pp. 543–560.