Optimality criterion

From Wikipedia, the free encyclopedia

Optimality criteria play an important role in mathematical optimization. They are classified by their strength and by the assumptions they require, and they are used either to find candidate optimal points of a problem (necessary criteria) or to decide whether a point that has been found is in fact optimal (sufficient criteria).

Definition

Let x be a feasible point of an optimization problem and let K be a certain criterion. K is called a necessary optimality criterion for a certain class of problems if the following implication holds:

x is optimal ⟹ K(x) holds,

or, in contrapositive form,

K(x) does not hold ⟹ x is not optimal.

K is called a sufficient optimality criterion if the following implication holds:

K(x) holds ⟹ x is optimal,

or, in contrapositive form,

x is not optimal ⟹ K(x) does not hold.

An optimality criterion is called a first-order optimality criterion (also first-order condition, or FOC for short) if it places requirements on the first derivatives of the functions involved. Accordingly, a second-order optimality criterion (also second-order condition, or SOC for short) places requirements on the second derivatives. In some cases, requirements are also placed on higher derivatives.

Note that what exactly "optimal" means has not yet been specified: it can refer to a maximum or a minimum, and to global or local optimality.

Examples

First order necessary

A typical example of a necessary first-order optimality criterion is found in unconstrained optimization. If a continuously differentiable function f attains a local minimum at a point x, then the derivative vanishes at this point: f'(x) = 0. The problem class in this case is the minimization of continuously differentiable functions, and the notion of optimality is that of the local minimum.
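This condition can be checked numerically. The following minimal sketch (the quadratic test function and the tolerance are illustrative choices, not part of the article) approximates the derivative by central differences and confirms that it vanishes at a known local minimum:

```python
# Sketch: checking the necessary first-order condition f'(x) = 0
# at a known local minimum, using a central-difference approximation.

def derivative(f, x, h=1e-6):
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def f(x):
    # f(x) = (x - 2)**2 + 1 has its (local and global) minimum at x = 2.
    return (x - 2) ** 2 + 1

print(abs(derivative(f, 2.0)) < 1e-6)  # True: the derivative vanishes at the minimizer
```

The converse does not hold: the derivative may also vanish at points that are not minima, which is exactly why the criterion is only necessary.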

First order sufficient

A sufficient first-order criterion occurs in the minimization of strictly convex functions: if the derivative vanishes at a point, that point is a global minimum.
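As an illustration (the function f(x) = x² + exp(x) and the bisection setup are assumptions for this sketch, not taken from the article), the unique zero of the derivative of a strictly convex function is already the global minimizer:

```python
# Sketch: for the strictly convex f(x) = x**2 + exp(x), the unique
# stationary point (zero of f') is the global minimum.
import math

def f(x):
    return x * x + math.exp(x)

def fprime(x):
    return 2 * x + math.exp(x)

# Find the stationary point by bisection on f'; note f'(-1) < 0 < f'(0).
lo, hi = -1.0, 0.0
for _ in range(60):
    mid = (lo + hi) / 2
    if fprime(mid) < 0:
        lo = mid
    else:
        hi = mid
x_star = (lo + hi) / 2

# Every sampled point has a value >= f(x_star): consistent with a global minimum.
assert all(f(x_star) <= f(x_star + t) for t in [-5, -1, 0.5, 3, 10])
print(round(x_star, 4))
```

Here no second-derivative information is needed: strict convexity alone upgrades the stationarity condition from necessary to sufficient.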

Second order necessary

Determining inflection points uses a necessary second-order condition: if a twice differentiable function has an inflection point, its second derivative vanishes at that point.
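A minimal numerical sketch (the cubic test function and tolerance are illustrative assumptions): f(x) = x³ has its inflection point at x = 0, and the second derivative, approximated by central differences, vanishes there:

```python
# Sketch: the necessary second-order condition f''(x) = 0 at an
# inflection point, checked with a central-difference approximation.

def second_derivative(f, x, h=1e-4):
    """Central-difference approximation of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

f = lambda x: x ** 3  # inflection point at x = 0

print(abs(second_derivative(f, 0.0)) < 1e-6)  # True: f''(0) = 0
```

Again the condition is only necessary: f(x) = x⁴ also satisfies f''(0) = 0 yet has a minimum, not an inflection point, at 0.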

Second order sufficient

An example is the determination of a local minimum of a twice continuously differentiable function: if the first derivative is zero at a point and the second derivative is strictly greater than zero there, then the point is a local minimum.
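The following sketch applies this test to f(x) = cos(x) at x = π (the choice of function and tolerance are illustrative assumptions): there f'(π) = 0 and f''(π) = 1 > 0, so π is a local minimum.

```python
# Sketch: the sufficient second-order test f'(x) = 0 and f''(x) > 0,
# applied to f(x) = cos(x) at x = pi.
import math

f = math.cos
fprime = lambda x: -math.sin(x)   # f'(x)
fsecond = lambda x: -math.cos(x)  # f''(x)

x = math.pi
is_local_min = abs(fprime(x)) < 1e-12 and fsecond(x) > 0
print(is_local_min)  # True: pi is a local minimum of cos
```

Note that the test is sufficient but not necessary: f(x) = x⁴ has a local minimum at 0 even though f''(0) = 0.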

Important criteria of optimality

Among the most important optimality criteria in nonlinear optimization are the Karush-Kuhn-Tucker conditions. In the general case they are a necessary first-order criterion; however, they require certain regularity conditions (constraint qualifications) such as the Abadie CQ, the MFCQ, or the LICQ in order to be valid.

If the problem is convex, then the Karush-Kuhn-Tucker conditions are also sufficient for optimality. A somewhat weaker necessary optimality criterion is given by the Fritz John conditions; these require no additional regularity assumptions.
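As a hypothetical worked example (the specific problem and candidate point are illustrative assumptions, not from the article), consider the convex problem: minimize f(x) = x² subject to g(x) = 1 − x ≤ 0. At the candidate x = 1 with multiplier μ = 2 all four KKT conditions hold, and since the problem is convex they are sufficient, so x = 1 is the global minimum:

```python
# Sketch: verifying the KKT conditions for
#     minimize f(x) = x**2   subject to g(x) = 1 - x <= 0
# at the candidate point x = 1 with multiplier mu = 2.

x, mu = 1.0, 2.0

stationarity    = abs(2 * x + mu * (-1)) < 1e-12  # grad f + mu * grad g = 0
primal_feas     = (1 - x) <= 0                    # g(x) <= 0
dual_feas       = mu >= 0                         # mu >= 0
complementarity = abs(mu * (1 - x)) < 1e-12       # mu * g(x) = 0

print(all([stationarity, primal_feas, dual_feas, complementarity]))  # True
```

Because g is affine, a constraint qualification (here, LICQ) holds automatically, so the KKT conditions are indeed necessary for this problem as well.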
