Line search method


A line search method (also called a general descent method with directional search) is, in mathematical optimization, any of a family of iterative methods for finding a local minimum of a function. The basic idea is to minimize the function in each step along a chosen direction. Examples of algorithms belonging to the class of line search methods are the gradient method (method of steepest descent) and Newton's method.

Description

Common to all line search methods is the following basic procedure for computing a new intermediate result $x_{k+1}$ from an existing intermediate result $x_k$:

  1. Determine a search direction in the form of a vector $d_k$.
  2. Compute a step size $\alpha_k$ by minimizing the one-dimensional auxiliary function $h(\alpha) = f(x_k + \alpha d_k)$ with respect to $\alpha$. This step is also called a line search.
  3. Compute the new solution $x_{k+1} = x_k + \alpha_k d_k$.

The minimum of the auxiliary function in step 2 is usually not determined exactly. If it is, one speaks of an exact line search.
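
The following is a minimal sketch of this generic procedure, assuming a steepest-descent search direction $d_k = -\nabla f(x_k)$ and a simple backtracking rule based on the Armijo condition as the inexact line search; the function names and parameter values are illustrative, not taken from any particular source.

```python
# Minimal sketch: generic descent with an inexact (backtracking) line search.
# Assumes steepest-descent directions; names and constants are illustrative.
import numpy as np

def backtracking_step(f, x, d, grad, alpha0=1.0, rho=0.5, c=1e-4):
    """Inexact line search: shrink alpha until the Armijo condition holds."""
    alpha = alpha0
    while f(x + alpha * d) > f(x) + c * alpha * grad @ d:
        alpha *= rho
    return alpha

def descent_with_line_search(f, grad_f, x0, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:            # stop near a stationary point
            break
        d = -g                                  # step 1: search direction d_k
        alpha = backtracking_step(f, x, d, g)   # step 2: step size alpha_k via line search
        x = x + alpha * d                       # step 3: x_{k+1} = x_k + alpha_k d_k
    return x

# Example: minimize the quadratic f(x) = x_1^2 + 10 x_2^2, whose minimum is at (0, 0).
f = lambda x: x[0]**2 + 10 * x[1]**2
grad_f = lambda x: np.array([2 * x[0], 20 * x[1]])
print(descent_with_line_search(f, grad_f, [3.0, -2.0]))
```

Replacing the backtracking rule with an exact minimization of $h(\alpha)$ would turn step 2 into an exact line search; choosing $d_k$ from the Newton system instead of the negative gradient would yield Newton's method with line search.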
