Line search method
The line search methods (also called general descent methods with directional search) are a class of iterative methods in optimization for finding a local minimum of a function. The basic idea is to minimize the function along a chosen direction in each step. Examples of algorithms that belong to the line search methods are the gradient method (method of steepest descent) and Newton's method.
Description
Common to all line search methods is the following basic procedure for calculating a new iterate x_{k+1} from an existing iterate x_k:
- Determine a search direction in the form of a vector d_k.
- Calculate a step size α_k by minimizing the one-dimensional auxiliary function h(α) = f(x_k + α·d_k) with respect to α. This step is also called a line search.
- Calculate the new iterate x_{k+1} = x_k + α_k·d_k.
The minimum of the auxiliary function in step 2 is usually only determined approximately. If it is determined exactly, one speaks of an exact line search.
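The three-step procedure can be sketched in code. The following is a minimal illustration, not a reference implementation: it uses the negative gradient as search direction (steepest descent) and a backtracking line search based on the Armijo condition as an inexact line search; the function `line_search_descent`, the quadratic test function, and the parameter values are illustrative assumptions, not part of the source.

```python
import numpy as np

def line_search_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                        tol=1e-8, max_iter=1000):
    """Sketch of a descent method with a backtracking (Armijo) line search.

    The search direction chosen here is the negative gradient (steepest
    descent); other choices, e.g. the Newton direction, fit the same scheme.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                      # step 1: determine a search direction d_k
        alpha = alpha0              # step 2: line search on h(α) = f(x + α·d)
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= beta           # shrink the step until the Armijo condition holds
        x = x + alpha * d           # step 3: new iterate x_{k+1} = x_k + α_k·d_k
    return x

# Example: minimize f(x, y) = x² + 2y², whose minimum is at the origin.
f = lambda x: x[0]**2 + 2 * x[1]**2
grad = lambda x: np.array([2 * x[0], 4 * x[1]])
x_min = line_search_descent(f, grad, [3.0, -2.0])
```

The backtracking loop is one common inexact line search; an exact line search would instead minimize h(α) precisely, which is rarely worth the cost in practice.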