The aim of this paper is to present a novel, transparent approach to a well-established field: the deep methods and applications of the complete analysis of continuous optimization problems. Standard descents give a unified approach to all standard necessary conditions, including the Lagrange multiplier rule, the Karush–Kuhn–Tucker conditions and the second-order conditions. Nonstandard descents lead to new necessary conditions. These can be used to give surprising proofs of deep central results in fields that are generally viewed as distinct from optimization: the fundamental theorem of algebra, the maximum and minimum principles of complex function theory, the separation theorems for convex sets, the orthogonal diagonalization of symmetric matrices and the implicit function theorem. These optimization proofs compare favorably with the usual proofs and are all based on the same strategy. The paper is addressed to practitioners of optimization methods in many fields who are interested in fully understanding the foundations of these methods and of the central results above.
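As a rough illustration of the descent viewpoint (a sketch in assumed notation, not the formulation used in the paper): at a local minimizer \bar{x} of \min f(x) subject to g_i(x) \le 0, i = 1, \dots, m, with f and the g_i differentiable, there can be no direction d with \nabla f(\bar{x})^\top d < 0 and \nabla g_i(\bar{x})^\top d < 0 for every active constraint i, since a small step along such a d would remain feasible and decrease f. By Gordan's theorem of the alternative, this nonexistence is equivalent to the Fritz John conditions

  \mu_0 \nabla f(\bar{x}) + \sum_{i \in A(\bar{x})} \lambda_i \nabla g_i(\bar{x}) = 0, \qquad (\mu_0, \lambda) \ge 0, \quad (\mu_0, \lambda) \ne 0, \qquad A(\bar{x}) = \{\, i : g_i(\bar{x}) = 0 \,\},

and under a constraint qualification one may take \mu_0 = 1, which yields the Karush–Kuhn–Tucker conditions.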

DOI: doi.org/10.1016/j.ejor.2006.06.008 · Handle: hdl.handle.net/1765/19260
Published in: European Journal of Operational Research
Collections: ERIM Top-Core Articles; Econometric Institute Reprint Series
Erasmus Research Institute of Management

Brinkhuis, J. (2007). Descent: An optimization point of view on different fields. European Journal of Operational Research, 181(1), 10–19. doi:10.1016/j.ejor.2006.06.008