This book presents basic optimization principles and gradient-based algorithms to a general audience in a brief and easy-to-read form, without neglecting rigor. The work should enable professionals to apply optimization theory and algorithms to their own practical fields of interest, whether engineering, physics, chemistry, or business economics. Most importantly, for the first time in a relatively brief and introductory work, due attention is paid to the difficulties that often unnecessarily inhibit the use of gradient-based methods, such as noise, discontinuities, the expense of function evaluations, and the existence of multiple minima. In a separate chapter on new gradient-based methods developed by the author and his coworkers, it is shown how these difficulties may be overcome without losing the desirable features of classical gradient-based methods.
Preface
Table of Notation
Chapter 1. Introduction
Chapter 2. Line Search Descent Methods for Unconstrained Minimization
Chapter 3. Standard Methods for Constrained Optimization
Chapter 4. New Gradient-Based Trajectory and Approximation Methods
Chapter 5. Example Problems
Chapter 6. Some Theorems
Chapter 7. The Simplex Method for Linear Programming Problems
Bibliography
Index