A simple three-term conjugate gradient algorithm for unconstrained optimization

  • Authors: Neculai Andrei
  • Published in: Journal of Computational and Applied Mathematics, ISSN 0377-0427, Vol. 241, No. 1, 2013, pp. 19-29
  • Language: English
  • DOI: 10.1016/j.cam.2012.10.002
  • Full text not available
  • Abstract
    • A simple three-term conjugate gradient algorithm which satisfies both the descent condition and the conjugacy condition is presented. The algorithm modifies the Hestenes and Stiefel algorithm (Hestenes and Stiefel, 1952) [10], or that of Hager and Zhang (Hager and Zhang, 2005) [23], so that the search direction is a descent direction and satisfies the conjugacy condition. These properties hold independently of the line search (a sketch of a three-term direction of this kind appears after the abstract).

      Also, the algorithm can be viewed as a modification of the memoryless BFGS quasi-Newton method. The new approximation of the minimum is obtained by a general Wolfe line search, combined with a standard acceleration technique developed by Andrei (2009) [27] (see the acceleration sketch below).

      For uniformly convex functions, under standard assumptions, the global convergence of the algorithm is proved. Numerical comparisons of the suggested three-term conjugate gradient algorithm against six other three-term conjugate gradient algorithms, on a set of 750 unconstrained optimization problems, show that all these computational schemes have similar performance, the suggested one being slightly faster and more robust. The proposed three-term conjugate gradient algorithm substantially outperforms the well-known Hestenes and Stiefel conjugate gradient algorithm, as well as the more elaborate CG_DESCENT algorithm. Using five applications from the MINPACK-2 test problem collection (Averick et al., 1992) [25], with 10^6 variables, we show that the suggested three-term conjugate gradient algorithm is the top performer versus CG_DESCENT.
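
      The abstract does not reproduce the direction formulas, but since the algorithm is described as a modification of the memoryless BFGS method, a generic three-term direction of the form d = -g + beta*s + eta*y illustrates the structure. The Python sketch below uses the classical memoryless BFGS coefficients (with SciPy's Wolfe line search); the actual beta and eta of Andrei's algorithm differ and are given in the paper, so this is a minimal sketch of the family, not the published method.

```python
import numpy as np
from scipy.optimize import line_search

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic three-term CG iteration: d_{k+1} = -g + beta*s + eta*y.
    # The coefficients below are the classical memoryless BFGS choice
    # (H0 = I); Andrei's algorithm uses different beta/eta (see paper).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Wolfe line search (SciPy's implementation).
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4  # crude fallback if the line search fails
        s = alpha * d                 # s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                 # y_k = g_{k+1} - g_k
        ys = y @ s
        if abs(ys) < 1e-12:           # safeguard: restart with steepest descent
            d = -g_new
        else:
            beta = (g_new @ y) / ys - (1.0 + (y @ y) / ys) * (g_new @ s) / ys
            eta = (g_new @ s) / ys
            d = -g_new + beta * s + eta * y
        x, g = x_new, g_new
    return x

# Example on a small convex quadratic:
# xmin = three_term_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(5))
```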
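
      The "standard acceleration technique" of Andrei (2009) [27] rescales the Wolfe step length using a one-dimensional quadratic model of the objective along the search direction, built from one extra gradient evaluation. The sketch below is a reconstruction of that general idea; the exact formulas in [27] may differ in detail.

```python
import numpy as np

def accelerated_step(grad, x, d, alpha):
    # Rescale the Wolfe step alpha*d by the minimizer of a quadratic
    # model of f along d, estimated from two gradient evaluations.
    # Reconstruction of the general idea in Andrei (2009); the
    # published formulas may differ in detail.
    g = grad(x)
    z = x + alpha * d
    gz = grad(z)
    a = alpha * (g @ d)            # slope term; negative for a descent d
    b = alpha * ((gz - g) @ d)     # curvature estimate along d
    if b > 0:                      # convex model: take its minimizer
        return x + (-a / b) * alpha * d
    return z                       # otherwise keep the plain Wolfe step
```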

