Yasushi Narushima, Hiroshi Yabe
Conjugate gradient methods have attracted much attention because they can be applied directly to large-scale unconstrained optimization problems. To incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method that always generates a descent search direction.
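For context, both cited methods fit the standard nonlinear conjugate gradient framework. The following LaTeX sketch records the iteration and the two beta formulas as they are usually stated, assuming the common notation g_k = \nabla f(x_k), s_k = x_{k+1} - x_k, and y_k = g_{k+1} - g_k:

\[ x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0. \]
% Dai--Liao (2001): beta derived from the secant condition; t > 0 is a parameter.
\[ \beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top}\,(y_k - t\, s_k)}{d_k^{\top} y_k}. \]
% Hager--Zhang (2005): beta guaranteeing the sufficient descent property
% g_{k+1}^{\top} d_{k+1} \le -\tfrac{7}{8}\,\|g_{k+1}\|^2.
\[ \beta_k^{\mathrm{HZ}} = \frac{1}{d_k^{\top} y_k}\left( y_k - 2\, d_k\, \frac{\|y_k\|^2}{d_k^{\top} y_k} \right)^{\top} g_{k+1}. \]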
In this paper, combining the ideas of Dai and Liao with those of Hager and Zhang, we propose conjugate gradient methods based on secant conditions that generate descent search directions.
In addition, we prove global convergence properties of the proposed methods. Finally, preliminary numerical results are given.
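As a rough illustration only, and not the authors' exact method, the following Python sketch shows a generic conjugate gradient iteration in this spirit: a Dai-Liao-type beta computed from the secant vectors, with a fallback to the steepest descent direction whenever a sufficient descent test fails. The function names, the parameters t and c, the Armijo line search, and the fallback rule are all illustrative assumptions.

import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c1=1e-4):
    """Simple Armijo backtracking; a placeholder for a Wolfe line search."""
    fx, gx = f(x), grad(x)
    while f(x + alpha * d) > fx + c1 * alpha * (gx @ d):
        alpha *= rho
        if alpha < 1e-16:
            break
    return alpha

def cg_descent_sketch(f, grad, x0, t=1.0, c=1e-4, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao-type CG sketch with a descent safeguard.

    Illustrative only: the beta formula and the fallback rule are
    assumptions, not the methods proposed in this paper.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = backtracking_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        # Dai-Liao-type beta built from the secant vectors s and y.
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d_new = -g_new + beta * d
        # Safeguard: restart with steepest descent if the sufficient
        # descent condition g^T d <= -c ||g||^2 fails.
        if g_new @ d_new > -c * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

if __name__ == "__main__":
    # Tiny usage example: a convex quadratic with known minimizer A^{-1} b.
    A = np.diag([1.0, 10.0, 100.0])
    b = np.ones(3)
    f = lambda x: 0.5 * (x @ A @ x) - b @ x
    grad = lambda x: A @ x - b
    print(cg_descent_sketch(f, grad, np.zeros(3)))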