A descent hybrid conjugate gradient method based on the memoryless BFGS update

I.E. Livieris, V. Tampakas and P. Pintelas. A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numerical Algorithms, 2018.



Abstract - In this work, we present a new hybrid conjugate gradient method based on the approach of the convex hybridization of the conjugate gradient update parameters of DY and HS+, adapting a quasi-Newton philosophy. The computation of the hybridization parameter is obtained by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of our proposed method is that it ensures sufficient descent independent of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that our proposed method is preferable and in general superior to classic conjugate gradient methods in terms of efficiency and robustness.
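To illustrate the kind of scheme the abstract describes, the sketch below implements a generic hybrid conjugate gradient loop: the search direction uses a convex combination of the DY and HS+ update parameters, and the combination weight is chosen by a least-squares fit of the hybrid direction to a memoryless BFGS direction. This is not the paper's exact formula for the hybridization parameter or its scaling; the projection formula for `theta`, the restart safeguards, and the fallback step size are assumptions made for a self-contained demonstration. The line search uses SciPy's Wolfe-condition `line_search`.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der


def memoryless_bfgs_direction(g, s, y):
    """Direction -H g, where H is the (unscaled) memoryless BFGS
    matrix built from the last step s and gradient change y."""
    sy = s @ y
    if sy <= 1e-12:           # curvature condition failed: fall back
        return -g
    Hg = (g
          - (y @ g) / sy * s
          - (s @ g) / sy * y
          + (1.0 + (y @ y) / sy) * (s @ g) / sy * s)
    return -Hg


def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:     # Wolfe search failed; tiny fallback step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dy = d @ y
        if abs(dy) < 1e-12:   # degenerate denominator: restart
            d_new = -g_new
        else:
            beta_dy = (g_new @ g_new) / dy          # Dai-Yuan parameter
            beta_hsp = max(0.0, (g_new @ y) / dy)   # HS+ parameter
            # Hybrid direction is affine in theta:
            #   d(theta) = d_DY + theta * (beta_HS+ - beta_DY) * d
            d_dy = -g_new + beta_dy * d
            v = (beta_hsp - beta_dy) * d
            d_bfgs = memoryless_bfgs_direction(g_new, s, y)
            vv = v @ v
            # Illustrative choice: theta minimizing ||d(theta) - d_bfgs||,
            # clipped to [0, 1] to keep the combination convex.
            theta = np.clip(((d_bfgs - d_dy) @ v) / vv, 0.0, 1.0) if vv > 1e-16 else 0.0
            d_new = d_dy + theta * v
        if d_new @ g_new >= 0.0:  # safeguard: enforce a descent direction
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x


# Example: minimize the Rosenbrock function from the standard start point.
x_star = hybrid_cg(rosen, rosen_der, [-1.2, 1.0])
```

On this example the iterates approach the minimizer (1, 1); the safeguards (restart on a degenerate denominator and fallback to the steepest-descent direction) stand in for the sufficient-descent guarantee the paper establishes analytically.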

