A modified Perry conjugate gradient method and its global convergence
I.E. Livieris and P. Pintelas, A modified Perry conjugate gradient method and its global convergence. Optimization Letters, Volume 218, Number 18, pp. 9197-9207, 2014.

Abstract - In this paper, we propose a new class of conjugate gradient algorithms for training neural networks, based on a modified nonmonotone scheme proposed by Shi and Wang. The nonmonotone strategy enables the training algorithm to overcome the case where the sequence of iterates runs into the bottom of a curved narrow valley, a common occurrence in the neural network training process. Our proposed class of methods ensures sufficient descent, thereby avoiding the usual inefficient restarts, and is globally convergent under mild conditions. Our experimental results provide evidence that the proposed nonmonotone conjugate gradient training methods are efficient, outperforming classical methods and providing more stable, efficient, and reliable learning.

