
An advanced conjugate gradient training algorithm based on a modified secant equation

I.E. Livieris and P. Pintelas, An Advanced Conjugate Gradient Training Algorithm Based on a Modified Secant Equation, ISRN Artificial Intelligence, 2012.


Abstract - Conjugate gradient methods constitute excellent neural network training methods, characterized by their simplicity, numerical efficiency, and very low memory requirements. In this paper, we propose a conjugate gradient neural network training algorithm which guarantees sufficient descent using any line search, thereby avoiding the usually inefficient restarts. Moreover, it achieves high-order accuracy in approximating the second-order curvature information of the error surface by utilizing the modified secant condition proposed by Li et al. (J. Comput. Appl. Math. 202(2):523--539, 2007). Under mild conditions, we establish that the proposed method is globally convergent for general functions under the strong Wolfe conditions. Experimental results provide evidence that our proposed method is preferable to, and in general superior to, the classical conjugate gradient methods, and has the potential to significantly enhance the computational efficiency and robustness of the training process.
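The general scheme the abstract describes (a nonlinear conjugate gradient iteration whose secant vector also uses function values, not just gradient differences) can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the coefficient in `modified_y`, the Hestenes-Stiefel-type beta, and the Armijo backtracking search (standing in for a strong Wolfe search) are all assumptions made for the sketch, and the test problem is a simple quadratic rather than a neural network error surface.

```python
import numpy as np

def modified_y(s, y, f_old, f_new, g_old, g_new):
    """Secant vector augmented with function-value information,
    in the spirit of modified secant conditions (illustrative form)."""
    theta = 2.0 * (f_old - f_new) + (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s

def cg_train(f, grad, x0, iters=500, tol=1e-8):
    """Nonlinear CG sketch: Armijo backtracking line search,
    HS-type beta built from the modified secant vector, and a
    steepest-descent restart whenever descent is lost."""
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (stand-in for strong Wolfe).
        alpha, c1, fx = 1.0, 1e-4, f(x)
        while f(x + alpha * d) > fx + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        yt = modified_y(s, y, fx, f(x_new), g, g_new)
        # HS-type beta using the modified secant vector (illustrative).
        denom = d @ yt
        beta = (g_new @ yt) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:      # restart if d is not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Toy strictly convex quadratic as a stand-in for an error surface.
A = np.diag([1.0, 4.0, 9.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = cg_train(f, grad, np.array([1.0, 1.0, 1.0]))
```

On this quadratic the iterate is driven to the minimizer at the origin; on a real network one would replace `f` and `grad` with the training error and its gradient over the weights.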


