An advanced conjugate gradient training algorithm based on a modified secant equation

I.E. Livieris and P. Pintelas, An Advanced Conjugate Gradient Training Algorithm Based on a Modified Secant Equation, ISRN Artificial Intelligence, 2012.

 

Abstract - Conjugate gradient methods constitute excellent neural network training methods, characterized by their simplicity, numerical efficiency and very low memory requirements. In this paper, we propose a conjugate gradient neural network training algorithm which guarantees sufficient descent with any line search, thereby avoiding the usually inefficient restarts. Moreover, it achieves high-order accuracy in approximating the second-order curvature information of the error surface by utilizing the modified secant condition proposed by Li et al. (J. Comput. Appl. Math. 202(2):523--539, 2007). Under mild conditions, we establish that the proposed method is globally convergent for general functions under the strong Wolfe conditions. Experimental results provide evidence that our proposed method is preferable to, and in general superior to, the classical conjugate gradient methods, and has the potential to significantly enhance the computational efficiency and robustness of the training process.
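To illustrate the family of methods the abstract refers to, the following is a minimal sketch of a generic nonlinear conjugate gradient minimizer with a Polak-Ribiere+ update and a backtracking (Armijo) line search. It is an assumption-laden illustration of classical CG only: it does not reproduce the paper's modified-secant-based update or the strong Wolfe line search used in the proposed algorithm, and all function and variable names here are hypothetical.

```python
def cg_minimize(f, grad, x0, iters=200, tol=1e-8):
    """Generic nonlinear CG sketch (Polak-Ribiere+), NOT the paper's method."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                      # initial direction: steepest descent
    for _ in range(iters):
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 < tol:                       # stop when the gradient is small
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:                         # safeguard: restart if d is not a descent direction
            d = [-gi for gi in g]
            slope = -gnorm2
        # Backtracking (Armijo) line search along d.
        t, fx = 1.0, f(x)
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Polak-Ribiere+ beta, clipped at zero (an implicit restart).
        beta = max(0.0, sum(gn * (gn - gi) for gn, gi in zip(g_new, g)) / gnorm2)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Usage: minimize a simple convex quadratic f(x, y) = x^2 + 10*y^2.
f = lambda v: v[0] ** 2 + 10 * v[1] ** 2
grad = lambda v: [2 * v[0], 20 * v[1]]
x_star = cg_minimize(f, grad, [3.0, -2.0])
```

The clipping of beta at zero plays the same role as the restarts the abstract mentions: whenever the conjugacy information degrades, the direction falls back toward steepest descent. The paper's contribution replaces this heuristic with an update that guarantees sufficient descent for any line search.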

 

 
