Performance evaluation of descent CG methods for neural network training

I.E. Livieris and P. Pintelas, Performance Evaluation of Descent CG Methods for Neural Network Training, In Proceedings of the 9th Hellenic European Research on Computer Mathematics & its Applications Conference (HERCMA 2009), vol. 11, pp. 40-46, Athens, 2009.


Abstract - Conjugate gradient methods constitute an excellent choice for efficiently training large neural networks, since they require neither the evaluation of the Hessian matrix nor the impractical storage of an approximation of it. Despite the theoretical and practical advantages of these methods, their main drawback is the use of restarting procedures in order to guarantee convergence, which abandons second-order derivative information. In this work, we evaluate the performance of a new class of conjugate gradient methods and propose a new algorithm for training neural networks. The presented algorithm preserves the advantages of classical conjugate gradient methods while avoiding inefficient restarts. Encouraging numerical experiments verify that the presented algorithm provides fast, stable and reliable convergence.
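For context, the following is a minimal sketch of how a nonlinear conjugate gradient loop trains a small network. It uses the classical Polak-Ribiere+ update, whose max(0, .) clamp acts as an implicit restart, and is not the algorithm proposed in the paper; the network sizes, data, and step-size rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative regression data and a tiny one-hidden-layer network.
X = rng.normal(size=(32, 3))
y = np.sin(X.sum(axis=1, keepdims=True))

shapes = [(3, 8), (1, 8), (8, 1), (1, 1)]          # W1, b1, W2, b2
sizes = [int(np.prod(s)) for s in shapes]

def unpack(w):
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def loss_grad(w):
    """Mean-squared-error loss and its gradient w.r.t. the flat weight vector."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                        # forward pass
    err = h @ W2 + b2 - y
    loss = 0.5 * np.mean(err ** 2)
    d_out = err / len(X)                            # backward pass
    gW2, gb2 = h.T @ d_out, d_out.sum(0, keepdims=True)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    gW1, gb1 = X.T @ d_h, d_h.sum(0, keepdims=True)
    return loss, np.concatenate([a.ravel() for a in (gW1, gb1, gW2, gb2)])

w = rng.normal(scale=0.5, size=sum(sizes))
loss0, g = loss_grad(w)
d = -g
for k in range(200):
    if g @ d >= 0:                                  # safeguard: ensure descent
        d = -g
    loss, t = loss_grad(w)[0], 1.0
    while loss_grad(w + t * d)[0] > loss + 1e-4 * t * (g @ d):
        t *= 0.5                                    # backtracking (Armijo) line search
        if t < 1e-12:
            break
    w = w + t * d
    _, g_new = loss_grad(w)
    # Polak-Ribiere+ beta; the max(0, .) clamp restarts along -g when needed.
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))
    d = -g_new + beta * d
    g = g_new
loss_final = loss_grad(w)[0]
print(loss0, loss_final)
```

The only derivative information used is the gradient: the conjugate direction `d` is built recursively from successive gradients, which is what makes CG attractive for large networks where forming or storing a Hessian approximation is impractical.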
