Accelerating Symmetric Rank-1 Quasi-Newton Method with Nesterov’s Gradient for Training Neural Networks

Gradient-based methods are widely used for training neural networks and can be broadly categorized into first and second order methods. Second order methods have been shown to converge faster than first order methods, especially on highly nonlinear problems. The BFGS quasi-Newton...
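The title's method combines the symmetric rank-1 (SR1) quasi-Newton update with Nesterov's accelerated gradient, i.e., gradients are evaluated at a momentum look-ahead point rather than at the current iterate. The sketch below is a generic illustration of that combination, not the authors' exact algorithm; the function name, parameter values, and safeguards (update-skip test, descent fallback, backtracking line search) are all illustrative assumptions.

```python
import numpy as np

def nesterov_sr1(f, grad, x0, mu=0.8, iters=200, tol=1e-6):
    """Minimize f using an inverse SR1 quasi-Newton update with
    gradients taken at the Nesterov look-ahead point x + mu*v.
    Illustrative sketch only, not the paper's exact algorithm."""
    n = x0.size
    H = np.eye(n)                      # inverse-Hessian approximation
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros(n)                    # momentum term
    xa_prev, g_prev = None, None
    for _ in range(iters):
        xa = x + mu * v                # Nesterov look-ahead point
        g = grad(xa)
        if np.linalg.norm(g) < tol:
            break
        if g_prev is not None:
            # SR1 update from consecutive look-ahead points
            s, y = xa - xa_prev, g - g_prev
            r = s - H @ y
            denom = r @ y
            # standard SR1 safeguard: skip the update when the
            # denominator is too small relative to ||r||*||y||
            if abs(denom) > 1e-8 * np.linalg.norm(r) * np.linalg.norm(y):
                H = H + np.outer(r, r) / denom
        d = -H @ g                     # quasi-Newton direction
        if g @ d >= 0:                 # SR1 may be indefinite: fall back
            d = -g                     # to steepest descent
        # backtracking (Armijo) line search from the look-ahead point
        alpha, fa, gd = 1.0, f(xa), g @ d
        while f(xa + alpha * d) > fa + 1e-4 * alpha * gd and alpha > 1e-12:
            alpha *= 0.5
        v = mu * v + alpha * d         # momentum update
        x = x + v
        xa_prev, g_prev = xa, g
    return x

# Example: ill-conditioned quadratic f(x) = x1^2 + 5*x2^2
A = np.diag([2.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_min = nesterov_sr1(f, grad, np.array([1.0, 1.0]))
```

On this quadratic the inverse-Hessian approximation converges toward A⁻¹ within a few iterations, after which the method takes near-Newton steps while the momentum term decays.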


Bibliographic Details
Published in: Algorithms
Main Authors: S. Indrapriyadarsini, Shahrzad Mahboubi, Hiroshi Ninomiya, Takeshi Kamio, Hideki Asai
Format: Article
Language: English
Published: MDPI AG 2021-12-01
Online Access: https://www.mdpi.com/1999-4893/15/1/6