SC-Reg: Training Overparameterized Neural Networks under Self-Concordant Regularization

12/14/2021
by Adeyemi D. Adeoye, et al.

In this paper we propose SC-Reg, a self-concordant regularization framework for learning overparameterized feedforward neural networks that incorporates second-order information via the Newton decrement framework for convex problems. We propose the generalized Gauss-Newton with Self-Concordant Regularization (SCoRe-GGN) algorithm, which updates the network parameters each time it receives a new input batch. The algorithm exploits the structure of the second-order information in the Hessian matrix, reducing the computational overhead of training. Although our current analysis covers only the convex case, numerical experiments demonstrate the efficiency and fast convergence of our method in both convex and non-convex settings, comparing favorably against baseline first-order methods and a quasi-Newton method.
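As a rough illustration of the idea (not the authors' implementation), a damped generalized Gauss-Newton step for a regularized least-squares model might look like the sketch below. The model, the smooth stand-in regularizer R(w) = Σ(√(1 + wᵢ²) − 1), and the Newton-decrement damping 1/(1 + λₜ) are all assumptions chosen for illustration; the paper's actual algorithm, objective, and regularizer may differ.

```python
import numpy as np

def ggn_step(w, X, y, lam=0.1):
    """One damped generalized Gauss-Newton step (illustrative sketch only).

    Squared loss on a linear model, a smooth stand-in regularizer, and a
    step size damped by the Newton decrement, as is standard for
    self-concordant objectives.
    """
    r = X @ w - y                          # residuals
    J = X                                  # Jacobian of the linear model
    # Stand-in regularizer R(w) = sum(sqrt(1 + w^2) - 1): gradient and Hessian
    g_reg = w / np.sqrt(1.0 + w**2)
    H_reg = np.diag((1.0 + w**2) ** -1.5)
    grad = J.T @ r + lam * g_reg           # full gradient
    G = J.T @ J + lam * H_reg              # GGN matrix: loss curvature + reg Hessian
    step = np.linalg.solve(G, grad)
    decr = np.sqrt(grad @ step)            # Newton decrement lambda_t
    return w - step / (1.0 + decr)         # damped update

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w = np.zeros(3)
for _ in range(50):
    w = ggn_step(w, X, y)
```

The 1/(1 + λₜ) damping keeps early steps conservative when the decrement is large and approaches a full Newton step near the optimum, which is the usual mechanism behind global convergence guarantees for self-concordant functions.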
