Stochastic Second-order Methods for Non-convex Optimization with Inexact Hessian and Gradient

09/26/2018
by   Liu Liu, et al.

Trust-region and cubic regularization methods have demonstrated good performance in small-scale non-convex optimization, including the ability to escape saddle points. Each iteration of these methods computes the gradient, Hessian, and function value in order to obtain the search direction and adjust the trust-region radius or cubic regularization parameter. However, computing these quantities exactly is too expensive in large-scale problems such as training deep networks. In this paper, we study a family of stochastic trust-region and cubic regularization methods in which the gradient, Hessian, and function values are computed inexactly, and show that the iteration complexity to achieve ϵ-approximate second-order optimality is of the same order as in previous work where the gradient and function values are computed exactly. The mild inexactness conditions can be satisfied in finite-sum minimization via random sampling. We show that the algorithm performs well on training convolutional neural networks compared with previous second-order methods.
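To illustrate the setting, here is a minimal NumPy sketch of a subsampled cubic-regularization iteration on a toy finite-sum objective. This is not the paper's algorithm: the objective, the fixed regularization parameter `sigma`, the batch size, and the gradient-descent subproblem solver are all illustrative assumptions; the paper's methods additionally adapt the regularization parameter using (inexact) function values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5

# Toy finite sum f(x) = (1/n) sum_i f_i(x) with f_i(x) = 0.5 x^T A_i x + b_i^T x,
# where each A_i is symmetric positive semidefinite (convex toy stand-in).
Ms = rng.standard_normal((n, d, d))
As = np.einsum("nij,nkj->nik", Ms, Ms) / d + 0.1 * np.eye(d)
bs = rng.standard_normal((n, d))

def full_loss(x):
    return np.mean([0.5 * x @ A @ x + b @ x for A, b in zip(As, bs)])

def subsampled_grad_hess(x, batch):
    # Inexact gradient and Hessian from a random minibatch of component functions.
    g = np.mean([As[i] @ x + bs[i] for i in batch], axis=0)
    H = np.mean(As[batch], axis=0)
    return g, H

def solve_cubic_model(g, H, sigma, iters=200, lr=0.1):
    # Approximately minimize the cubic model
    #   m(s) = g^T s + 0.5 s^T H s + (sigma/3) ||s||^3
    # by plain gradient descent (a simple stand-in for a subproblem solver).
    s = np.zeros_like(g)
    for _ in range(iters):
        grad_m = g + H @ s + sigma * np.linalg.norm(s) * s
        s -= lr * grad_m
    return s

def stochastic_cubic_newton(x0, sigma=1.0, batch_size=10, steps=30):
    x = x0.copy()
    for _ in range(steps):
        batch = rng.choice(n, size=batch_size, replace=False)
        g, H = subsampled_grad_hess(x, batch)   # inexact first/second-order info
        x = x + solve_cubic_model(g, H, sigma)  # cubic-regularized Newton step
    return x

x0 = rng.standard_normal(d)
x_final = stochastic_cubic_newton(x0)
print(full_loss(x0), full_loss(x_final))
```

Despite using only minibatch estimates of the gradient and Hessian, each step decreases the full objective on this toy problem, which is the intuition behind the paper's result: sufficiently accurate sampled estimates preserve the convergence order of the exact methods.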


