Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization

06/27/2019
by   Seonho Park, et al.

We focus on minimizing nonconvex finite-sum functions that typically arise in machine learning problems. For this class of problems, the adaptive cubic regularized Newton method offers strong global convergence guarantees and the ability to escape strict saddle points. It uses a trust-region-like scheme to determine whether an iteration is successful, and updates the iterate only when it is. In this paper, we propose an algorithm that combines negative curvature directions with the adaptive cubic regularized Newton method so that progress can be made even at unsuccessful iterations. We call this new method Stochastic Adaptive cubic regularization with Negative Curvature (SANC). Unlike the previous method, SANC forms its stochastic gradient and Hessian estimators from independent sets of data points of consistent size across all iterations, which makes it more practical for solving large-scale machine learning problems. To the best of our knowledge, this is the first approach that combines the negative curvature method with the adaptive cubic regularized Newton method. Finally, we provide experimental results, including neural network problems, that support the efficiency of our method.
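The scheme the abstract describes can be sketched in a toy form. The following is an illustrative implementation, not the authors' code: the function names (`sanc_like_minimize`, `solve_cubic_subproblem`), the crude gradient-descent subproblem solver, the step-size and regularization constants, and the exact eigendecomposition used to find negative curvature are all assumptions made for clarity. The paper's actual SANC algorithm uses stochastic mini-batch gradient/Hessian estimators and more careful subproblem machinery.

```python
import numpy as np

def cubic_model(s, g, H, sigma):
    """Cubic-regularized local model: m(s) = g^T s + 0.5 s^T H s + (sigma/3)||s||^3."""
    return g @ s + 0.5 * s @ H @ s + (sigma / 3.0) * np.linalg.norm(s) ** 3

def solve_cubic_subproblem(g, H, sigma, iters=200, lr=0.05):
    """Crude gradient descent on the cubic model (illustration only; not the
    Krylov/Lanczos-type solvers typically used in practice)."""
    s = np.zeros_like(g)
    for _ in range(iters):
        s = s - lr * (g + H @ s + sigma * np.linalg.norm(s) * s)
    return s

def negative_curvature_step(g, H, alpha=0.5):
    """Return a step along the most negative eigenvector of H, or None."""
    lam, V = np.linalg.eigh(H)  # eigenvalues in ascending order
    if lam[0] >= 0.0:
        return None             # no negative curvature to exploit
    v = V[:, 0]
    if g @ v > 0.0:             # orient v as a non-ascent direction
        v = -v
    return alpha * abs(lam[0]) * v

def sanc_like_minimize(f, grad_fn, hess_fn, x0, sigma=1.0, eta=0.1, steps=50):
    """SANC-flavored loop: take the cubic step at successful iterations and a
    negative-curvature step at unsuccessful ones, so every iteration can move."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g, H = grad_fn(x), hess_fn(x)   # stochastic estimators in the paper
        s = solve_cubic_subproblem(g, H, sigma)
        m = cubic_model(s, g, H, sigma)
        rho = (f(x) - f(x + s)) / max(-m, 1e-12)  # trust-region-like ratio
        if rho >= eta:                  # successful: accept the cubic step
            x = x + s
            sigma = max(sigma / 2.0, 1e-3)
        else:                           # unsuccessful: fall back on curvature
            nc = negative_curvature_step(g, H)
            if nc is not None:
                x = x + nc
            sigma *= 2.0
    return x

# Toy strict-saddle problem: f(x, y) = x^4 - x^2 + y^2 has a saddle at the
# origin and minima at (+-1/sqrt(2), 0) with value -1/4.
f = lambda z: z[0] ** 4 - z[0] ** 2 + z[1] ** 2
grad = lambda z: np.array([4 * z[0] ** 3 - 2 * z[0], 2 * z[1]])
hess = lambda z: np.array([[12 * z[0] ** 2 - 2.0, 0.0], [0.0, 2.0]])

x_star = sanc_like_minimize(f, grad, hess, np.array([0.0, 0.5]))
```

Starting at (0, 0.5), the gradient's x-component is zero, so the cubic steps alone drive the iterate toward the saddle at the origin; once progress stalls and the ratio test fails, the negative-curvature direction kicks in and the iterates escape to one of the two minimizers, illustrating why updating at unsuccessful iterations helps.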


Related research:

- 05/16/2017, Sub-sampled Cubic Regularization for Non-convex Optimization: We consider the minimization of non-convex functions that typically aris...
- 06/25/2023, Regularized methods via cubic subspace minimization for nonconvex optimization: The main computational cost per iteration of adaptive cubic regularizati...
- 12/26/2018, Stochastic Trust Region Inexact Newton Method for Large-scale Machine Learning: Nowadays stochastic approximation methods are one of the major research ...
- 04/19/2022, A Novel Fast Exact Subproblem Solver for Stochastic Quasi-Newton Cubic Regularized Optimization: In this work we describe an Adaptive Regularization using Cubics (ARC) m...
- 10/12/2021, Regularized Step Directions in Conjugate Gradient Minimization for Machine Learning: Conjugate gradient minimization methods (CGM) and their accelerated vari...
- 08/22/2018, A Note on Inexact Condition for Cubic Regularized Newton's Method: This note considers the inexact cubic-regularized Newton's method (CR), ...
- 12/10/2019, A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization with Applications: This paper proposes a novel stochastic version of damped and regularized...
