Sub-sampled Cubic Regularization for Non-convex Optimization

05/16/2017
by Jonas Moritz Kohler, et al.

We consider the minimization of non-convex functions that typically arise in machine learning. Specifically, we focus on a variant of trust region methods known as cubic regularization. This approach is particularly attractive because it escapes strict saddle points and provides stronger convergence guarantees than first- and second-order methods as well as classical trust region methods. However, its high computational complexity makes it impractical for large-scale learning. Here, we propose a novel method that uses sub-sampling to lower this computational cost. Using concentration inequalities, we derive a sampling scheme that yields sufficiently accurate gradient and Hessian approximations to retain the strong global and local convergence guarantees of cubically regularized methods. To the best of our knowledge, this is the first work that gives global convergence guarantees for a sub-sampled variant of cubic regularization on non-convex functions. Furthermore, we provide experimental results supporting our theory.
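The core idea described in the abstract can be illustrated with a minimal sketch: estimate the gradient and Hessian from independent sub-samples of a finite sum, then take the step that (approximately) minimizes the cubic-regularized model m(s) = g·s + ½ sᵀHs + (σ/3)‖s‖³. The problem instance, sample sizes, the fixed regularization weight σ, and the gradient-descent subproblem solver below are all illustrative choices, not the paper's actual algorithm (which adapts σ and uses a principled sampling scheme).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-convex finite-sum problem (illustrative): robust regression
# with f(w) = (1/n) * sum_i log(1 + (a_i . w - b_i)^2).
n, d = 500, 2
A = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0])
b = A @ w_true + 0.1 * rng.normal(size=n)

def sub_grad(w, idx):
    """Gradient of the loss averaged over the sampled indices idx."""
    r = A[idx] @ w - b[idx]
    coef = 2.0 * r / (1.0 + r**2)
    return (A[idx].T @ coef) / len(idx)

def sub_hess(w, idx):
    """Hessian averaged over idx; may be indefinite away from the optimum."""
    r = A[idx] @ w - b[idx]
    coef = 2.0 * (1.0 - r**2) / (1.0 + r**2) ** 2
    return (A[idx].T * coef) @ A[idx] / len(idx)

def cubic_step(g, H, sigma, iters=200, lr=0.1):
    """Approximately minimize the cubic model
    m(s) = g.s + 0.5 s'Hs + (sigma/3)||s||^3
    by plain gradient descent on m (a crude stand-in for the
    subproblem solvers analyzed in the literature)."""
    s = np.zeros_like(g)
    for _ in range(iters):
        s -= lr * (g + H @ s + sigma * np.linalg.norm(s) * s)
    return s

w = np.zeros(d)
sigma = 1.0  # fixed regularization weight (illustrative; ARC adapts it)
for _ in range(30):
    idx_g = rng.choice(n, size=128, replace=False)  # gradient sample
    idx_h = rng.choice(n, size=64, replace=False)   # Hessian sample
    w = w + cubic_step(sub_grad(w, idx_g), sub_hess(w, idx_h), sigma)

print(np.linalg.norm(sub_grad(w, np.arange(n))))  # full gradient norm
```

Note that the Hessian sample is smaller than the gradient sample: the analysis in this line of work typically tolerates a coarser Hessian approximation than gradient approximation while retaining convergence guarantees, which is where the computational savings come from.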


research · 08/23/2017
Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information
We consider variants of trust-region and cubic regularization methods fo...

research · 06/27/2019
Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization
We focus on minimizing nonconvex finite-sum functions that typically ari...

research · 09/26/2018
Stochastic Second-order Methods for Non-convex Optimization with Inexact Hessian and Gradient
Trust region and cubic regularization methods have demonstrated good per...

research · 10/12/2022
A Momentum Accelerated Adaptive Cubic Regularization Method for Nonconvex Optimization
The cubic regularization method (CR) and its adaptive version (ARC) are ...

research · 02/22/2023
Faster Riemannian Newton-type Optimization by Subsampling and Cubic Regularization
This work is on constrained large-scale non-convex optimization where th...

research · 11/23/2019
A Stochastic Tensor Method for Non-convex Optimization
We present a stochastic optimization method that uses a fourth-order reg...

research · 08/10/2022
Moreau–Yosida regularization in DFT
Moreau-Yosida regularization is introduced into the framework of exact D...
