Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information

08/23/2017
by   Peng Xu, et al.

We consider variants of trust-region and cubic regularization methods for non-convex optimization in which the Hessian matrix is approximated. Under mild conditions on the inexact Hessian, and using approximate solutions of the corresponding sub-problems, we establish iteration complexity bounds for achieving ϵ-approximate second-order optimality, which have been shown to be tight. Our Hessian approximation conditions constitute a major relaxation over the existing ones in the literature. Consequently, we show that such mild conditions allow the approximate Hessian to be constructed through various random sampling methods. In this light, we consider the canonical problem of finite-sum minimization, provide appropriate uniform and non-uniform sub-sampling strategies for constructing such Hessian approximations, and obtain optimal iteration complexity for the corresponding sub-sampled trust-region and cubic regularization methods.
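To make the setting concrete, the following is a minimal sketch of one of the ingredients the abstract describes: cubic regularization for finite-sum minimization with a uniformly sub-sampled Hessian. The problem instance (a synthetic least-squares finite sum), the sample size, the regularization parameter σ, and the Cauchy-point approximate solve of the cubic sub-problem are all illustrative choices made here, not the paper's exact algorithm or guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic finite-sum problem: f(x) = (1/n) * sum_i (a_i^T x - b_i)^2.
# Each component Hessian is 2 * a_i a_i^T, so sub-sampling rows of A
# gives an inexact Hessian estimate (illustrative choice, not the paper's setup).
n, d = 1000, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def full_gradient(x):
    """Exact gradient of f at x."""
    return (2.0 / n) * A.T @ (A @ x - b)

def subsampled_hessian(x, sample_size):
    """Uniformly sub-sampled Hessian estimate (2/|S|) * A_S^T A_S."""
    idx = rng.choice(n, size=sample_size, replace=False)
    A_S = A[idx]
    return (2.0 / sample_size) * A_S.T @ A_S

def cubic_cauchy_step(g, H, sigma):
    """Approximately minimize the cubic model
        m(s) = g^T s + 0.5 s^T H s + (sigma/3) ||s||^3
    along the steepest-descent direction -g (the Cauchy point)."""
    gnorm = np.linalg.norm(g)
    if gnorm < 1e-12:
        return np.zeros_like(g)
    beta = g @ H @ g
    # Along s = -alpha * g, dm/dalpha = 0 gives the quadratic
    #   sigma * ||g||^3 * alpha^2 + beta * alpha - ||g||^2 = 0;
    # take its positive root.
    a, bq, c = sigma * gnorm**3, beta, -gnorm**2
    alpha = (-bq + np.sqrt(bq**2 - 4 * a * c)) / (2 * a)
    return -alpha * g

x = np.zeros(d)
sigma = 1.0  # fixed cubic regularization parameter (kept constant for simplicity)
for _ in range(200):
    g = full_gradient(x)
    H = subsampled_hessian(x, sample_size=100)  # inexact Hessian from 10% of the data
    x = x + cubic_cauchy_step(g, H, sigma)

print(np.linalg.norm(full_gradient(x)))  # gradient norm after 200 iterations
```

Even with only 10% of the components used for each Hessian estimate, the iteration drives the gradient norm toward zero on this convex instance; the paper's contribution is quantifying how mild the accuracy requirement on such sampled Hessians can be while retaining tight complexity bounds in the non-convex case.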


Related research

- Stochastic Second-order Methods for Non-convex Optimization with Inexact Hessian and Gradient (09/26/2018). Trust region and cubic regularization methods have demonstrated good per...
- Sub-sampled Cubic Regularization for Non-convex Optimization (05/16/2017). We consider the minimization of non-convex functions that typically aris...
- A Stochastic Trust Region Method for Non-convex Minimization (03/04/2019). We target the problem of finding a local minimum in non-convex finite-su...
- Approximate Secular Equations for the Cubic Regularization Subproblem (09/27/2022). The cubic regularization method (CR) is a popular algorithm for unconstr...
- First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians (09/05/2023). In this work, we develop first-order (Hessian-free) and zero-order (deri...
- Complexity Guarantees for Nonconvex Newton-MR Under Inexact Hessian Information (08/19/2023). We consider extensions of the Newton-MR algorithm for nonconvex optimiza...
- A Random-Feature Based Newton Method for Empirical Risk Minimization in Reproducing Kernel Hilbert Space (02/12/2020). In supervised learning using kernel methods, we encounter a large-scale ...
