First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians

09/05/2023
by Nikita Doikov, et al.

In this work, we develop first-order (Hessian-free) and zeroth-order (derivative-free) implementations of the Cubically regularized Newton method for solving general non-convex optimization problems. To this end, we employ finite-difference approximations of the derivatives. Our algorithms use a special adaptive search procedure that simultaneously fits both the regularization constant and the parameters of the finite-difference approximations, which frees our schemes from the need to know the actual Lipschitz constants. Additionally, we equip our algorithms with a lazy Hessian update that reuses a previously computed Hessian approximation matrix for several iterations. Specifically, we prove a global complexity bound of 𝒪(n^{1/2} ϵ^{-3/2}) function and gradient evaluations for our new Hessian-free method, and a bound of 𝒪(n^{3/2} ϵ^{-3/2}) function evaluations for the derivative-free method, where n is the dimension of the problem and ϵ is the desired accuracy for the gradient norm. These complexity bounds significantly improve the previously known ones, in terms of the joint dependence on n and ϵ, for first-order and zeroth-order non-convex optimization.
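To make the mechanics concrete, here is a minimal Python sketch (not the authors' code) of the building blocks described above: a cubically regularized Newton iteration driven entirely by finite-difference derivatives, with the Hessian approximation refreshed lazily every m iterations. The fixed regularization constant M, the difference steps h, and the refresh period m are illustrative assumptions; the method in the paper instead adapts the regularization constant and the finite-difference parameters automatically via its adaptive search.

```python
# Illustrative sketch of a lazy-Hessian, finite-difference cubic Newton loop.
# All constants (M, h, m) are assumptions for demonstration only.
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient approximation (n extra f-evaluations)."""
    n, fx = x.size, f(x)
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def fd_hessian(grad, x, h=1e-5):
    """Finite-difference Hessian built from gradient calls (n extra gradients)."""
    n = x.size
    g0 = grad(x)
    H = np.empty((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        H[:, i] = (grad(x + e) - g0) / h
    return 0.5 * (H + H.T)  # symmetrize the approximation

def cubic_step(g, H, M):
    """Minimize the cubic model  g^T s + s^T H s / 2 + M ||s||^3 / 6.
    Uses the standard dual characterization: s(lam) = -(H + lam I)^{-1} g
    with lam = (M/2) * ||s(lam)||, located by bisection on lam."""
    w, Q = np.linalg.eigh(H)
    gq = Q.T @ g
    norm_s = lambda lam: np.linalg.norm(gq / (w + lam))
    lo = max(0.0, -w.min()) + 1e-12   # lam must make H + lam I positive definite
    hi = lo + 1.0
    while (M / 2) * norm_s(hi) > hi:  # grow hi until it brackets the root
        hi *= 2
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if (M / 2) * norm_s(mid) > mid:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return Q @ (-gq / (w + lam))

def lazy_cubic_newton(f, x0, M=10.0, m=5, iters=50):
    """Recompute the Hessian approximation only every m iterations ("lazy")."""
    x, H = x0.copy(), None
    grad = lambda z: fd_gradient(f, z)
    for k in range(iters):
        g = grad(x)
        if k % m == 0:        # lazy Hessian refresh
            H = fd_hessian(grad, x)
        x = x + cubic_step(g, H, M)
    return x

# Example: a small non-convex test function.
f = lambda x: np.sum(x**2) + 0.1 * np.sum(np.cos(3 * x))
x_star = lazy_cubic_newton(f, np.ones(5))
print(x_star, f(x_star))
```

The eigendecomposition-plus-bisection solver for the cubic subproblem is a textbook technique chosen here only for compactness; it does not handle the degenerate "hard case" carefully and is not the subproblem solver analyzed in the paper.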
