Second-order optimization with lazy Hessians

12/01/2022
by Nikita Doikov, et al.

We analyze Newton's method with lazy Hessian updates for solving general, possibly non-convex, optimization problems. We propose to reuse a previously computed Hessian for several iterations while computing new gradients at each step of the method. This significantly reduces the overall arithmetic complexity of second-order optimization schemes. Using the cubic regularization technique, we establish fast global convergence of our method to a second-order stationary point, even though the Hessian need not be updated at every iteration. For convex problems, we justify global and local superlinear rates for lazy Newton steps with quadratic regularization, which is cheaper to compute than cubic regularization. The optimal frequency for updating the Hessian is once every d iterations, where d is the dimension of the problem. This provably improves the total arithmetic complexity of second-order algorithms by a factor of √d.
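The schedule in the abstract is concrete enough to sketch. Below is a minimal, hypothetical Python sketch (not the authors' implementation): it refactorizes the Hessian only once every m = d iterations and reuses the cached Cholesky factorization for the intervening quadratically regularized steps, so one O(d^3) factorization is amortized over d cheap O(d^2) solves. The function name lazy_newton, the regularization constant reg, and the stopping rule are illustrative assumptions; only the update schedule (fresh gradients every step, a new Hessian every d steps) comes from the abstract.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def lazy_newton(grad, hess, x0, m=None, reg=1e-4, tol=1e-8, max_iter=1000):
    """Newton-type iterations that reuse one Hessian factorization for m steps.

    Illustrative sketch: solves (H + reg*I) s = grad(x) with a cached
    Cholesky factor; `reg` and the stopping rule are assumed, not the
    paper's exact parameters.
    """
    x = np.asarray(x0, dtype=float).copy()
    d = x.size
    m = d if m is None else m           # abstract: update once every d iterations
    factor = None
    for k in range(max_iter):
        g = grad(x)                     # a fresh gradient at every step
        if np.linalg.norm(g) <= tol:
            break
        if k % m == 0:                  # lazy Hessian: refactorize only here, O(d^3)
            factor = cho_factor(hess(x) + reg * np.eye(d))
        x -= cho_solve(factor, g)       # reuse the factorization: O(d^2) per step
    return x

# Smoke test on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
A = A @ A.T + 50.0 * np.eye(50)         # positive definite Hessian
b = rng.standard_normal(50)
x = lazy_newton(lambda x: A @ x - b, lambda x: A, np.zeros(50))
print(np.linalg.norm(A @ x - b))        # ~tol: gradient norm at the solution
```

On a quadratic the Hessian is constant, so lazy reuse is exact and the test only checks the mechanics; for genuinely non-convex problems, the abstract's cubically regularized variant is the one with global guarantees, and the fixed positive-definite shift reg above is merely a stand-in.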


research · 09/05/2023
First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians
In this work, we develop first-order (Hessian-free) and zero-order (deri...

research · 12/14/2021
SC-Reg: Training Overparameterized Neural Networks under Self-Concordant Regularization
In this paper we propose the SC-Reg (self-concordant regularization) fra...

research · 05/26/2022
Faster Optimization on Sparse Graphs via Neural Reparametrization
In mathematical optimization, second-order Newton's methods generally co...

research · 12/08/2021
Learning Linear Models Using Distributed Iterative Hessian Sketching
This work considers the problem of learning the Markov parameters of a l...

research · 02/24/2021
Learning-Augmented Sketches for Hessians
Sketching is a dimensionality reduction technique where one compresses a...

research · 02/12/2021
Newton Method over Networks is Fast up to the Statistical Precision
We propose a distributed cubic regularization of the Newton method for s...

research · 04/06/2022
A Hessian inversion-free exact second order method for distributed consensus optimization
We consider a standard distributed consensus optimization problem where ...
