Adaptive Quasi-Newton and Anderson Acceleration Framework with Explicit Global (Accelerated) Convergence Rates

05/30/2023
by Damien Scieur, et al.

Despite the impressive numerical performance of quasi-Newton and Anderson/nonlinear acceleration methods, their global convergence rates have remained elusive for over 50 years. This paper addresses this long-standing question by introducing a framework that derives novel, adaptive quasi-Newton and nonlinear/Anderson acceleration schemes. Under mild assumptions, the proposed iterative methods exhibit explicit, non-asymptotic convergence rates that blend those of gradient descent and Cubic Regularized Newton's method. Notably, these rates are achieved adaptively, as the method autonomously determines the optimal step size using a simple backtracking strategy. The proposed approach also includes an accelerated version that improves the convergence rate on convex functions. Numerical experiments demonstrate the efficiency of the proposed framework, even compared to a fine-tuned BFGS algorithm with line search.
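
The central mechanism highlighted in the abstract, an adaptive step size chosen by a simple backtracking rule inside a quasi-Newton iteration, can be illustrated with a generic sketch. The code below is not the paper's framework: it is a minimal BFGS-style loop with an Armijo backtracking search, and the function names, constants, and test problem are illustrative assumptions only.

```python
# Minimal sketch (assumption, not the paper's method): a BFGS-style
# quasi-Newton iteration where the step size is found by Armijo backtracking.
import numpy as np

def backtracking_quasi_newton(f, grad, x0, max_iter=100, tol=1e-8,
                              beta=0.5, c=1e-4):
    """Minimize f with BFGS-type inverse-Hessian updates and backtracking steps."""
    x = x0.astype(float)
    n = x.size
    H = np.eye(n)                        # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                       # quasi-Newton search direction
        # Backtracking: shrink the step until the Armijo condition holds.
        t = 1.0
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= beta
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                   # standard curvature safeguard
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
    return x

# Usage on a toy quadratic f(x) = 0.5 * ||A x - b||^2 (illustrative data).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
print(backtracking_quasi_newton(f, grad, np.zeros(2)))
```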


Related research

- 07/11/2016, Proximal Quasi-Newton Methods for Regularized Convex Optimization with Linear and Accelerated Sublinear Convergence Rates: "In [19], a general, inexact, efficient proximal quasi-Newton algorithm f..."
- 10/21/2019, Implementation of a modified Nesterov's Accelerated quasi-Newton Method on Tensorflow: "Recent studies incorporate Nesterov's accelerated gradient method for th..."
- 11/25/2022, Nonlinear Schwarz preconditioning for Quasi-Newton methods: "We propose the nonlinear restricted additive Schwarz (RAS) preconditioni..."
- 08/12/2015, Convergence rates of sub-sampled Newton methods: "We consider the problem of minimizing a sum of n functions over a convex..."
- 11/26/2013, Practical Inexact Proximal Quasi-Newton Method with Global Complexity Analysis: "Recently several methods were proposed for sparse optimization which mak..."
- 10/12/2022, A Momentum Accelerated Adaptive Cubic Regularization Method for Nonconvex Optimization: "The cubic regularization method (CR) and its adaptive version (ARC) are ..."
- 10/22/2022, An Efficient Nonlinear Acceleration method that Exploits Symmetry of the Hessian: "Nonlinear acceleration methods are powerful techniques to speed up fixed..."
