Non-asymptotic Superlinear Convergence of Standard Quasi-Newton Methods

03/30/2020
by Qiujiang Jin, et al.

In this paper, we study the non-asymptotic superlinear convergence rate of DFP and BFGS, two well-known quasi-Newton methods. The asymptotic superlinear convergence of these methods has been extensively studied, but an explicit finite-time local convergence rate had not been established. We provide a finite-time (non-asymptotic) convergence analysis for the BFGS and DFP methods under the assumptions that the objective function is strongly convex, its gradient is Lipschitz continuous, and its Hessian is Lipschitz continuous only in the direction of the optimal solution. We show that, in a local neighborhood of the optimal solution, the iterates generated by both DFP and BFGS converge to the optimal solution at a superlinear rate of O((1/k)^{k/2}), where k is the number of iterations. In particular, for a specific choice of the local neighborhood, both DFP and BFGS converge to the optimal solution at the rate (0.85/k)^{k/2}. Our theoretical guarantee is among the first results to provide a non-asymptotic superlinear convergence rate for the DFP and BFGS quasi-Newton methods.
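For reference, below is a minimal NumPy sketch of the standard BFGS and DFP iterations with unit step size, the setting studied in local analyses such as this paper's. It is illustrative only: the quadratic test problem, identity initialization of the inverse-Hessian approximation, tolerance, and iteration cap are assumptions made here, not details taken from the paper.

```python
# Minimal sketch of standard BFGS and DFP with unit step size.
# Assumed test problem and parameters; not the paper's exact setup.
import numpy as np

def quasi_newton(grad, x0, H0, method="bfgs", tol=1e-10, max_iter=50):
    """Run BFGS or DFP using an inverse-Hessian approximation H.

    grad : callable returning the gradient at a point
    x0   : starting point (assumed to lie in a local neighborhood of x*)
    H0   : initial inverse-Hessian approximation (e.g., the identity)
    """
    x, H = x0.copy(), H0.copy()
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x_new = x - H @ g          # unit step size, as in the local regime
        s = x_new - x              # iterate displacement
        y = grad(x_new) - g        # gradient displacement
        sy = s @ y                 # curvature pair product; positive for strongly convex f
        if method == "bfgs":
            # BFGS inverse-Hessian update:
            # H+ = (I - s y^T / s^T y) H (I - y s^T / s^T y) + s s^T / s^T y
            V = np.eye(len(x)) - np.outer(s, y) / sy
            H = V @ H @ V.T + np.outer(s, s) / sy
        else:
            # DFP inverse-Hessian update:
            # H+ = H - H y y^T H / (y^T H y) + s s^T / s^T y
            Hy = H @ y
            H = H - np.outer(Hy, Hy) / (y @ Hy) + np.outer(s, s) / sy
        x = x_new
    return x, k

# Example on a strongly convex quadratic f(x) = 0.5 x^T A x (assumed test problem)
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_star, iters = quasi_newton(grad, np.array([1.0, 1.0]), np.eye(2))
```

The unit step size reflects the local regime the paper analyzes, where no line search is needed near the optimal solution; the identity initialization of H is a simplification for illustration.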

Related research

Quasi-Newton Methods for Saddle Point Problems (11/04/2021)
This paper studies quasi-Newton methods for solving strongly-convex-stro...

Exploiting Local Convergence of Quasi-Newton Methods Globally: Adaptive Sample Size Approach (06/10/2021)
In this paper, we study the application of quasi-Newton methods for solv...

On Convergence of Distributed Approximate Newton Methods: Globalization, Sharper Bounds and Beyond (08/06/2019)
The DANE algorithm is an approximate Newton method popularly used for co...

Limited-Memory Greedy Quasi-Newton Method with Non-asymptotic Superlinear Convergence Rate (06/27/2023)
Non-asymptotic convergence analysis of quasi-Newton methods has gained a...

Sharpened Lazy Incremental Quasi-Newton Method (05/26/2023)
We consider the finite sum minimization of n strongly convex and smooth ...

Hessian Averaging in Stochastic Newton Methods Achieves Superlinear Convergence (04/20/2022)
We consider minimizing a smooth and strongly convex objective function u...

Newton Method with Variable Selection by the Proximal Gradient Method (11/30/2022)
In sparse estimation, in which the sum of the loss function and the regu...
