The Average Rate of Convergence of the Exact Line Search Gradient Descent Method

05/16/2023
by Thomas Yu, et al.

It is well known that when the exact line search gradient descent method is applied to a convex quadratic objective, the worst-case rate of convergence (ROC), taken over all seed vectors, deteriorates as the condition number of the Hessian of the objective grows. Based on an elegant analysis by H. Akaike, it is generally believed – but not proved – that in the ill-conditioned regime the ROC for almost all initial vectors, and hence also the average ROC, is close to the worst-case ROC. We complete Akaike's analysis using the theorem of center and stable manifolds. Our analysis also makes apparent the effect of an intermediate eigenvalue in the Hessian by establishing the following somewhat amusing result: in the absence of an intermediate eigenvalue, the average ROC becomes arbitrarily fast – not slow – as the Hessian becomes increasingly ill-conditioned. We discuss in passing some contemporary applications of exact line search GD to polynomial optimization problems arising from imaging and data sciences.
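To make the setting concrete, here is a minimal sketch (not from the paper) of exact line search gradient descent on a convex quadratic f(x) = ½ xᵀAx. For quadratics the exact minimizing step size has the closed form α = (gᵀg)/(gᵀAg) with g = Ax, so no iterative line search is needed; the example Hessian and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

def exact_line_search_gd(A, x0, iters=1000):
    """Exact line search GD on f(x) = 0.5 * x^T A x, A symmetric PD."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        g = A @ x                  # gradient of 0.5 * x^T A x
        denom = g @ (A @ g)
        if denom == 0.0:           # gradient is zero: at the minimizer
            break
        alpha = (g @ g) / denom    # exact minimizer of f(x - alpha * g)
        x = x - alpha * g
    return x

# Ill-conditioned 2x2 Hessian: eigenvalues 1 and 100 (condition number 100).
A = np.diag([1.0, 100.0])
x0 = np.array([1.0, 1.0])
x = exact_line_search_gd(A, x0)
```

For such a seed with components along both extreme eigenvectors, the iterates settle into the alternating pattern analyzed by Akaike, contracting f at roughly the worst-case per-step factor ((κ-1)/(κ+1))², where κ is the condition number.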


