Limited memory gradient methods for unconstrained optimization

08/29/2023
by Giulia Ferrandi, et al.

The limited memory steepest descent (LMSD) method (Fletcher, 2012) for unconstrained optimization problems stores a few past gradients to compute multiple stepsizes at once. We review this method and propose new variants. For strictly convex quadratic objective functions, we study the numerical behavior of different techniques to compute new stepsizes. In particular, we introduce a method to improve the use of harmonic Ritz values. We also show the existence of a secant condition associated with LMSD, where the approximating Hessian is projected onto a low-dimensional space. In the general nonlinear case, we propose two new alternatives to Fletcher's method: first, the addition of symmetry constraints to the secant condition valid for the quadratic case; second, a perturbation of the last differences between consecutive gradients, so that multiple secant equations are satisfied simultaneously. We show that Fletcher's method can also be interpreted from this viewpoint.
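To give a concrete picture of the sweep structure described above, the following is a minimal Python sketch of a limited-memory steepest descent loop for a strictly convex quadratic f(x) = 0.5 x'Ax - b'x. For clarity it forms the small projected matrix Q'AQ explicitly from a QR factorization of the stored gradients and uses the reciprocals of its eigenvalues (Ritz values of A) as the next stepsizes; Fletcher (2012) obtains an equivalent small matrix from the stored gradients alone, without access to A, and includes safeguards not shown here. The function name lmsd_quadratic and the parameters m, sweeps, and tol are illustrative choices, not the paper's implementation.

```python
import numpy as np
from collections import deque


def lmsd_quadratic(A, b, x0, m=5, sweeps=50, tol=1e-8):
    """Simplified LMSD-style sweep for f(x) = 0.5 x'Ax - b'x, A symmetric positive definite.

    After each sweep of gradient steps, the (at most m) stored gradients yield
    Ritz values of A; their reciprocals become the stepsizes of the next sweep.
    """
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b
    # Seed the history with a single exact line-search (Cauchy) step.
    steps = [float(g @ g) / float(g @ (A @ g))]
    history = deque(maxlen=m)                      # last m gradients across sweeps

    for _ in range(sweeps):
        for alpha in steps:
            g = A @ x - b
            if np.linalg.norm(g) < tol:
                return x
            history.append(g)
            x = x - alpha * g

        G = np.column_stack(list(history))         # n-by-k matrix of stored gradients
        Q, _ = np.linalg.qr(G)                     # orthonormal basis of span(G)
        T = Q.T @ A @ Q                            # small projected matrix (uses A for clarity)
        theta = np.linalg.eigvalsh(0.5 * (T + T.T))
        theta = theta[theta > 1e-12 * theta.max()] # drop near-zero Ritz values (rank deficiency)
        steps = list(np.sort(1.0 / theta))         # ordering within a sweep is a design choice
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)                    # well-conditioned SPD test matrix
    b = rng.standard_normal(n)
    x = lmsd_quadratic(A, b, np.zeros(n), m=5)
    print("residual norm:", np.linalg.norm(A @ x - b))
```

In this sketch the sweep length grows with the gradient history until it reaches m; the harmonic Ritz variant and the nonlinear-case modifications discussed in the paper are not reflected here.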
