Regularization of Limited Memory Quasi-Newton Methods for Large-Scale Nonconvex Minimization

11/11/2019, by Daniel Steck, et al.

This paper deals with the unconstrained optimization of smooth objective functions. It presents a class of regularized quasi-Newton methods whose globalization mechanism proves more efficient than standard line-search or trust-region strategies. The focus is on the solution of large-scale problems using limited-memory quasi-Newton techniques. Global convergence of the regularization methods is shown under mild assumptions. The regularized limited-memory quasi-Newton updates are discussed in detail, including their compact representations. Numerical results on all large-scale test problems from the CUTEst collection indicate that the regularization method outperforms the standard line-search limited-memory BFGS method.
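The core idea can be illustrated with a small sketch: instead of globalizing a quasi-Newton method by a line search, each step solves the regularized system (B_k + σ_k I) p = −g_k, and σ_k is adapted from the ratio of actual to predicted reduction, much like a trust-region radius. The sketch below uses a full (dense) BFGS matrix on the Rosenbrock function purely for illustration; the paper's method instead uses limited-memory updates with compact representations, and all tolerances and update factors here are illustrative choices, not the paper's.

```python
import numpy as np

def rosenbrock(x):
    """Value and gradient of the 2-D Rosenbrock function."""
    f = 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    g = np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])
    return f, g

def regularized_bfgs(f_grad, x0, sigma0=1.0, tol=1e-6, max_iter=1000):
    # Sketch of a regularized quasi-Newton iteration (dense BFGS for
    # illustration only; the paper uses limited-memory compact forms).
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.eye(n)          # BFGS approximation of the Hessian
    sigma = sigma0         # regularization parameter
    f, g = f_grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Regularized step: solve (B + sigma I) p = -g.
        p = np.linalg.solve(B + sigma * np.eye(n), -g)
        f_new, g_new = f_grad(x + p)
        # Ratio of actual to predicted reduction, as in trust-region methods.
        pred = -(g @ p + 0.5 * p @ (B @ p))
        rho = (f - f_new) / pred if pred > 0 else -1.0
        if rho > 1e-4:     # accept the step
            s, y = p, g_new - g
            if s @ y > 1e-10:   # update B only if curvature is positive
                Bs = B @ s
                B = (B - np.outer(Bs, Bs) / (s @ Bs)
                       + np.outer(y, y) / (s @ y))
            x, f, g = x + p, f_new, g_new
            if rho > 0.75:      # very good agreement: relax regularization
                sigma = max(1e-10, 0.5 * sigma)
        else:                   # reject the step: increase regularization
            sigma *= 4.0
    return x, f

x, f = regularized_bfgs(rosenbrock, [-1.2, 1.0])
```

Note the trade-off this design makes: rejected steps cost only an extra function/gradient evaluation and a larger σ, while a line search may need several evaluations per iteration; this is the efficiency argument the paper develops for the limited-memory setting.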


Related research

- 07/29/2022, "Compact representations of structured BFGS matrices": For general large-scale optimization problems compact representations ex...
- 04/19/2022, "A Novel Fast Exact Subproblem Solver for Stochastic Quasi-Newton Cubic Regularized Optimization": In this work we describe an Adaptive Regularization using Cubics (ARC) m...
- 04/24/2008, "A Quasi-Newton Approach to Nonsmooth Convex Optimization Problems in Machine Learning": We extend the well-known BFGS quasi-Newton method and its memory-limited...
- 12/10/2019, "A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization with Applications": This paper proposes a novel stochastic version of damped and regularized...
- 11/25/2022, "Nonlinear Schwarz preconditioning for Quasi-Newton methods": We propose the nonlinear restricted additive Schwarz (RAS) preconditioni...
- 01/29/2019, "The projected Newton-Kleinman method for the algebraic Riccati equation": The numerical solution of the algebraic Riccati equation is a challengin...
- 07/13/2021, "A New Multipoint Symmetric Secant Method with a Dense Initial Matrix": In large-scale optimization, when either forming or storing Hessian matr...
