A New Multipoint Symmetric Secant Method with a Dense Initial Matrix

07/13/2021
by Jennifer B. Erway, et al.

In large-scale optimization, when either forming or storing Hessian matrices is prohibitively expensive, quasi-Newton methods are often used in lieu of Newton's method because they require only first-order information to approximate the true Hessian. Multipoint symmetric secant (MSS) methods can be thought of as generalizations of quasi-Newton methods in that they impose additional requirements on their approximation of the Hessian. Given an initial Hessian approximation, MSS methods generate a sequence of matrices using rank-2 updates. For practical reasons, up to now, the initialization has been a constant multiple of the identity matrix. In this paper, we propose a new limited-memory MSS method that allows for dense initializations. Numerical results on the CUTEst test problems suggest that the MSS method using a dense initialization outperforms the standard initialization. Numerical results also suggest that this approach is competitive with a basic L-SR1 trust-region method.
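To make the rank-2 update and the role of the initialization concrete, here is a minimal sketch using the classical Powell-symmetric-Broyden (PSB) update, a symmetric rank-2 secant update, as a stand-in; the paper's limited-memory MSS update is a multipoint generalization and is not reproduced here. The function name `psb_update` and the particular dense initial matrix are illustrative choices, not the authors' method.

```python
import numpy as np

def psb_update(B, s, y):
    """One symmetric rank-2 (PSB-type) secant update.

    Enforces the secant condition B_new @ s == y while keeping
    B_new symmetric. Illustrative only; the paper's MSS update
    imposes multiple secant conditions at once.
    """
    r = y - B @ s                       # residual of the secant condition
    ss = s @ s
    return (B + (np.outer(r, s) + np.outer(s, r)) / ss
              - (r @ s) * np.outer(s, s) / ss**2)

n = 4
rng = np.random.default_rng(0)
s = rng.standard_normal(n)              # step:     x_{k+1} - x_k
y = rng.standard_normal(n)              # gradient difference

# Standard initialization: a constant multiple of the identity.
B_scalar = 1.0 * np.eye(n)

# A "dense" initialization: a symmetric matrix with off-diagonal
# structure, e.g. gamma*I plus a rank-1 correction (hypothetical choice).
u = rng.standard_normal(n)
B_dense = 1.0 * np.eye(n) + 0.5 * np.outer(u, u)

for B0 in (B_scalar, B_dense):
    B1 = psb_update(B0, s, y)
    assert np.allclose(B1 @ s, y)       # secant condition holds either way
    assert np.allclose(B1, B1.T)        # symmetry is preserved
```

The sketch shows the key point of the paper's setting: the update formula is agnostic to the initial matrix, so replacing the scalar initialization gamma*I with a dense symmetric matrix changes the entire sequence of approximations while preserving symmetry and the secant condition.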


