
Compact representations of structured BFGS matrices

07/29/2022
by Johannes J. Brust, et al.

For general large-scale optimization problems, compact representations exist in which recursive quasi-Newton update formulas are expressed as compact matrix factorizations. For problems whose objective function contains additional structure, so-called structured quasi-Newton methods exploit available second-derivative information and approximate the unavailable second derivatives. This article develops the compact representations of two structured Broyden-Fletcher-Goldfarb-Shanno (BFGS) update formulas. The compact representations enable efficient limited-memory and initialization strategies. Two limited-memory line search algorithms are described and tested on a collection of problems, including a real-world large-scale imaging application.
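To make the idea of a compact representation concrete, the following sketch verifies the classical (unstructured) compact BFGS formula of Byrd, Nocedal, and Schnabel numerically: after m recursive BFGS updates of an initial matrix B0, the result equals B0 - Psi M^{-1} Psi^T with Psi = [B0 S, Y], where S and Y collect the step and gradient-difference pairs, L is the strictly lower triangular part of S^T Y, and D its diagonal. This illustrates the unstructured case only; the structured variants developed in the article are not reproduced here, and all names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 4  # problem dimension, number of quasi-Newton updates

# Random SPD "true Hessian" so that the curvature condition s^T y > 0 holds
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)

S = rng.standard_normal((n, m))   # steps s_0, ..., s_{m-1} as columns
Y = A @ S                         # gradient differences y_i = A s_i

# Recursive BFGS updates starting from B0 = I
B = np.eye(n)
for i in range(m):
    s, y = S[:, i], Y[:, i]
    Bs = B @ s
    B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Compact representation:
# B_m = B0 - [B0 S, Y] [[S^T B0 S, L], [L^T, -D]]^{-1} [B0 S, Y]^T
B0 = np.eye(n)
StY = S.T @ Y
L = np.tril(StY, k=-1)            # strictly lower triangular part of S^T Y
D = np.diag(np.diag(StY))         # diagonal part of S^T Y
Psi = np.hstack([B0 @ S, Y])
M = np.block([[S.T @ B0 @ S, L], [L.T, -D]])
B_compact = B0 - Psi @ np.linalg.solve(M, Psi.T)

print(np.allclose(B, B_compact))  # the two constructions agree: True
```

The payoff of the compact form is that B_m is stored via the n-by-2m factor Psi and the small 2m-by-2m middle matrix M, which is what makes limited-memory variants and cheap reinitialization of B0 practical.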


Related research

Regularization of Limited Memory Quasi-Newton Methods for Large-Scale Nonconvex Minimization (11/11/2019)
This paper deals with the unconstrained optimization of smooth objective...

A fast quasi-Newton-type method for large-scale stochastic optimisation (09/29/2018)
During recent years there has been an increased interest in stochastic a...

Compact Optimization Algorithms with Re-sampled Inheritance (09/12/2018)
Compact optimization algorithms are a class of Estimation of Distributio...

HPC compact quasi-Newton algorithm for interface problems (05/22/2020)
In this work we present a robust interface coupling algorithm called Com...

On the efficiency of Stochastic Quasi-Newton Methods for Deep Learning (05/18/2022)
While first-order methods are popular for solving optimization problems ...

Compact Quasi-Newton preconditioners for SPD linear systems (01/04/2020)
In this paper preconditioners for the Conjugate Gradient method are stud...

On the construction of probabilistic Newton-type algorithms (04/05/2017)
It has recently been shown that many of the existing quasi-Newton algori...