
Compact representations of structured BFGS matrices

by Johannes J. Brust et al.

For general large-scale optimization problems, compact representations exist in which recursive quasi-Newton update formulas are expressed as compact matrix factorizations. For problems whose objective function contains additional structure, so-called structured quasi-Newton methods exploit available second-derivative information and approximate the unavailable second derivatives. This article develops compact representations of two structured Broyden-Fletcher-Goldfarb-Shanno (BFGS) update formulas. The compact representations enable efficient limited-memory and initialization strategies. Two limited-memory line-search algorithms are described and tested on a collection of problems, including a real-world, large-scale imaging application.
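To make the idea of a compact representation concrete, the sketch below illustrates the classical (unstructured) compact form of the BFGS matrix due to Byrd, Nocedal, and Schnabel, which the abstract's structured variants generalize: after k updates with pairs s_i, y_i, the recursively built matrix B_k equals B_0 minus a low-rank term assembled from the matrices S = [s_0, ..., s_{k-1}] and Y = [y_0, ..., y_{k-1}]. The structured update formulas developed in the article itself are not reproduced here; this is only a minimal numerical check of the standard compact form, with an assumed SPD model Hessian used to generate curvature pairs.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 4

# SPD model Hessian so that y_i = H s_i satisfies the curvature
# condition y_i^T s_i > 0 (an assumption for this illustration).
A = rng.standard_normal((n, n))
H = A @ A.T + n * np.eye(n)

S = rng.standard_normal((n, k))   # columns s_0, ..., s_{k-1}
Y = H @ S                         # columns y_0, ..., y_{k-1}
B0 = np.eye(n)                    # initial approximation

# Recursive BFGS updates:
# B <- B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)
B = B0.copy()
for i in range(k):
    s, y = S[:, i], Y[:, i]
    Bs = B @ s
    B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Compact representation (Byrd-Nocedal-Schnabel):
# B_k = B_0 - [B_0 S, Y] M^{-1} [B_0 S, Y]^T, where
# M = [[S^T B_0 S, L], [L^T, -D]], with L the strictly lower
# triangular part and D the diagonal of S^T Y.
SY = S.T @ Y
L = np.tril(SY, -1)
D = np.diag(np.diag(SY))
Psi = np.hstack([B0 @ S, Y])
M = np.block([[S.T @ B0 @ S, L], [L.T, -D]])
Bk = B0 - Psi @ np.linalg.solve(M, Psi.T)

print(np.allclose(B, Bk))  # the recursive and compact forms agree
```

The practical point of this factorization, and of the limited-memory strategies mentioned above, is that only the thin matrices S and Y (and the small 2k-by-2k middle matrix) need to be stored and updated, rather than the dense n-by-n matrix B_k.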



