A fast quasi-Newton-type method for large-scale stochastic optimisation

09/29/2018
by Adrian Wills, et al.

In recent years there has been increased interest in stochastic adaptations of limited-memory quasi-Newton methods, which, compared with pure gradient-based routines, can improve convergence by incorporating second-order information. In this work we propose a direct least-squares approach that is conceptually similar to the limited-memory quasi-Newton methods but computes the search direction in a slightly different way. This is achieved in a fast and numerically robust manner by maintaining a Cholesky factor of low dimension. The direction computation is combined with a stochastic line search that enforces the Wolfe condition in a backtracking manner, adaptively modifying the step length according to the optimisation progress. We support the new algorithm with several theoretical results guaranteeing its performance, and demonstrate that performance on real-world benchmark problems, where the method improves upon already established methods.
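The abstract describes the two main ingredients only at a high level. As a rough illustration, the numpy sketch below shows (i) a quasi-Newton-type direction obtained from a least-squares fit whose only factorisation is the Cholesky factor of a small Gram matrix, and (ii) a backtracking test of the sufficient-decrease (first) Wolfe condition. All names, constants, and the particular least-squares model are assumptions made for illustration; this is not the authors' algorithm.

```python
import numpy as np

def ls_direction(g, S, Y, reg=1e-8):
    """Quasi-Newton-type direction from a least-squares fit (illustrative).

    Fits an inverse-Hessian approximation H satisfying H @ Y = S in the
    minimum-norm least-squares sense, H = S (Y^T Y)^{-1} Y^T, and returns
    p = -H @ g. Only the small m x m Gram matrix Y^T Y is factorised,
    via its Cholesky factor, so the cost stays low for large n.

    S, Y : n x m matrices of recent parameter and gradient differences.
    """
    m = Y.shape[1]
    G = Y.T @ Y + reg * np.eye(m)   # small Gram matrix, m << n
    L = np.linalg.cholesky(G)       # low-dimensional Cholesky factor
    # Two triangular solves give coef = G^{-1} (Y^T g)
    coef = np.linalg.solve(L.T, np.linalg.solve(L, Y.T @ g))
    return -S @ coef

def backtracking_wolfe(f, grad, x, p, alpha, c1=1e-4, shrink=0.5, max_iter=30):
    """Shrink the trial step until the sufficient-decrease (first Wolfe)
    condition holds; f and grad are evaluated on one fixed mini-batch."""
    fx, slope = f(x), grad(x) @ p   # slope should be negative for descent
    for _ in range(max_iter):
        if f(x + alpha * p) <= fx + c1 * alpha * slope:
            return alpha            # sufficient decrease achieved
        alpha *= shrink             # backtrack
    return alpha
```

A practical driver would fall back to the negative gradient whenever `grad(x) @ p` is non-negative, and would warm-start `alpha` from the previously accepted step so that the step length adapts to the optimisation progress; the paper's full procedure, including its treatment of the curvature part of the Wolfe condition, may differ from this sketch.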

Related research

09/09/2019 · A Stochastic Quasi-Newton Method with Nesterov's Accelerated Gradient
Incorporating second order curvature information in gradient based metho...

09/03/2019 · Stochastic quasi-Newton with line-search regularization
In this paper we present a novel quasi-Newton algorithm for use in stoch...

02/12/2018 · Stochastic quasi-Newton with adaptive step lengths for large-scale problems
We provide a numerically robust and fast method capable of exploiting th...

04/05/2017 · On the construction of probabilistic Newton-type algorithms
It has recently been shown that many of the existing quasi-Newton algori...

07/29/2022 · Compact representations of structured BFGS matrices
For general large-scale optimization problems compact representations ex...

04/02/2020 · Using gradient directions to get global convergence of Newton-type methods
The renewed interest in Steepest Descent (SD) methods following the work...

10/01/2020 · Single-stage gradient-based stellarator coil design: Optimization for near-axis quasi-symmetry
We present a new coil design paradigm for magnetic confinement in stella...
