Secant Penalized BFGS: A Noise Robust Quasi-Newton Method Via Penalizing The Secant Condition

10/03/2020
by Brian Irwin, et al.

In this paper, we introduce a new variant of the BFGS method designed to perform well when gradient measurements are corrupted by noise. We show that by treating the secant condition with a penalty method approach, one can smoothly interpolate between updating the inverse Hessian approximation with the original BFGS update formula and not updating it at all. Furthermore, we find that the curvature condition is smoothly relaxed as the interpolation moves towards not updating the inverse Hessian approximation, disappearing entirely when the approximation is left unchanged. These developments lead to an algorithm we call secant penalized BFGS (SP-BFGS), which relaxes the secant condition based on the amount of noise in the gradient measurements. Mathematically, SP-BFGS provides a means of incrementally updating the new inverse Hessian approximation with a controlled amount of bias towards the previous inverse Hessian approximation. Practically speaking, this can be used to replace the overwriting nature of the BFGS update with an averaging nature that resists the destructive effects of noise. We provide a convergence analysis of SP-BFGS, and present numerical results illustrating its performance in the presence of noisy gradients.
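The interpolation described in the abstract can be made concrete with a short sketch. The update below uses the familiar rank-two form of the BFGS inverse Hessian update, with coefficients omega and gamma chosen so that the update reproduces standard BFGS as the penalty parameter beta tends to infinity and leaves the approximation unchanged as beta tends to zero. The function name sp_bfgs_update, the specific coefficient expressions, and the noise-based choice of beta at the end are all assumptions reconstructed from the abstract's description, not the paper's verbatim formulas; consult the full text for the exact SP-BFGS update.

```python
import numpy as np

def sp_bfgs_update(H, s, y, beta):
    """Inverse Hessian update interpolating between standard BFGS and no update.

    H    : current inverse Hessian approximation (n x n, symmetric)
    s    : step,            s = x_{k+1} - x_k
    y    : gradient change, y = g_{k+1} - g_k (possibly noisy)
    beta : penalty parameter; beta -> inf recovers the BFGS update,
           beta -> 0 returns H unchanged.

    NOTE: the coefficient expressions below are an assumed reconstruction
    chosen only to reproduce the two limits described in the abstract.
    """
    sty = float(s @ y)
    if np.isinf(beta):
        omega = 1.0 / sty                       # standard BFGS limit (requires s'y > 0)
        gamma = omega
    else:
        omega = 1.0 / (sty + 1.0 / beta)        # -> 1/s'y as beta -> inf, -> 0 as beta -> 0
        gamma = omega**2 * (sty + 2.0 / beta)   # -> 1/s'y as beta -> inf, -> 0 as beta -> 0
    I = np.eye(len(s))
    V = I - omega * np.outer(s, y)
    return V @ H @ V.T + gamma * np.outer(s, s)


# Illustrative use: beta chosen inversely proportional to an estimated
# gradient noise level sigma (this schedule is an assumption, not from the paper).
n = 5
rng = np.random.default_rng(0)
H = np.eye(n)
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)   # noisy gradient difference
sigma = 0.1
beta = 1.0 / sigma
H_new = sp_bfgs_update(H, s, y, beta)
```

Accurate gradients would then correspond to a large beta (near-standard BFGS), while very noisy gradients would correspond to a small beta, biasing the new approximation towards the previous one, which is the averaging behavior the abstract describes.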


Related research:

01/27/2014 · A Stochastic Quasi-Newton Method for Large-Scale Optimization
The question of how to incorporate curvature information in stochastic a...

02/16/2023 · Online Learning Guided Curvature Approximation: A Quasi-Newton Method with Global Non-Asymptotic Superlinear Convergence
Quasi-Newton algorithms are among the most popular iterative methods for...

06/27/2023 · Limited-Memory Greedy Quasi-Newton Method with Non-asymptotic Superlinear Convergence Rate
Non-asymptotic convergence analysis of quasi-Newton methods has gained a...

06/17/2022 · FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning
Newton-type methods are popular in federated learning due to their fast ...

05/28/2019 · Distributed estimation of the inverse Hessian by determinantal averaging
In distributed optimization and distributed numerical linear algebra, we...

03/18/2021 · Hessian Initialization Strategies for L-BFGS Solving Non-linear Inverse Problems
L-BFGS is the state-of-the-art optimization method for many large scale ...

04/18/2021 · A polarization tensor approximation for the Hessian in iterative solvers for non-linear inverse problems
For many inverse parameter problems for partial differential equations i...
