Hessian Initialization Strategies for L-BFGS Solving Non-linear Inverse Problems

03/18/2021
by Hari Om Aggrawal, et al.

L-BFGS is the state-of-the-art optimization method for many large-scale inverse problems. It has a small memory footprint and achieves superlinear convergence. The method approximates the Hessian based on an initial approximation and an update rule that models the current local curvature information. The initial approximation greatly affects the scaling of a search direction and the overall convergence of the method. We propose a novel, simple, and effective way to initialize the Hessian. Typically, the objective function is a sum of a data-fidelity term and a regularizer. Often, the Hessian of the data-fidelity term is computationally challenging, whereas the regularizer's Hessian is easy to compute. We replace the Hessian of the data-fidelity term with a scalar and keep the Hessian of the regularizer to initialize the Hessian approximation at every iteration. The scalar satisfies the secant equation in the sense of ordinary least squares, total least squares, and geometric mean regression. Our new strategy not only leads to faster convergence, but the quality of the numerical solutions is also generally superior to that of simple scaling-based strategies. Specifically, the proposed schemes based on the ordinary least squares formulation and geometric mean regression outperform the state-of-the-art schemes. Implementing our strategy requires only a small change to a standard L-BFGS code. Our experiments on convex quadratic problems and non-convex image registration problems confirm the effectiveness of the proposed approach.
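To make the idea concrete, the following is a minimal sketch (not the authors' reference implementation) of an L-BFGS two-loop recursion whose initial Hessian at each iteration is B0 = gamma * I + H_reg, i.e. the data-fidelity Hessian replaced by a secant-fitted scalar while the regularizer Hessian H_reg is kept. The scalar formulas are standard zero-intercept ordinary-least-squares, total-least-squares, and geometric-mean-regression fits of the secant pair (s, y); whether they coincide with the paper's exact definitions is an assumption, and the function names (secant_scalar, make_B0_inv, lbfgs_direction) and the Laplacian regularizer in the demo are purely illustrative.

```python
# A minimal sketch, assuming NumPy/SciPy; not the paper's reference code.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def secant_scalar(s, y, kind="ols"):
    """Fit a scalar gamma to the secant equation gamma * s ~= y.

    Textbook zero-intercept fits; whether they match the paper's exact
    OLS/TLS/GMR definitions is an assumption.
    """
    sy, ss, yy = float(s @ y), float(s @ s), float(y @ y)
    if kind == "ols":   # ordinary least squares: argmin_g ||g*s - y||^2
        return sy / ss
    if kind == "tls":   # orthogonal (total least squares) fit through the origin
        return ((yy - ss) + np.sqrt((yy - ss) ** 2 + 4.0 * sy ** 2)) / (2.0 * sy)
    if kind == "gmr":   # geometric mean regression (assumes curvature s @ y > 0)
        return np.sqrt(yy / ss)
    raise ValueError(f"unknown kind: {kind}")


def make_B0_inv(gamma, H_reg):
    """Initial Hessian B0 = gamma*I + H_reg: the data-fidelity Hessian is
    replaced by the secant-fitted scalar, the regularizer Hessian is kept."""
    B0 = sp.csc_matrix(gamma * sp.eye(H_reg.shape[0]) + H_reg)
    return spla.factorized(B0)  # returns q -> B0^{-1} q (sparse LU, reusable)


def lbfgs_direction(grad, s_hist, y_hist, apply_B0_inv):
    """Standard two-loop recursion; only the initial Hessian application
    (the apply_B0_inv call) differs from a plain L-BFGS implementation."""
    q = grad.copy()
    rhos = [1.0 / float(y @ s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    for s, y, rho in zip(reversed(s_hist), reversed(y_hist), reversed(rhos)):
        a = rho * float(s @ q)
        alphas.append(a)
        q = q - a * y
    r = apply_B0_inv(q)                      # r = B0^{-1} q
    for (s, y, rho), a in zip(zip(s_hist, y_hist, rhos), reversed(alphas)):
        r = r + (a - rho * float(y @ r)) * s
    return -r                                # search direction


# Illustrative use with a hypothetical 1-D Laplacian as the regularizer Hessian.
n = 64
H_reg = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n))
A = sp.csc_matrix(H_reg + 0.5 * sp.eye(n))   # stand-in "true" Hessian for the demo
s = np.random.randn(n)
y = A @ s                                    # secant pair with positive curvature
gamma = secant_scalar(s, y, kind="ols")
apply_B0_inv = make_B0_inv(gamma, H_reg)
direction = lbfgs_direction(np.random.randn(n), [s], [y], apply_B0_inv)
```

In a full solver, the factorization of B0 would be rebuilt whenever gamma or the regularizer Hessian changes, and the returned direction would feed into a line search exactly as in a standard L-BFGS loop; everything outside the apply_B0_inv call is unchanged, consistent with the abstract's remark that only a small change to a standard L-BFGS code is needed.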


