
SGN: Sparse Gauss-Newton for Accelerated Sensitivity Analysis

by Jonas Zehnder, et al.
ETH Zurich
Université de Montréal

We present a sparse Gauss-Newton solver for accelerated sensitivity analysis with applications to a wide range of equilibrium-constrained optimization problems. Dense Gauss-Newton solvers have shown promising convergence rates for inverse problems, but the cost of assembling and factorizing the associated matrices has so far been a major stumbling block. In this work, we show how the dense Gauss-Newton Hessian can be transformed into an equivalent sparse matrix that can be assembled and factorized much more efficiently. This leads to drastically reduced computation times for many inverse problems, which we demonstrate on a diverse set of examples. We furthermore show links between sensitivity analysis and nonlinear programming approaches based on Lagrange multipliers and prove equivalence under specific assumptions that apply for our problem setting.
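To make the idea concrete, here is a hedged toy sketch (not the paper's implementation) of how a dense reduced Gauss-Newton system can be replaced by an equivalent sparse saddle-point system. Assume an equilibrium constraint c(x, p) = 0 with sparse Jacobians A = ∂c/∂x and B = ∂c/∂p, a symmetric positive-definite Gauss-Newton Hessian H, and an objective gradient g in the state x; all matrix names and sizes below are illustrative, not from the paper. The sensitivity matrix S = -A⁻¹B is dense, so the reduced system SᵀHS Δp = -Sᵀg is dense; the same step Δp can be recovered from a fully sparse system in (Δx, Δp, λ).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

rng = np.random.default_rng(0)
n, m = 40, 6  # toy sizes: n states, m parameters (the paper targets large n)

# Hypothetical sparse problem data (diagonally dominant so the solves are
# well posed): A = dc/dx, B = dc/dp, H = GN Hessian, g = objective gradient.
A = sp.random(n, n, density=0.1, random_state=0) + sp.eye(n) * n
B = sp.csr_matrix(rng.standard_normal((n, m)))   # dense pattern, for simplicity
Hd = sp.random(n, n, density=0.1, random_state=1)
H = (Hd + Hd.T) + sp.eye(n) * n                  # symmetric positive definite
g = rng.standard_normal(n)

# Dense route: form the sensitivity matrix S = -A^{-1} B (n dense columns),
# then assemble and solve with the dense m x m reduced Hessian S^T H S.
S = -spsolve(A.tocsc(), B.tocsc()).toarray()
dp_dense = np.linalg.solve(S.T @ (H @ S), -S.T @ g)

# Sparse route: one symmetric saddle-point system; every block stays sparse,
# so it can be assembled and factorized far more cheaply for large n.
#   [ H    0    A^T ] [dx]   [-g]
#   [ 0    0    B^T ] [dp] = [ 0]   (eliminating dx and lambda recovers
#   [ A    B    0   ] [la]   [ 0]    S^T H S dp = -S^T g)
Z_nm = sp.csr_matrix((n, m))
Z_mm = sp.csr_matrix((m, m))
K = sp.bmat([[H,      Z_nm, A.T],
             [Z_nm.T, Z_mm, B.T],
             [A,      B,    None]], format='csc')
rhs = np.concatenate([-g, np.zeros(m), np.zeros(n)])
dp_sparse = spsolve(K, rhs)[n:n + m]

assert np.allclose(dp_dense, dp_sparse)
```

Eliminating Δx and λ from the sparse system reproduces the dense reduced step exactly, which is the sense in which the two formulations are "equivalent"; the payoff is that the sparse matrix K never requires forming S.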

