
SGN: Sparse Gauss-Newton for Accelerated Sensitivity Analysis

07/06/2021
by   Jonas Zehnder, et al.
ETH Zurich
Université de Montréal

We present a sparse Gauss-Newton solver for accelerated sensitivity analysis with applications to a wide range of equilibrium-constrained optimization problems. Dense Gauss-Newton solvers have shown promising convergence rates for inverse problems, but the cost of assembling and factorizing the associated matrices has so far been a major stumbling block. In this work, we show how the dense Gauss-Newton Hessian can be transformed into an equivalent sparse matrix that can be assembled and factorized much more efficiently. This leads to drastically reduced computation times for many inverse problems, which we demonstrate on a diverse set of examples. We furthermore show links between sensitivity analysis and nonlinear programming approaches based on Lagrange multipliers and prove equivalence under specific assumptions that apply for our problem setting.
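The abstract's key idea, replacing the dense Gauss-Newton Hessian with an equivalent sparse system, can be illustrated with the standard null-space equivalence from equality-constrained optimization. The sketch below is a toy reconstruction, not the paper's implementation: for an equilibrium constraint c(x, p) = 0 with Jacobians A = ∂c/∂x and B = ∂c/∂p, the sensitivity matrix S = −A⁻¹B is dense, so the reduced Gauss-Newton Hessian SᵀW S-style product is dense too. The same parameter step can instead be read off an augmented KKT-style system built only from the (sparse) blocks A, B, and W. All matrix names here are illustrative assumptions, and the "sparse" system is assembled densely for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3  # state and parameter dimensions (toy sizes)

# Equilibrium-constraint Jacobians: A = dc/dx (invertible), B = dc/dp.
A = rng.standard_normal((n, n)) + n * np.eye(n)
B = rng.standard_normal((n, m))

# Positive-definite objective Hessian block W over (x, p), and gradient g.
M = rng.standard_normal((n + m, n + m))
W = M.T @ M + np.eye(n + m)
g = rng.standard_normal(n + m)

# --- Dense reduced Gauss-Newton step (classical sensitivity analysis) ---
S = np.linalg.solve(A, -B)          # sensitivity matrix dx/dp (dense)
Z = np.vstack([S, np.eye(m)])       # basis for the null space of C = [A B]
H_dense = Z.T @ W @ Z               # dense reduced Gauss-Newton Hessian
dp_dense = np.linalg.solve(H_dense, -Z.T @ g)

# --- Equivalent augmented (KKT) system, using only sparse-structured blocks ---
C = np.hstack([A, B])
K = np.block([[W, C.T],
              [C, np.zeros((n, n))]])
rhs = np.concatenate([-g, np.zeros(n)])
sol = np.linalg.solve(K, rhs)
dp_sparse = sol[n:n + m]            # parameter step from the augmented solve

print(np.allclose(dp_dense, dp_sparse))  # → True
```

Because C Z = A S + B = 0 and Zᵀ Cᵀ λ vanishes, the step recovered from the augmented system coincides with the dense reduced step; in a real solver the augmented matrix would be assembled and factorized with a sparse direct solver, which is where the speedup comes from.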
