
A Bregman-Kaczmarz method for nonlinear systems of equations

03/15/2023
by   Robert Gower, et al.

We propose a new randomized method for solving systems of nonlinear equations which can find sparse solutions or solutions under certain simple constraints. The scheme only requires gradients of the component functions and uses Bregman projections onto the solution space of a Newton equation. In the special case of Euclidean projections, the method is known as the nonlinear Kaczmarz method. Furthermore, if the component functions are nonnegative, we are in the setting of optimization under the interpolation assumption, and the method reduces to SGD with the recently proposed stochastic Polyak step size. For general Bregman projections, our method is a stochastic mirror descent with a novel adaptive step size. We prove that in the convex setting each iteration of our method results in a smaller Bregman distance to exact solutions than the standard Polyak step. Our generalization to Bregman projections comes at the price that a convex one-dimensional optimization problem must be solved in each iteration, which can typically be done with globalized Newton iterations. Convergence is proved in two classical settings of nonlinearity: for convex nonnegative functions and locally for functions that satisfy the tangential cone condition. Finally, we show examples in which the proposed method outperforms similar methods with the same memory requirements.
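The Euclidean special case mentioned in the abstract admits a compact sketch: at each step one component function f_i is sampled and the iterate is projected onto the solution set of the linearized (Newton) equation f_i(x_k) + ⟨∇f_i(x_k), x − x_k⟩ = 0, which gives the closed-form update x ← x − (f_i(x)/‖∇f_i(x)‖²)∇f_i(x). The following is a minimal illustration of that update on a hypothetical toy system, not the authors' implementation; the general Bregman variant would replace this Euclidean projection with a Bregman projection requiring a one-dimensional solve.

```python
import numpy as np

def nonlinear_kaczmarz(funcs, grads, x0, iters=2000, seed=0):
    """Euclidean nonlinear Kaczmarz iteration (sketch).

    At each step, sample a component f_i and project the iterate onto
    {z : f_i(x) + <grad f_i(x), z - x> = 0}, the solution space of the
    Newton equation for that component. For nonnegative f_i this step
    coincides with SGD using the stochastic Polyak step size.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = len(funcs)
    for _ in range(iters):
        i = rng.integers(m)
        fi, gi = funcs[i](x), grads[i](x)
        g2 = gi @ gi
        if g2 > 0.0:
            x = x - (fi / g2) * gi  # exact projection onto the linearization
    return x

# Illustrative toy system (not from the paper):
#   f_1(x) = x1^2 + x2^2 - 1 = 0   (unit circle)
#   f_2(x) = x1 - x2 = 0           (diagonal)
# with a solution at (1/sqrt(2), 1/sqrt(2)).
funcs = [lambda x: x[0]**2 + x[1]**2 - 1.0,
         lambda x: x[0] - x[1]]
grads = [lambda x: np.array([2.0*x[0], 2.0*x[1]]),
         lambda x: np.array([1.0, -1.0])]

x = nonlinear_kaczmarz(funcs, grads, x0=[1.0, 0.0])
```

Starting from (1, 0), the iterates alternate between restoring x1 = x2 exactly and a Newton-like correction toward the circle, converging to (1/√2, 1/√2).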
