Convergence rates for critical point regularization

02/17/2023
by Daniel Obmann, et al.

Tikhonov regularization, the standard approach for solving inverse problems, minimizes the sum of a data-discrepancy term and a regularization term. Non-convex regularizers, such as those defined by trained neural networks, have proven effective in many cases. However, global minimizers can be hard to find in non-convex settings, which renders the existing theory inapplicable. A recent development in regularization theory relaxes this requirement by establishing convergence in terms of critical points instead of strict minimizers. This paper investigates convergence rates for critical point regularization using Bregman distances. Furthermore, we show that when near-minimization is implemented via an iterative algorithm, a finite number of iterations suffices without affecting the convergence rates.
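To make the setting concrete, here is a minimal numerical sketch of the idea the abstract describes: minimizing a Tikhonov-type functional with a (smooth) non-convex regularizer by gradient descent, stopped after a finite number of iterations, which yields an approximate critical point rather than a certified global minimizer. The forward operator, regularizer, and parameters below are illustrative choices, not the paper's actual setup.

```python
import numpy as np

# Tikhonov-type functional  T(x) = ||A x - y||^2 + alpha * R(x)
# with a smooth non-convex regularizer R (a trained network could take its
# place in practice). Gradient descent is run for finitely many iterations,
# producing an approximate critical point of T.

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 10))                 # hypothetical linear forward operator
x_true = rng.normal(size=10)
y = A @ x_true + 0.01 * rng.normal(size=20)   # noisy data

alpha = 0.1  # regularization parameter

def R(x):
    # non-convex regularizer: sum of log(1 + x_i^2)
    return np.sum(np.log(1.0 + x**2))

def grad_T(x):
    # gradient of the data term plus alpha times the gradient of R
    return 2.0 * A.T @ (A @ x - y) + alpha * 2.0 * x / (1.0 + x**2)

x = np.zeros(10)
step = 1.0 / (2.0 * np.linalg.norm(A, 2)**2 + alpha)  # crude 1/L step size
for _ in range(500):                                   # finitely many iterations
    x = x - step * grad_T(x)

# A small gradient norm certifies x as an approximate critical point of T.
print(np.linalg.norm(grad_T(x)))
```

After 500 steps the gradient norm is tiny, i.e. the iterate is a near-critical point; the paper's point is that convergence-rate guarantees can be stated for exactly such points, without ever verifying global minimality.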


Related research

12/23/2021: Shearlet-based regularization in statistical inverse learning with an application to X-ray tomography
  Statistical inverse learning theory, a field that lies at the intersecti...

08/08/2019: Sparse ℓ^q-regularization of inverse problems with deep learning
  We propose a sparse reconstruction framework for solving inverse problem...

07/28/2021: Global minimizers, strict and non-strict saddle points, and implicit regularization for deep linear neural networks
  In non-convex settings, it is established that the behavior of gradient-...

02/21/2020: Source Conditions for non-quadratic Tikhonov Regularisation
  In this paper we consider convex Tikhonov regularisation for the solutio...

02/01/2020: Deep synthesis regularization of inverse problems
  Recently, a large number of efficient deep learning methods for solving ...

06/15/2021: Non-asymptotic convergence bounds for Wasserstein approximation using point clouds
  Several issues in machine learning and inverse problems require to gener...

12/25/2022: Gromov-Wasserstein Distances: Entropic Regularization, Duality, and Sample Complexity
  The Gromov-Wasserstein (GW) distance quantifies dissimilarity between me...
