Implicit regularization for convex regularizers

06/17/2020
by Cesare Molinari, et al.

We study implicit regularization for over-parameterized linear models when the bias is convex but not necessarily strongly convex. We characterize the regularization property of a primal-dual, gradient-based approach, analyzing its convergence and, in particular, its stability in the presence of worst-case deterministic noise. As a main example, we specialize and illustrate the results for the problem of robust sparse recovery. Key to our analysis is a combination of ideas from regularization theory and optimization in the presence of errors. The theoretical results are complemented by experiments showing that state-of-the-art performance is achieved with considerable computational speed-ups.
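To make the setting concrete, here is a minimal, purely illustrative sketch of the kind of method the abstract refers to: a Chambolle-Pock-style primal-dual iteration applied to basis pursuit (min ||x||_1 subject to Ax = b), where stopping the iteration early plays the role of the regularization parameter. The paper's actual algorithm, step sizes, and stopping rule may differ; all function names, parameters, and data below are hypothetical.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def primal_dual_sparse_recovery(A, b, n_iters=300):
    # Chambolle-Pock-style primal-dual iteration for basis pursuit,
    # min ||x||_1 subject to Ax = b. With noisy data, the iteration
    # count acts as the regularization parameter (early stopping).
    m, n = A.shape
    L = np.linalg.norm(A, 2)          # spectral norm of A
    tau = sigma = 0.99 / L            # step sizes with tau*sigma*||A||^2 < 1
    x = np.zeros(n)                   # primal variable
    p = np.zeros(m)                   # dual variable
    x_bar = x.copy()                  # extrapolated primal point
    iterates = []
    for _ in range(n_iters):
        # Dual ascent step on the linear constraint Ax = b.
        p = p + sigma * (A @ x_bar - b)
        # Primal descent step followed by the l1 proximal map.
        x_new = soft_threshold(x - tau * (A.T @ p), tau)
        # Extrapolation, as in Chambolle-Pock with theta = 1.
        x_bar = 2 * x_new - x
        x = x_new
        iterates.append(x.copy())
    return iterates

# Illustrative usage on synthetic noisy data (all values hypothetical).
rng = np.random.default_rng(0)
m, n, s = 50, 200, 5                  # over-parameterized: n >> m
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
b = A @ x_true + 0.01 * rng.standard_normal(m)

iterates = primal_dual_sparse_recovery(A, b)
errors = [np.linalg.norm(x - x_true) for x in iterates]
print("best error at iteration", int(np.argmin(errors)))

With noisy data, the reconstruction error of such iterates typically decreases, reaches a minimum, and then degrades as the iterates start fitting the noise, which is why the stopping time can act as the regularization parameter in this kind of scheme.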

Related research

- Iterative regularization for low complexity regularizers (02/01/2022): Iterative regularization exploits the implicit bias of an optimization a...
- Transformed Primal-Dual Methods For Nonlinear Saddle Point Systems (08/04/2022): A transformed primal-dual (TPD) flow is developed for a class of nonline...
- Generalization Properties and Implicit Regularization for Multiple Passes SGM (05/26/2016): We study the generalization properties of stochastic gradient methods fo...
- From inexact optimization to learning via gradient concentration (06/09/2021): Optimization was recently shown to control the inductive bias in a learn...
- Learning Sparse Visual Representations with Leaky Capped Norm Regularizers (11/08/2017): Sparsity inducing regularization is an important part for learning over-...
- How to induce regularization in generalized linear models: A guide to reparametrizing gradient flow (08/09/2023): In this work, we analyze the relation between reparametrizations of grad...
- Robust Training under Label Noise by Over-parameterization (02/28/2022): Recently, over-parameterized deep networks, with increasingly more netwo...
