Implicit Regularization for Optimal Sparse Recovery

09/11/2019
by Tomas Vaškevičius, et al.

We investigate implicit regularization schemes for gradient descent methods applied to unpenalized least squares regression, with the goal of reconstructing a sparse signal from an underdetermined system of linear measurements under the restricted isometry assumption. For a given parametrization that yields a non-convex optimization problem, we show that prescribed choices of initialization, step size, and stopping time yield a statistically and computationally optimal algorithm: it achieves the minimax rate at a computational cost that matches the cost of reading the data, up to poly-logarithmic factors. Beyond minimax optimality, we show that our algorithm adapts to instance difficulty and attains a dimension-independent rate when the signal-to-noise ratio is high enough. Key to the computational efficiency of our method is an increasing step size scheme that adapts to refined estimates of the true solution. We validate our findings with numerical experiments and compare our algorithm against explicit ℓ_1 penalization. Going from hard instances to easy ones, our algorithm undergoes a phase transition, eventually matching least squares with oracle knowledge of the true support.
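To make the scheme concrete, here is a minimal NumPy sketch of gradient descent on the unpenalized least squares objective under a Hadamard-type parametrization w = u ⊙ u − v ⊙ v, a common non-convex change of variables for this kind of implicit sparse regularization. The initialization scale alpha, step size eta, and iteration budget n_iter below are illustrative stand-ins for the paper's prescribed initialization, increasing step-size scheme, and stopping time, not the actual schedule.

```python
import numpy as np

def implicit_sparse_recovery(X, y, alpha=1e-6, eta=0.05, n_iter=2000):
    """Gradient descent on the unpenalized least squares loss under the
    parametrization w = u*u - v*v. Small initialization (alpha) and early
    stopping (n_iter) act as implicit regularizers; alpha, eta, and n_iter
    are illustrative placeholders, not the paper's prescribed schedule
    (which in particular increases the step size adaptively)."""
    n, d = X.shape
    u = np.full(d, alpha)
    v = np.full(d, alpha)
    for _ in range(n_iter):
        w = u * u - v * v
        g = X.T @ (X @ w - y) / n  # gradient of the LS loss w.r.t. w
        # Chain rule through the parametrization (constant factor absorbed into eta):
        u -= eta * g * u
        v += eta * g * v
    return u * u - v * v

# Illustrative usage on a synthetic sparse recovery instance:
rng = np.random.default_rng(0)
n, d, k = 100, 500, 5
X = rng.standard_normal((n, d)) / np.sqrt(n)
w_true = np.zeros(d)
w_true[:k] = 1.0
y = X @ w_true + 0.01 * rng.standard_normal(n)
w_hat = implicit_sparse_recovery(X, y)
```

Because u and v start near zero, coordinates outside the true support stay small for many iterations while coordinates on the support grow multiplicatively, which is what makes early stopping act like an ℓ_1-type penalty here.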

Related research

08/12/2021 · Implicit Sparse Regularization: The Impact of Depth and Early Stopping
In this paper, we study the implicit bias of gradient descent for sparse...

03/22/2019 · Implicit Regularization via Hadamard Product Over-Parametrization in High-Dimensional Linear Regression
We consider Hadamard product parametrization as a change-of-variable (ov...

05/08/2021 · Nearly Minimax-Optimal Rates for Noisy Sparse Phase Retrieval via Early-Stopped Mirror Descent
This paper studies early-stopped mirror descent applied to noisy sparse ...

06/16/2020 · Least Squares Regression with Markovian Data: Fundamental Limits and Algorithms
We study the problem of least squares linear regression where the data-p...

06/01/2020 · Hadamard Wirtinger Flow for Sparse Phase Retrieval
We consider the problem of reconstructing an n-dimensional k-sparse sign...

05/26/2016 · Generalization Properties and Implicit Regularization for Multiple Passes SGM
We study the generalization properties of stochastic gradient methods fo...

06/06/2023 · Online Tensor Learning: Computational and Statistical Trade-offs, Adaptivity and Optimal Regret
We investigate a generalized framework for estimating latent low-rank te...
