Implicit Gradient Regularization

09/23/2020
by David G. T. Barrett, et al.

Gradient descent can be surprisingly good at optimizing deep neural networks without overfitting and without explicit regularization. We find that the discrete steps of gradient descent implicitly regularize models by penalizing gradient descent trajectories that have large loss gradients. We call this Implicit Gradient Regularization (IGR) and we use backward error analysis to calculate the size of this regularization. We confirm empirically that implicit gradient regularization biases gradient descent toward flat minima, where test errors are small and solutions are robust to noisy parameter perturbations. Furthermore, we demonstrate that the implicit gradient regularization term can be used as an explicit regularizer, allowing us to control this gradient regularization directly. More broadly, our work indicates that backward error analysis is a useful theoretical approach to the perennial question of how learning rate, model size, and parameter regularization interact to determine the properties of overparameterized models optimized with gradient descent.
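Concretely, the backward error analysis in the paper shows that gradient descent with learning rate h on a loss E(θ) follows, to leading order, the gradient flow of a modified loss Ẽ(θ) = E(θ) + (h/4)·‖∇E(θ)‖². The implicit penalty can therefore be made explicit by adding a term μ·‖∇E(θ)‖² to the training objective. Below is a minimal JAX sketch of that idea on a hypothetical toy loss; the function names, the toy loss, and the choice μ = h/4 are illustrative assumptions for this sketch, not the authors' implementation.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy loss standing in for a model's training loss E(theta).
def loss(theta):
    return jnp.sum((theta ** 2 - 1.0) ** 2)

def regularized_loss(theta, mu):
    # Explicit gradient regularization: E(theta) + mu * ||grad E(theta)||^2.
    # Backward error analysis suggests mu = h / 4 matches the implicit
    # penalty of plain gradient descent with learning rate h.
    g = jax.grad(loss)(theta)
    return loss(theta) + mu * jnp.sum(g ** 2)

@jax.jit
def step(theta, lr, mu):
    # One descent step on the regularized objective; differentiating
    # through jax.grad supplies the required second-order term.
    return theta - lr * jax.grad(regularized_loss)(theta, mu)

theta = jnp.array([0.3, -0.7])
for _ in range(200):
    theta = step(theta, lr=0.05, mu=0.05 / 4.0)
print(theta)  # approaches a minimum of the regularized objective
```

Because the regularizer depends on the gradient itself, each step of this sketch costs roughly one extra backward pass, but it lets the strength of the gradient penalty be tuned independently of the learning rate.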

Related Research

Implicit regularization in Heavy-ball momentum accelerated stochastic gradient descent (02/02/2023)
It is well known that the finite step-size (h) in Gradient Descent (GD) ...

Implicit Regularization for Group Sparsity (01/29/2023)
We study the implicit regularization of gradient descent towards structu...

Depth Without the Magic: Inductive Bias of Natural Gradient Descent (11/22/2021)
In gradient descent, changing how we parametrize the model can lead to d...

Per-Example Gradient Regularization Improves Learning Signals from Noisy Data (03/31/2023)
Gradient regularization, as described in <cit.>, is a highly effective t...

Explicit Regularization in Overparametrized Models via Noise Injection (06/09/2022)
Injecting noise within gradient descent has several desirable features. ...

On the Implicit Bias of Adam (08/31/2023)
In previous literature, backward error analysis was used to find ordinar...

Linear Range in Gradient Descent (05/11/2019)
This paper defines linear range as the range of parameter perturbations ...
