Incremental Learning in Diagonal Linear Networks

08/31/2022
by Raphaël Berthier, et al.

Diagonal linear networks (DLNs) are a toy simplification of artificial neural networks: they consist of a quadratic reparametrization of linear regression that induces a sparse implicit regularization. In this paper, we describe the trajectory of the gradient flow of DLNs in the limit of small initialization. We show that incremental learning is effectively performed in this limit: coordinates are activated one after another, and at each stage the iterate is the minimizer of the loss constrained to have support on the active coordinates only. This shows that the sparse implicit regularization of DLNs decreases over time. For technical reasons, this work is restricted to the underparametrized regime with anti-correlated features.
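To illustrate the mechanism described in the abstract, here is a minimal numerical sketch. It assumes the common quadratic reparametrization beta = u*u - v*v of the regression vector (the paper's exact parametrization may differ), uses gradient descent with a small step size as a proxy for gradient flow, and draws generic Gaussian features rather than the anti-correlated design the paper analyzes; the target vector, initialization scale and step size are illustrative choices, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5                               # underparametrized: more samples than features
X = rng.standard_normal((n, d))            # generic features (assumption; the paper uses anti-correlated ones)
beta_star = np.array([3.0, -2.0, 1.0, 0.0, 0.0])   # sparse target, purely illustrative
y = X @ beta_star

alpha = 1e-6                               # small initialization scale
u = np.full(d, alpha)
v = np.full(d, alpha)
lr = 1e-3                                  # small step size, proxy for gradient flow

for t in range(20_001):
    beta = u * u - v * v                   # quadratic reparametrization of the regression vector
    grad_beta = X.T @ (X @ beta - y) / n   # gradient of the least-squares loss w.r.t. beta
    # chain rule through the reparametrization: dL/du = 2*grad_beta*u, dL/dv = -2*grad_beta*v
    u, v = u - 2 * lr * grad_beta * u, v + 2 * lr * grad_beta * v
    if t % 2_000 == 0:
        print(t, np.round(u * u - v * v, 3))   # coordinates switch on one after another

In a typical run of this sketch, the coordinates with larger target magnitude escape their near-zero initialization first while the zero coordinates stay inactive, matching the incremental-learning picture of the abstract.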


Related research

04/02/2023  Saddle-to-Saddle Dynamics in Diagonal Linear Networks
In this paper we fully describe the trajectory of gradient flow over dia...

01/29/2023  Implicit Regularization for Group Sparsity
We study the implicit regularization of gradient descent towards structu...

09/26/2019  The Implicit Bias of Depth: How Incremental Learning Drives Generalization
A leading hypothesis for the surprising generalization of neural network...

05/09/2023  Robust Implicit Regularization via Weight Normalization
Overparameterized models may have many interpolating solutions; implicit...

07/13/2023  Implicit regularization in AI meets generalized hardness of approximation in optimization – Sharp results for diagonal linear networks
Understanding the implicit regularization imposed by neural network arch...

10/20/2021  Convergence Analysis and Implicit Regularization of Feedback Alignment for Deep Linear Networks
We theoretically analyze the Feedback Alignment (FA) algorithm, an effic...

07/15/2021  Lockout: Sparse Regularization of Neural Networks
Many regression and classification procedures fit a parameterized functi...
