Global convergence of ResNets: From finite to infinite width using linear parameterization

12/10/2021
by   Raphaël Barboni, et al.

Overparameterization is a key factor in explaining the global convergence of gradient descent (GD) for neural networks in the absence of convexity. Besides the well-studied lazy regime, infinite-width (mean-field) analyses have been developed for shallow networks, relying on convex optimization techniques. To bridge the gap between the lazy and mean-field regimes, we study Residual Networks (ResNets) whose residual blocks are linearly parameterized while remaining nonlinear. Such ResNets admit both infinite-depth and infinite-width limits, with residual blocks encoded in a Reproducing Kernel Hilbert Space (RKHS). In this limit, we prove a local Polyak-Łojasiewicz inequality; hence, every critical point is a global minimizer and a local convergence result for GD holds, recovering the lazy regime. In contrast with other mean-field studies, our result applies to both parametric and non-parametric cases under an expressivity condition on the residuals. Our analysis also yields a practical and quantified recipe: starting from a universal RKHS, Random Fourier Features are applied to obtain a finite-dimensional parameterization that satisfies our expressivity condition with high probability.
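For context, a Polyak-Łojasiewicz (PL) inequality lower-bounds the squared gradient norm of the loss L by the optimality gap. In its standard global form (the paper proves a local variant; mu > 0 and L* denote a constant and the infimum of the loss), it reads

    \|\nabla L(\theta)\|^2 \;\ge\; 2\mu \,\bigl(L(\theta) - L^*\bigr),

which makes the abstract's claim immediate: at any critical point the left-hand side vanishes, so L(theta) = L*, i.e. every critical point is a global minimizer.

The closing recipe can also be made concrete with a short sketch. The code below is a minimal illustration, not the authors' implementation: the names rff_features and LinearResidualBlock are hypothetical, and the Gaussian kernel is an assumed choice of universal RKHS. Each residual block computes x + V phi(x), where phi is a fixed Random Fourier Feature map, so the block is nonlinear in its input but linear in its trainable parameters V.

    import numpy as np

    def rff_features(x, W, b):
        # Random Fourier Features approximating a Gaussian kernel:
        # phi(x) = sqrt(2/m) * cos(W x + b), with W ~ N(0, sigma^{-2} I)
        # and b ~ Uniform[0, 2*pi). Here x is a single vector of shape (dim,).
        m = W.shape[0]
        return np.sqrt(2.0 / m) * np.cos(W @ x + b)

    class LinearResidualBlock:
        # Residual block x -> x + V @ phi(x): nonlinear in the input x,
        # but linear in the trainable parameters V (the paper's structural assumption).
        def __init__(self, dim, m, sigma=1.0, seed=None):
            rng = np.random.default_rng(seed)
            self.W = rng.normal(0.0, 1.0 / sigma, size=(m, dim))  # fixed random frequencies
            self.b = rng.uniform(0.0, 2.0 * np.pi, size=m)        # fixed random phases
            self.V = np.zeros((dim, m))                           # trainable weights

        def forward(self, x):
            return x + self.V @ rff_features(x, self.W, self.b)

    def resnet_forward(x, blocks):
        # A ResNet is the composition of residual blocks; depth is len(blocks).
        for block in blocks:
            x = block.forward(x)
        return x

Because phi is frozen at initialization and only the stacked V's are trained, gradient descent runs on a landscape of the kind covered by the local PL inequality above; as the feature count m grows, the RFF map approximates the universal RKHS, which is what lets the expressivity condition hold with high probability.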


Related research

05/11/2021  Global Convergence of Three-layer Neural Networks in the Mean Field Regime
In the mean field regime, neural networks are appropriately scaled so th...

01/30/2020  A Rigorous Framework for the Mean Field Limit of Multilayer Neural Networks
We develop a mathematically rigorous framework for multilayer neural net...

03/11/2020  A Mean-field Analysis of Deep ResNet and Beyond: Towards Provable Optimization Via Overparameterization From Depth
Training deep neural networks with stochastic gradient descent (SGD) can...

03/12/2020  Towards a General Theory of Infinite-Width Limits of Neural Classifiers
Obtaining theoretical guarantees for neural networks training appears to...

07/05/2023  From NeurODEs to AutoencODEs: a mean-field control framework for width-varying Neural Networks
In our work, we build upon the established connection between Residual N...

06/16/2020  A Note on the Global Convergence of Multilayer Neural Networks in the Mean Field Regime
In a recent work, we introduced a rigorous framework to describe the mea...

12/31/2020  Particle Dual Averaging: Optimization of Mean Field Neural Networks with Global Convergence Rate Analysis
We propose the particle dual averaging (PDA) method, which generalizes t...
