Global Convergence of Second-order Dynamics in Two-layer Neural Networks

07/14/2020
by Walid Krichene, et al.

Recent results have shown that for two-layer fully connected neural networks, gradient flow converges to a global optimum in the infinite-width limit, by making a connection between the mean-field dynamics and the Wasserstein gradient flow. These results were derived for first-order gradient flow, and a natural question is whether second-order dynamics, i.e., dynamics with momentum, exhibit a similar guarantee. We show that the answer is positive for the heavy ball method. In this case, the resulting integro-PDE is a nonlinear kinetic Fokker-Planck equation, and unlike the first-order case, it has no apparent connection with the Wasserstein gradient flow. Instead, we study the variations of a Lyapunov functional along the solution trajectories to characterize the stationary points and to prove convergence. While our results are asymptotic in the mean-field limit, numerical simulations indicate that global convergence may already occur for reasonably small networks.
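For reference, the heavy ball dynamics named in the abstract are the classical momentum scheme. The following is the generic textbook formulation (not an equation quoted from the paper): in continuous time, a position w_t and velocity v_t evolve under a friction coefficient gamma > 0 and a loss F, and the usual discretization uses step size alpha and momentum beta.

```latex
% Standard heavy ball dynamics (generic textbook form, not quoted from the paper):
% continuous time with friction gamma > 0, followed by the discrete update
% with step size alpha and momentum beta.
\[
\dot{w}_t = v_t, \qquad
\dot{v}_t = -\gamma\, v_t - \nabla F(w_t)
\]
\[
v_{k+1} = \beta\, v_k - \alpha\, \nabla F(w_k), \qquad
w_{k+1} = w_k + v_{k+1}
\]
```

To illustrate the closing remark that global convergence may already occur for reasonably small networks, here is a minimal, self-contained sketch that trains a small two-layer ReLU network with heavy ball updates on a toy regression task. The width, step size, momentum values, and target function are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Minimal sketch: heavy ball (momentum) training of a two-layer ReLU
# network on a toy 1-D regression task. Width, step size, momentum, and
# target are illustrative assumptions, not the paper's experiments.

rng = np.random.default_rng(0)
n, m = 256, 64                        # samples, hidden width
X = rng.uniform(-1.0, 1.0, (n, 1))
y = np.sin(3.0 * X[:, 0])             # toy regression target

W = rng.normal(0.0, 1.0, (1, m))      # input -> hidden weights
b = rng.uniform(-1.0, 1.0, m)         # hidden biases
a = rng.normal(0.0, 1.0, m) / m       # hidden -> output (1/m, mean-field style)

def forward(X):
    h = np.maximum(X @ W + b, 0.0)    # ReLU features, shape (n, m)
    return h, h @ a                   # features and predictions

def grads():
    h, pred = forward(X)
    r = (pred - y) / n                # gradient of 0.5 * mean squared error
    g_a = h.T @ r
    g_h = np.outer(r, a) * (h > 0.0)  # backprop through the ReLU
    return X.T @ g_h, g_h.sum(axis=0), g_a

alpha, beta = 0.01, 0.9               # step size and momentum
vW, vb, va = np.zeros_like(W), np.zeros_like(b), np.zeros_like(a)
for step in range(5001):
    gW, gb, ga = grads()
    # Heavy ball update: v <- beta * v - alpha * grad;  w <- w + v
    vW = beta * vW - alpha * gW
    vb = beta * vb - alpha * gb
    va = beta * va - alpha * ga
    W, b, a = W + vW, b + vb, a + va
    if step % 1000 == 0:
        _, pred = forward(X)
        print(step, 0.5 * np.mean((pred - y) ** 2))
```

Varying the width m in a sketch like this is one informal way to probe how quickly finite-width behaviour approaches the mean-field regime the paper analyzes.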


Related research

10/17/2021
A Riemannian Mean Field Formulation for Two-layer Neural Networks with Batch Normalization
The training dynamics of two-layer neural networks with batch normalizat...

05/19/2019
Mean-Field Langevin Dynamics and Energy Landscape of Neural Networks
We present a probabilistic analysis of the long-time behaviour of the no...

06/17/2018
Exact information propagation through fully-connected feed forward neural networks
Neural network ensembles at initialisation give rise to the trainability...

10/28/2022
A Functional-Space Mean-Field Theory of Partially-Trained Three-Layer Neural Networks
To understand the training dynamics of neural networks (NNs), prior stud...

10/13/2022
Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence
The stochastic heavy ball method (SHB), also known as stochastic gradien...

11/18/2021
Gradient flows on graphons: existence, convergence, continuity equations
Wasserstein gradient flows on probability measures have found a host of ...

11/16/2022
On the symmetries in the dynamics of wide two-layer neural networks
We consider the idealized setting of gradient flow on the population ris...
