A Riemannian Mean Field Formulation for Two-layer Neural Networks with Batch Normalization

10/17/2021
by   Chao Ma, et al.

The training dynamics of two-layer neural networks with batch normalization (BN) are studied. They can be rewritten as the training dynamics of a neural network without BN on a Riemannian manifold, which identifies BN's effect as a change of metric in the parameter space. The infinite-width limit of two-layer neural networks with BN is then considered, and a mean-field formulation of the training dynamics is derived. This mean-field dynamics is shown to be a Wasserstein gradient flow on the manifold, and theoretical analysis of the well-posedness and convergence of this gradient flow is provided.
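As a rough sketch of the setting described above (the notation here is illustrative and assumed, not taken verbatim from the paper), the mean-field limit of a two-layer BN network and its gradient-flow dynamics can be written as:

```latex
% Two-layer network with batch normalization in the mean-field limit.
% \rho is a probability measure over the parameters (a, w) of a single
% neuron; \sigma is the activation; BN normalizes the pre-activation
% over the data distribution. Notation is a sketch, not the paper's.
f(x; \rho) = \int a \, \sigma\!\big(\mathrm{BN}(w^{\top} x)\big) \, d\rho(a, w)

% Training minimizes a risk R(\rho), and the mean-field dynamics take
% the form of a gradient flow of R in the Wasserstein geometry, with
% the Euclidean metric replaced by the Riemannian metric G induced by BN:
\partial_t \rho_t
  = \nabla \cdot \Big( \rho_t \, G(a, w)^{-1}
      \nabla \frac{\delta R}{\delta \rho}(\rho_t) \Big)
```

Here $G$ is the (assumed) metric tensor on parameter space coming from the BN reparameterization, and $\delta R / \delta \rho$ is the first variation of the risk; with $G$ equal to the identity this reduces to the standard Wasserstein gradient flow of mean-field two-layer networks.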


