Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent

02/12/2020
by David Holzmüller, et al.

We prove that two-layer (Leaky)ReLU networks initialized, e.g., with the widely used method proposed by He et al. (2015) and trained with gradient descent on a least-squares loss are not universally consistent. Specifically, we describe a large class of data-generating distributions for which, with high probability, gradient descent only finds a bad local minimum of the optimization landscape. In these cases, the learned network essentially performs linear regression even when the target function is non-linear. We further provide numerical evidence that this happens in practical situations and that stochastic gradient descent exhibits similar behavior.
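Below is a minimal NumPy sketch of the setting described above, not the authors' code: a two-layer ReLU network with He et al. (2015) initialization, trained by full-batch gradient descent on a least-squares loss, and compared against an ordinary linear (affine) fit of the same data. The width, learning rate, step count, and target function are illustrative assumptions, not values from the paper; whether the trained network ends up close to the linear fit depends on the data-generating distribution, which is the paper's central point.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D regression data with a non-linear target (assumed, not from the paper)
n, d, width = 256, 1, 64
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(3.0 * X[:, 0])

# He et al. (2015) initialization for the hidden layer; Gaussian output weights
W = rng.normal(0.0, np.sqrt(2.0 / d), size=(width, d))
b = np.zeros(width)
v = rng.normal(0.0, np.sqrt(2.0 / width), size=width)

def forward(X):
    pre = X @ W.T + b            # (n, width) pre-activations
    act = np.maximum(pre, 0.0)   # ReLU
    return pre, act, act @ v     # predictions, shape (n,)

# Full-batch gradient descent on the least-squares loss (1/2n) * sum (pred - y)^2
lr = 1e-2
for step in range(5000):
    pre, act, pred = forward(X)
    err = pred - y
    grad_v = act.T @ err / n
    grad_act = np.outer(err, v) * (pre > 0)   # backprop through the ReLU
    grad_W = grad_act.T @ X / n
    grad_b = grad_act.mean(axis=0)
    v -= lr * grad_v
    W -= lr * grad_W
    b -= lr * grad_b

# Ordinary linear regression (with intercept) on the same data for comparison
Xb = np.hstack([X, np.ones((n, 1))])
beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
lin_pred = Xb @ beta

_, _, nn_pred = forward(X)
print("network MSE:", np.mean((nn_pred - y) ** 2))
print("linear  MSE:", np.mean((lin_pred - y) ** 2))

Comparing the two printed mean-squared errors gives a quick check of whether the trained network has learned anything beyond the best affine predictor on the given data.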



Related research

04/06/2023  Training a Two Layer ReLU Network Analytically
    Neural networks are usually trained with different variants of gradient ...

09/28/2018  Efficiently testing local optimality and escaping saddles for ReLU networks
    We provide a theoretical algorithm for checking local optimality and esc...

12/13/2014  The Statistics of Streaming Sparse Regression
    We present a sparse analogue to stochastic gradient descent that is guar...

08/04/2022  Agnostic Learning of General ReLU Activation Using Gradient Descent
    We provide a convergence analysis of gradient descent for the problem of...

05/21/2020  Can Shallow Neural Networks Beat the Curse of Dimensionality? A mean field training perspective
    We prove that the gradient descent training of a two-layer neural networ...

07/20/2022  A note on the variation of geometric functionals
    Calculus of Variation combined with Differential Geometry as tools of mo...

05/27/2020  On the Convergence of Gradient Descent Training for Two-layer ReLU-networks in the Mean Field Regime
    We describe a necessary and sufficient condition for the convergence to ...
