Spurious Local Minima are Common in Two-Layer ReLU Neural Networks

12/24/2017
by Itay Safran, et al.

We consider the optimization problem associated with training simple ReLU neural networks of the form x ↦ ∑_{i=1}^k max{0, w_i^⊤ x} with respect to the squared loss. We provide a computer-assisted proof that even if the input distribution is standard Gaussian, even if the dimension is unrestricted, and even if the target values are generated by such a network with orthonormal parameter vectors, the problem can still have spurious local minima once k ≥ 6. By a continuity argument, this implies that in high dimensions, nearly all target networks of the relevant sizes lead to spurious local minima. Moreover, we conduct experiments which show that the probability of hitting such local minima is quite high, and increases with the network size. On the positive side, mild over-parameterization appears to drastically reduce such local minima, indicating that an over-parameterization assumption is necessary to obtain a positive result in this setting.
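To make the setup concrete, here is a minimal NumPy sketch of the kind of experiment the abstract describes: training a width-k ReLU network by gradient descent on standard Gaussian inputs, with targets generated by a network of the same form whose weight vectors are orthonormal, and checking whether training stalls at a nonzero loss. The names (V, W) and all hyperparameters (d, learning rate, batch size, step count) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 20, 6                      # input dimension, number of hidden neurons
V = np.eye(d)[:k]                 # orthonormal target weight vectors (rows)

def loss_and_grad(W, X):
    """Squared loss of x -> sum_i max(0, w_i.x) against the target network,
    plus its gradient w.r.t. W, on a batch X of standard Gaussian inputs."""
    pred = np.maximum(W @ X.T, 0).sum(axis=0)       # shape (batch,)
    target = np.maximum(V @ X.T, 0).sum(axis=0)     # shape (batch,)
    err = pred - target
    # Gradient of 0.5 * mean(err^2): ReLU activity indicator times err times x
    active = (W @ X.T > 0).astype(float)            # shape (k, batch)
    grad = (active * err) @ X / X.shape[0]          # shape (k, d)
    return 0.5 * np.mean(err ** 2), grad

W = rng.standard_normal((k, d)) / np.sqrt(d)        # random initialization
lr, batch = 0.05, 4096
for step in range(20001):
    X = rng.standard_normal((batch, d))
    loss, grad = loss_and_grad(W, X)
    W -= lr * grad
    if step % 5000 == 0:
        print(f"step {step:6d}  loss {loss:.6f}")

# A final loss bounded away from 0, with a vanishing gradient, suggests
# convergence to a spurious local minimum rather than the global one.
```

Repeating the run over many random initializations gives an empirical estimate of how often such spurious minima are hit; increasing the number of trained neurons beyond k is the over-parameterization regime the abstract refers to.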


Related research

02/12/2020  Understanding Global Loss Landscape of One-hidden-layer ReLU Neural Networks
For one-hidden-layer ReLU networks, we show that all local minima are gl...

06/01/2020  The Effects of Mild Over-parameterization on the Optimization Landscape of Shallow ReLU Neural Networks
We study the effects of mild over-parameterization on the optimization l...

12/26/2019  Spurious Local Minima of Shallow ReLU Networks Conform with the Symmetry of the Target Model
We consider the optimization problem associated with fitting two-layer R...

11/20/2018  Effect of Depth and Width on Local Minima in Deep Learning
In this paper, we analyze the effects of depth and width on the quality ...

05/31/2023  Mildly Overparameterized ReLU Networks Have a Favorable Loss Landscape
We study the loss landscape of two-layer mildly overparameterized ReLU n...

07/21/2021  Analytic Study of Families of Spurious Minima in Two-Layer ReLU Neural Networks
We study the optimization problem associated with fitting two-layer ReLU...

12/29/2017  The Multilinear Structure of ReLU Networks
We study the loss surface of neural networks equipped with a hinge loss ...
