The Effects of Mild Over-parameterization on the Optimization Landscape of Shallow ReLU Neural Networks

06/01/2020
by   Itay Safran, et al.

We study the effects of mild over-parameterization on the optimization landscape of a simple ReLU neural network of the form x ↦ ∑_{i=1}^{k} max{0, w_i^⊤ x}, in a well-studied teacher-student setting where the target values are generated by a network of the same architecture, and where one directly optimizes the population squared loss with respect to Gaussian inputs. We prove that while the objective is strongly convex around the global minima when the teacher and student networks have the same number of neurons, it is not even locally convex after any amount of over-parameterization. Moreover, related desirable properties (e.g., one-point strong convexity and the Polyak-Łojasiewicz condition) also fail to hold even locally. On the other hand, we establish that the objective remains one-point strongly convex in most directions (suitably defined). As for the non-global minima, we prove that adding even a single neuron turns a non-global minimum into a saddle point; this holds under some technical conditions which we validate empirically. These results provide a possible explanation for why recovering a global minimum becomes significantly easier when we over-parameterize, even if the amount of over-parameterization is very moderate.
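
The setting above is concrete enough to sketch in code. The following NumPy snippet is an illustrative sketch only, not the authors' implementation: it builds a teacher network and a mildly over-parameterized student of the form x ↦ ∑_i max{0, w_i^⊤ x}, and estimates the population squared loss over Gaussian inputs by Monte Carlo sampling. The input dimension, widths, and sample size are arbitrary choices for illustration.

import numpy as np

def relu_net(W, X):
    # Evaluate x -> sum_i max{0, w_i^T x} for each row x of X.
    # W has shape (width, d); X has shape (n_samples, d).
    return np.maximum(X @ W.T, 0.0).sum(axis=1)

def population_loss(W_student, W_teacher, n_samples=100_000, seed=0):
    # Monte Carlo estimate of E_{x ~ N(0, I)} (student(x) - teacher(x))^2,
    # approximating the population squared loss over Gaussian inputs.
    rng = np.random.default_rng(seed)
    d = W_teacher.shape[1]
    X = rng.standard_normal((n_samples, d))
    residual = relu_net(W_student, X) - relu_net(W_teacher, X)
    return np.mean(residual ** 2)

# Example (hypothetical sizes): teacher with k = 4 neurons in d = 10
# dimensions; the student has one extra neuron, i.e. mild
# over-parameterization with k + 1 = 5 neurons.
rng = np.random.default_rng(1)
d, k = 10, 4
W_teacher = rng.standard_normal((k, d))
W_student = rng.standard_normal((k + 1, d))
print(population_loss(W_student, W_teacher))

A loss of zero is attained whenever the student exactly reproduces the teacher function, e.g. by matching the teacher's neurons and zeroing out the extra one; the paper's results concern the landscape of this loss around such global minima.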

Related research

12/24/2017
Spurious Local Minima are Common in Two-Layer ReLU Neural Networks
We consider the optimization problem associated with training simple ReL...

06/11/2021
On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting
Deep learning empirically achieves high performance in many applications...

10/12/2022
Annihilation of Spurious Minima in Two-Layer ReLU Networks
We study the optimization problem associated with fitting two-layer ReLU...

02/04/2021
A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Network
While over-parameterization is widely believed to be crucial for the suc...

05/22/2018
Adding One Neuron Can Eliminate All Bad Local Minima
One of the main difficulties in analyzing neural networks is the non-con...

11/20/2018
Effect of Depth and Width on Local Minima in Deep Learning
In this paper, we analyze the effects of depth and width on the quality ...

09/28/2018
A theoretical framework for deep locally connected ReLU network
Understanding theoretical properties of deep and locally connected nonli...
