Landscape analysis for shallow ReLU neural networks: complete classification of critical points for affine target functions

03/19/2021
by Patrick Cheridito et al.

In this paper, we analyze the landscape of the true loss of a ReLU neural network with one hidden layer. We provide a complete classification of the critical points in the case where the target function is affine. In particular, we prove that local minima and saddle points have to be of a special form and show that there are no local maxima. Our approach is of a combinatorial nature and builds on a careful analysis of the different types of hidden neurons that can occur in a ReLU neural network.
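To make the setting concrete, the following sketch sets up the objects the abstract refers to: a one-hidden-layer ReLU network and its "true" (population) squared loss against an affine target. The paper studies this loss analytically; here it is only estimated numerically by Monte Carlo, and the input distribution (uniform on [0, 1]) and all parameter names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def shallow_relu(x, w, b, v, c):
    # One-hidden-layer ReLU network on scalar inputs:
    #   f(x) = sum_j v_j * relu(w_j * x + b_j) + c
    return relu(np.outer(x, w) + b) @ v + c

def true_loss_mc(w, b, v, c, a=1.0, t=0.0, n=100_000):
    # Monte Carlo estimate of the population L2 loss against the
    # affine target T(x) = a*x + t, with x ~ Uniform[0, 1]
    # (an assumed input distribution for illustration).
    x = rng.uniform(0.0, 1.0, size=n)
    return np.mean((shallow_relu(x, w, b, v, c) - (a * x + t)) ** 2)
```

For example, a single hidden neuron with w = v = 1 and b = c = 0 computes relu(x) = x on [0, 1], so the loss against the target T(x) = x vanishes; the critical points the paper classifies are the parameter configurations where the gradient of this population loss is zero.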


Related research

- Understanding Global Loss Landscape of One-hidden-layer ReLU Neural Networks (02/12/2020)
- All Local Minima are Global for Two-Layer ReLU Neural Networks: The Hidden Convex Optimization Landscape (06/10/2020)
- On minimal representations of shallow ReLU networks (08/12/2021)
- Finite sample expressive power of small-width ReLU networks (10/17/2018)
- The layer-wise L1 Loss Landscape of Neural Nets is more complex around local minima (05/06/2021)
- The Multilinear Structure of ReLU Networks (12/29/2017)
- Geometry of the Loss Landscape in Overparameterized Neural Networks: Symmetries and Invariances (05/25/2021)
