The loss landscape of overparameterized neural networks

04/26/2018
by Y Cooper, et al.

We explore some mathematical features of the loss landscape of overparameterized neural networks. A priori, one might imagine that the loss function L looks like a typical function from R^n to R: nonconvex, with discrete global minima. In this paper, we prove that in at least one important way, the loss function of an overparameterized neural network does not look like a typical function. If a neural net has n parameters and is trained on d data points, with n > d, we show that the locus M of global minima of L is usually not discrete, but rather an (n - d)-dimensional submanifold of R^n. In practice, neural nets commonly have orders of magnitude more parameters than data points, so this observation implies that M is typically a very high-dimensional subset of R^n.
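As a toy illustration of this dimension count (not an example from the paper), take a two-parameter model f(a, b; x) = a*b*x trained on a single data point (x0, y0) under squared loss, so n = 2 and d = 1. The global minima form the curve {(a, b) : a*b = y0/x0}, an (n - d) = 1 dimensional submanifold of R^2 rather than a discrete set of points. The short Python sketch below, with illustrative values x0 = 2 and y0 = 3, samples points along this curve and checks that each attains zero loss.

import numpy as np

x0, y0 = 2.0, 3.0              # the single training example (d = 1)
target = y0 / x0               # zero loss exactly when a * b equals this value

def loss(a, b):
    """Squared loss of the toy model f(a, b; x) = a * b * x on (x0, y0)."""
    return (a * b * x0 - y0) ** 2

# Sample distinct points on the curve a * b = y0 / x0 and verify that every
# one attains (numerically) zero loss: a 1-parameter family of global minima,
# matching the predicted dimension n - d = 2 - 1 = 1.
for a in np.linspace(0.5, 5.0, 10):
    b = target / a
    print(f"a = {a:.3f}, b = {b:.3f}, loss = {loss(a, b):.2e}")

Running this prints ten distinct parameter pairs, all with loss at machine precision, illustrating a continuum of global minima rather than isolated ones.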


research
04/20/2023

Interpolation property of shallow neural networks

We study the geometry of global minima of the loss landscape of overpara...
research
04/06/2018

The Loss Surface of XOR Artificial Neural Networks

Training an artificial neural network involves an optimization process o...
research
03/02/2018

Essentially No Barriers in Neural Network Energy Landscape

Training neural networks involves finding minima of a high-dimensional n...
research
11/29/2019

Barcodes as summary of objective function's topology

We apply the canonical forms (barcodes) of gradient Morse complexes to e...
research
04/28/2019

Support Vector Regression via a Combined Reward Cum Penalty Loss Function

In this paper, we introduce a novel combined reward cum penalty loss fun...
research
04/24/2020

Nonconvex penalization for sparse neural networks

Training methods for artificial neural networks often rely on over-param...
research
11/12/2022

On the High Symmetry of Neural Network Functions

Training neural networks means solving a high-dimensional optimization p...
