Interpolation property of shallow neural networks

04/20/2023
by Vlad-Raul Constantinescu, et al.

We study the geometry of the global minima of the loss landscape of overparametrized neural networks. In most optimization problems, the loss function is either convex, in which case there is a unique global minimum, or nonconvex, with a discrete set of global minima. In this paper, we prove that in the overparametrized regime a shallow neural network can interpolate any data set, i.e. the loss function attains a global minimum value of zero, provided that the activation function is not a polynomial of small degree. Moreover, if such a global minimum exists, then the locus of global minima contains infinitely many points. Furthermore, we give a characterization of the Hessian of the loss function evaluated at the global minima, and in the last section we provide a practical probabilistic method of finding the interpolation point.
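The interpolation property can be illustrated concretely. The following is a minimal numpy sketch, not the paper's construction: it only uses the general fact that, for a non-polynomial activation such as tanh, a hidden-feature matrix built from randomly drawn inner weights is generically full rank once the width is at least the number of samples, so output weights achieving exact interpolation (zero loss) can be found by linear algebra. All names, sizes, and the random-feature setup here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: n distinct inputs in R^d with arbitrary scalar targets.
n, d = 20, 3
X = rng.normal(size=(n, d))
y = rng.normal(size=n)

# Shallow network f(x) = sum_j c_j * tanh(a_j . x + b_j) with width m >= n
# (the overparametrized regime discussed in the abstract).
m = n
A = rng.normal(size=(m, d))   # inner weights, drawn at random and kept fixed
b = rng.normal(size=m)        # inner biases

# Hidden-feature matrix H[i, j] = tanh(a_j . x_i + b_j); generically full rank
# when the activation is not a low-degree polynomial.
H = np.tanh(X @ A.T + b)

# Output weights c solving H c = y; with full row rank the residual is zero,
# i.e. the squared loss attains its global minimum value of zero.
c, *_ = np.linalg.lstsq(H, y, rcond=None)

print("max interpolation error:", np.max(np.abs(H @ c - y)))  # near machine precision
```

Since the inner weights were arbitrary random draws, every such draw yields a different exact interpolant, which is consistent with the locus of global minima containing infinitely many points.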

