On Connected Sublevel Sets in Deep Learning

01/22/2019
by Quynh Nguyen, et al.

We study sublevel sets of the loss function in training deep neural networks. For linearly independent data, we prove that every sublevel set of the loss is connected and unbounded. We then apply this result to prove similar properties on the loss surface of deep over-parameterized neural nets with piecewise linear activation functions.
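One way to probe the connectivity claim empirically is to train two independent over-parameterized networks (hidden width larger than the number of samples) on linearly independent data and evaluate the loss along the straight segment between the two solutions. This is only a hypothetical sketch, not the paper's method: the data, architecture, widths, and learning rate below are illustrative assumptions, and the theorem does not require the *linear* path to stay low, so a barrier on the segment would not contradict it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N samples in d dimensions with N <= d, so the inputs are
# linearly independent with probability one (illustrative assumption).
N, d = 5, 8
X = rng.normal(size=(N, d))
y = rng.normal(size=(N, 1))

def init(h):
    # One hidden layer of width h > N (over-parameterized), standard scaling.
    return [rng.normal(size=(d, h)) / np.sqrt(d),
            rng.normal(size=(h, 1)) / np.sqrt(h)]

def forward(params, X):
    W1, W2 = params
    Z = X @ W1
    A = np.where(Z > 0, Z, 0.1 * Z)  # leaky ReLU: piecewise linear activation
    return A @ W2

def loss(params):
    return float(np.mean((forward(params, X) - y) ** 2))

def train(params, steps=5000, lr=0.02):
    # Plain full-batch gradient descent on the squared loss.
    W1, W2 = params
    for _ in range(steps):
        Z = X @ W1
        A = np.where(Z > 0, Z, 0.1 * Z)
        out = A @ W2
        g = 2.0 * (out - y) / N                 # dL/d(out)
        gW2 = A.T @ g
        gZ = (g @ W2.T) * np.where(Z > 0, 1.0, 0.1)
        gW1 = X.T @ gZ
        W1 -= lr * gW1
        W2 -= lr * gW2
    return [W1, W2]

# Two independently initialized and trained networks of width 32 > N.
p1 = train(init(32))
p2 = train(init(32))

# Evaluate the loss along the linear interpolation between the two minima.
ts = np.linspace(0.0, 1.0, 21)
path = [loss([(1 - t) * p1[0] + t * p2[0],
              (1 - t) * p1[1] + t * p2[1]]) for t in ts]
print("endpoint losses:", path[0], path[-1])
print("max loss on linear path:", max(path))
```

If both endpoints reach low loss, they lie in the same sublevel set; whether the straight path stays inside that set is exactly the kind of question the connectivity result addresses, since connectedness only guarantees *some* continuous low-loss path exists.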


Related research

- 12/28/2018, "Over-Parameterized Deep Neural Networks Have No Strict Local Minima For Any Continuous Activations"
- 08/03/2020, "Low-loss connection of weight vectors: distribution-based approaches"
- 03/27/2020, "Piecewise linear activations substantially shape the loss surfaces of neural networks"
- 01/21/2021, "A Note on Connectivity of Sublevel Sets in Deep Learning"
- 05/18/2018, "Reconstruction of training samples from loss functions"
- 08/27/2023, "The inverse problem for neural networks"
- 10/07/2018, "Principled Deep Neural Network Training through Linear Programming"
