On the loss landscape of a class of deep neural networks with no bad local valleys

09/27/2018
by Quynh Nguyen, et al.

We identify a class of over-parameterized deep neural networks with standard activation functions and cross-entropy loss which provably have no bad local valleys, in the sense that from any point in parameter space there exists a continuous path on which the cross-entropy loss is non-increasing and gets arbitrarily close to zero. This implies that these networks have no sub-optimal strict local minima.
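To make the statement concrete, below is a minimal NumPy sketch, not the authors' construction: a toy network whose hidden layer is wider than the training set, trained by plain gradient descent on the cross-entropy loss. The theorem asserts only that some continuous non-increasing path to arbitrarily small loss exists from any starting point; gradient descent is merely a convenient heuristic path for visualizing the idea. All sizes, the learning rate, and the ReLU activation are illustrative assumptions, not taken from the paper.

```python
# A toy over-parameterized network (hidden width 32 > 8 training samples).
# We follow gradient descent on the cross-entropy loss and print the loss
# along the resulting path; on this instance it drops toward zero.
import numpy as np

rng = np.random.default_rng(0)
n, d, width, classes = 8, 5, 32, 3          # width > n: over-parameterized
X = rng.standard_normal((n, d))
y = rng.integers(0, classes, size=n)

W1 = 0.5 * rng.standard_normal((d, width))
W2 = 0.5 * rng.standard_normal((width, classes))

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)    # shift logits for stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

lr = 0.2
for step in range(2001):
    H = np.maximum(X @ W1, 0.0)             # hidden ReLU features
    P = softmax(H @ W2)                     # class probabilities
    loss = -np.log(P[np.arange(n), y]).mean()
    if step % 500 == 0:
        print(f"step {step:4d}  cross-entropy {loss:.6f}")
    # gradient of the cross-entropy loss w.r.t. the logits
    G = P.copy()
    G[np.arange(n), y] -= 1.0
    G /= n
    dW2 = H.T @ G                           # backprop through the output layer
    dH = G @ W2.T
    dH[H <= 0.0] = 0.0                      # backprop through the ReLU
    dW1 = X.T @ dH
    W1 -= lr * dW1
    W2 -= lr * dW2
```

Note that a descent path like this can stall in general; the paper's contribution is to identify a network class for which, from every starting point, a non-increasing path to near-zero loss is guaranteed to exist.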


Related research

Over-Parameterized Deep Neural Networks Have No Strict Local Minima For Any Continuous Activations (12/28/2018)
In this paper, we study the loss surface of the over-parameterized fully...

Numerical Exploration of Training Loss Level-Sets in Deep Neural Networks (11/09/2020)
We present a computational method for empirically characterizing the tra...

Revisiting Landscape Analysis in Deep Neural Networks: Eliminating Decreasing Paths to Infinity (12/31/2019)
Traditional landscape analysis of deep neural networks aims to show that...

The Global Landscape of Neural Networks: An Overview (07/02/2020)
One of the major concerns for neural network training is that the non-co...

Sub-Optimal Local Minima Exist for Almost All Over-parameterized Neural Networks (11/04/2019)
Does over-parameterization eliminate sub-optimal local minima for neural...

Adaptively Solving the Local-Minimum Problem for Deep Neural Networks (12/25/2020)
This paper aims to overcome a fundamental problem in the theory and appl...

Embedding Principle: a hierarchical structure of loss landscape of deep neural networks (11/30/2021)
We prove a general Embedding Principle of loss landscape of deep neural ...
