Embedding Principle: a hierarchical structure of loss landscape of deep neural networks

11/30/2021
by Yaoyu Zhang, et al.

We prove a general Embedding Principle of the loss landscape of deep neural networks (NNs) that unravels a hierarchical structure of the loss landscape: the loss landscape of an NN contains all critical points of all narrower NNs. This result is obtained by constructing a class of critical embeddings which map any critical point of a narrower NN to a critical point of the target NN with the same output function. By discovering a wide class of general compatible critical embeddings, we provide a gross estimate of the dimension of the critical submanifolds embedded from critical points of narrower NNs. We further prove an irreversibility property of any critical embedding: the number of negative/zero/positive eigenvalues of the Hessian matrix at a critical point may increase but never decrease as an NN becomes wider through the embedding. Using a special realization of the general compatible critical embedding, we prove a stringent necessary condition for a critical point to be "truly bad", i.e., to never become a strict-saddle point through any critical embedding. This result implies that strict-saddle points are commonplace in wide NNs, which may be an important reason underlying the easy optimization of wide NNs widely observed in practice.
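The simplest critical embedding of this kind can be illustrated by neuron splitting in a two-layer network. The sketch below is a minimal illustration of the idea (my own toy code, not the authors' implementation): one hidden neuron is duplicated, its input weights copied, and its outgoing weight split into fractions alpha and 1 - alpha, so the wider network realizes exactly the same output function.

```python
import numpy as np

def narrow_net(x, W, a):
    # Two-layer tanh network: x (d,), hidden weights W (m, d), output weights a (m,)
    return a @ np.tanh(W @ x)

def split_embedding(W, a, neuron=0, alpha=0.3):
    # Duplicate one hidden neuron's input weights and split its
    # outgoing weight into alpha and (1 - alpha) parts, leaving the
    # output function unchanged.
    W_wide = np.vstack([W, W[neuron:neuron + 1]])
    a_wide = np.concatenate([a, [0.0]])
    a_wide[-1] = (1 - alpha) * a[neuron]
    a_wide[neuron] = alpha * a[neuron]
    return W_wide, a_wide

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))   # narrow net: 3 hidden neurons, 2 inputs
a = rng.standard_normal(3)
W_wide, a_wide = split_embedding(W, a)  # wider net: 4 hidden neurons

x = rng.standard_normal(2)
print(np.isclose(narrow_net(x, W, a), narrow_net(x, W_wide, a_wide)))  # True
```

Because the output function (and hence the loss) is preserved identically under this map, a critical point of the narrow network's loss is carried to a critical point of the wider network's loss, which is the basic mechanism behind the hierarchical structure described above.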


Related research:

- 05/26/2022: Embedding Principle in Depth for the Loss Landscape Analysis of Deep Neural Networks. "Unraveling the general structure underlying the loss landscapes of deep ..."
- 05/08/2020: The critical locus of overparameterized neural networks. "Many aspects of the geometry of loss functions in deep learning remain m..."
- 05/30/2021: Embedding Principle of Loss Landscape of Deep Neural Networks. "Understanding the structure of loss landscape of deep neural networks (D..."
- 09/27/2018: On the loss landscape of a class of deep neural networks with no bad local valleys. "We identify a class of over-parameterized deep neural networks with stan..."
- 02/27/2020: The Landscape of Matrix Factorization Revisited. "We revisit the landscape of the simple matrix factorization problem. For..."
- 06/12/2019: Semi-flat minima and saddle points by embedding neural networks to overparameterization. "We theoretically study the landscape of the training error for neural ne..."
- 12/16/2019: A Deep Neural Network's Loss Surface Contains Every Low-dimensional Pattern. "The work "Loss Landscape Sightseeing with Multi-Point Optimization" (Sko..."
