Exponential expressivity in deep neural networks through transient chaos

06/16/2016
by   Ben Poole, et al.

We combine Riemannian geometry with the mean field theory of high dimensional chaos to study the nature of signal propagation in generic, deep neural networks with random weights. Our results reveal an order-to-chaos expressivity phase transition, with networks in the chaotic phase computing nonlinear functions whose global curvature grows exponentially with depth but not width. We prove this generic class of deep random functions cannot be efficiently computed by any shallow network, going beyond prior work restricted to the analysis of single functions. Moreover, we formalize and quantitatively demonstrate the long conjectured idea that deep networks can disentangle highly curved manifolds in input space into flat manifolds in hidden space. Our theoretical analysis of the expressive power of deep networks broadly applies to arbitrary nonlinearities, and provides a quantitative underpinning for previously abstract notions about the geometry of deep functions.
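The order-to-chaos transition described in the abstract can be illustrated numerically. The sketch below is my own minimal reconstruction of the paper's mean-field setup, not the authors' code: it propagates two nearby inputs through a deep random tanh network with weights drawn as N(0, σ_w²/N) and biases as N(0, σ_b²), and tracks how correlated the two hidden representations remain with depth. The specific parameter values (σ_w = 1.0 vs. 4.0, σ_b = 0.3, width 1000, depth 50) are illustrative choices, not values from the paper.

```python
import numpy as np

def propagate_correlation(sigma_w, sigma_b=0.3, width=1000, depth=50, seed=0):
    """Propagate two nearby inputs through a random tanh network and
    return their cosine similarity after `depth` layers."""
    rng = np.random.default_rng(seed)
    # Two highly correlated inputs (small perturbation of one another)
    x1 = rng.standard_normal(width)
    x2 = x1 + 0.01 * rng.standard_normal(width)
    h1, h2 = x1, x2
    for _ in range(depth):
        # Fresh random weights and biases at each layer (mean-field setting)
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        b = rng.standard_normal(width) * sigma_b
        h1 = np.tanh(W @ h1 + b)
        h2 = np.tanh(W @ h2 + b)
    return float(h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2)))

# Ordered phase (small sigma_w): nearby inputs stay correlated.
print(propagate_correlation(sigma_w=1.0))
# Chaotic phase (large sigma_w): nearby inputs decorrelate with depth,
# reflecting the exponential growth of curvature the paper analyzes.
print(propagate_correlation(sigma_w=4.0))
```

In the chaotic phase the initially tiny perturbation grows layer by layer until the two trajectories settle near a fixed-point correlation well below one, while in the ordered phase the correlation is driven toward one.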


Related research

- 06/22/2018 - On the Spectral Bias of Deep Neural Networks
  It is well known that over-parametrized deep neural networks (DNNs) are ...

- 10/04/2017 - Mean-field theory of input dimensionality reduction in unsupervised deep neural networks
  Deep neural networks as powerful tools are widely used in various domain...

- 11/30/2018 - Measure, Manifold, Learning, and Optimization: A Theory Of Neural Networks
  We present a formal measure-theoretical theory of neural networks (NN) b...

- 06/04/2018 - Universal Statistics of Fisher Information in Deep Neural Networks: Mean Field Approach
  This study analyzes the Fisher information matrix (FIM) by applying mean...

- 08/29/2016 - Why does deep and cheap learning work so well?
  We show how the success of deep learning could depend not only on mathem...

- 04/08/2019 - On the Learnability of Deep Random Networks
  In this paper we study the learnability of deep random networks from bot...

- 05/21/2019 - The Geometry of Deep Networks: Power Diagram Subdivision
  We study the geometry of deep (neural) networks (DNs) with piecewise aff...
