
On the Expressive Power of Deep Neural Networks

by Maithra Raghu, et al.

We propose a new approach to the problem of neural network expressivity, which seeks to characterize how the structural properties of a neural network family affect the functions it can compute. Our approach rests on an interrelated set of expressivity measures, unified by the novel notion of trajectory length, which quantifies how the output of a network changes as the input sweeps along a one-dimensional path. Our findings can be summarized as follows: (1) the complexity of the computed function grows exponentially with depth; (2) not all weights are equal: trained networks are more sensitive to the weights of their lower (initial) layers; (3) regularizing on trajectory length (trajectory regularization) is a simpler alternative to batch normalization, with the same performance.
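The trajectory-length measure described above can be illustrated with a small numerical sketch: sample points along a straight line between two inputs, push the path through a randomly initialized ReLU network, and sum the distances between consecutive outputs. Everything below (function names, the width, the weight variance) is an illustrative choice of mine, not the paper's exact experimental setup; the variance is set above the usual He value so the depth-dependent growth is easy to see.

```python
import numpy as np

def trajectory_length(points):
    # Sum of Euclidean distances between consecutive points on the path.
    return float(np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1)))

def random_relu_layers(depth, width, rng):
    # Random Gaussian weights; variance 4/width is an illustrative choice,
    # larger than He init (2/width), to make trajectory growth pronounced.
    return [rng.normal(0.0, np.sqrt(4.0 / width), (width, width))
            for _ in range(depth)]

def forward(layers, points):
    h = points
    for W in layers:
        h = np.maximum(h @ W.T, 0.0)  # ReLU nonlinearity
    return h

rng = np.random.default_rng(0)
width = 100

# One-dimensional input path: linear interpolation between two random points.
x0, x1 = rng.normal(size=width), rng.normal(size=width)
t = np.linspace(0.0, 1.0, 500)[:, None]
path = (1 - t) * x0 + t * x1

lengths = {}
for depth in (1, 3, 6):
    layers = random_relu_layers(depth, width, rng)
    lengths[depth] = trajectory_length(forward(layers, path))
    print(f"depth {depth}: output trajectory length {lengths[depth]:.1f}")
```

In this regime the output trajectory length typically increases sharply with depth, which is the qualitative behavior the paper formalizes: complexity of the computed function, measured through trajectory length, grows exponentially with depth.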



Survey of Expressivity in Deep Neural Networks

We survey results on neural network expressivity described in "On the Ex...

Trajectory growth lower bounds for random sparse deep ReLU networks

This paper considers the growth in the length of one-dimensional traject...

Robust Large Margin Deep Neural Networks

The generalization error of deep neural networks via their classificatio...

Deep ReLU Networks Preserve Expected Length

Assessing the complexity of functions computed by a neural network helps...

Net-Trim: Convex Pruning of Deep Neural Networks with Performance Guarantee

We introduce and analyze a new technique for model reduction for deep ne...

Eternal Sunshine of the Spotless Net: Selective Forgetting in Deep Neural Networks

We explore the problem of selectively forgetting a particular set of dat...