Deep ReLU Networks Have Surprisingly Few Activation Patterns

06/03/2019
by Boris Hanin, et al.

The success of deep networks has been attributed in part to their expressivity: per parameter, deep networks can approximate a richer class of functions than shallow networks. In ReLU networks, the number of activation patterns is one measure of expressivity, and the maximum number of patterns grows exponentially with depth. However, recent work has shown that the practical expressivity of deep networks (the functions they can learn rather than express) is often far from the theoretical maximum. In this paper, we show that the average number of activation patterns for ReLU networks at initialization is bounded by the total number of neurons raised to the input dimension. We show empirically that this bound, which is independent of depth, is tight both at initialization and during training, even on memorization tasks that should maximize the number of activation patterns. Our work suggests that realizing the full expressivity of deep networks may not be possible in practice, at least with current methods.
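The depth-independent bound is easy to probe numerically. Below is a minimal sketch (not the paper's code) that counts the distinct activation patterns a small randomly initialized ReLU network realizes on a dense sample of inputs and compares the count to (#neurons)^(input dimension); the layer widths, sampling box, and sample size here are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): empirically count the distinct
# ReLU activation patterns a small randomly initialized network realizes
# on sampled inputs, and compare against the depth-independent bound of
# (#neurons)^(input dimension) described in the abstract.
# Layer widths, the sampling box, and the sample count are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """He-style random initialization for a fully connected ReLU net."""
    return [(rng.normal(0.0, np.sqrt(2.0 / m), size=(n, m)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def hidden_activation_bits(params, X):
    """Binary on/off state of every hidden ReLU, for a batch of inputs X."""
    H, bits = X, []
    for W, b in params[:-1]:        # skip the final linear output layer
        pre = H @ W.T + b
        bits.append(pre > 0)        # one bit per hidden neuron per input
        H = np.maximum(pre, 0.0)
    return np.concatenate(bits, axis=1)

input_dim = 2
sizes = [input_dim, 16, 16, 16, 1]   # 48 hidden neurons, depth 3
params = init_mlp(sizes)

# Sample inputs from a box and count distinct patterns that actually occur.
X = rng.uniform(-3.0, 3.0, size=(100_000, input_dim))
observed = len(np.unique(hidden_activation_bits(params, X), axis=0))

n_neurons = sum(sizes[1:-1])
print(f"distinct activation patterns observed: {observed}")
print(f"bound (#neurons)^(input dim) = {n_neurons}^{input_dim} = {n_neurons ** input_dim}")
```

With a setup like this, the observed count should plateau as the sample grows, staying near the polynomial bound rather than the exponential-in-depth worst case.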


Related research

Complexity of Linear Regions in Deep Networks (01/25/2019)
It is well-known that the expressivity of a neural network depends on it...

On the Learnability of Deep Random Networks (04/08/2019)
In this paper we study the learnability of deep random networks from bot...

On the Expected Complexity of Maxout Networks (07/01/2021)
Learning with neural networks relies on the complexity of the representa...

Average Path Length: Sparsification of Nonlinearities Creates Surprisingly Shallow Networks (11/30/2022)
We perform an empirical study of the behaviour of deep networks when pus...

Smooth activations and reproducibility in deep networks (10/20/2020)
Deep networks are gradually penetrating almost every domain in our lives...

Deep ReLU Networks Preserve Expected Length (02/21/2021)
Assessing the complexity of functions computed by a neural network helps...

Synthesizing Irreproducibility in Deep Networks (02/21/2021)
The success and superior performance of deep networks is spreading their...
