Conceptual capacity and effective complexity of neural networks

03/13/2021
by Lech Szymanski, et al.

We propose a complexity measure for a neural network's mapping function based on the diversity of the set of tangent spaces induced by different inputs. Treating each tangent space as a linear PAC concept, we use an entropy-based measure over the resulting bundle of concepts to estimate the conceptual capacity of the network. The theoretical maximal capacity of a ReLU network equals the number of its neurons. In practice, however, due to correlations between neuron activities within the network, the actual capacity can be remarkably small, even for very large networks. Empirical evaluations show that this new measure correlates with the complexity of the mapping function and hence with the generalisation capabilities of the corresponding network. It captures the effective, as opposed to the theoretical, complexity of the network function. We also showcase some uses of the proposed measure for the analysis and comparison of trained neural network models.
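For a ReLU network, the tangent space at an input is fully determined by which units are active, so the diversity of tangent spaces can be illustrated by the entropy of the distribution of activation patterns over sampled inputs. The sketch below is a minimal illustration of this idea with a hypothetical toy network (random weights, 8 neurons), not the authors' exact measure: the entropy in bits is at most the number of neurons, matching the theoretical bound stated in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer ReLU network with hypothetical random weights (8 neurons total).
W1 = rng.normal(size=(4, 2)); b1 = rng.normal(size=4)
W2 = rng.normal(size=(4, 4)); b2 = rng.normal(size=4)

def activation_pattern(x):
    """Binary on/off pattern of all ReLU units at input x.
    This pattern determines the network's local linear (tangent) map."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return tuple((np.concatenate([h1, h2]) > 0).astype(int))

# Sample inputs and count how often each distinct pattern occurs.
X = rng.normal(size=(5000, 2))
patterns = {}
for x in X:
    p = activation_pattern(x)
    patterns[p] = patterns.get(p, 0) + 1

# Empirical entropy (in bits) of the pattern distribution: a rough proxy
# for how many distinct linear "concepts" the network effectively uses.
counts = np.array(list(patterns.values()), dtype=float)
probs = counts / counts.sum()
entropy_bits = -(probs * np.log2(probs)).sum()

n_neurons = 8  # theoretical upper bound on the entropy in bits
print(f"distinct patterns: {len(patterns)}, "
      f"entropy: {entropy_bits:.2f} bits (max {n_neurons})")
```

Because neuron activities are correlated, far fewer than 2^8 patterns actually occur, so the measured entropy typically sits well below the bound, which is the gap between effective and theoretical complexity the abstract describes.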


Related research:

Neural Network Capacity for Multilevel Inputs (07/30/2013)
This paper examines the memory capacity of generalized neural networks. ...

Approximation in shift-invariant spaces with deep ReLU neural networks (05/25/2020)
We construct deep ReLU neural networks to approximate functions in dilat...

Within-layer Diversity Reduces Generalization Gap (06/10/2021)
Neural networks are composed of multiple layers arranged in a hierarchic...

Learning with Hyperspherical Uniformity (03/02/2021)
Due to the over-parameterization nature, neural networks are a powerful ...

Linear Classification of Neural Manifolds with Correlated Variability (11/27/2022)
Understanding how the statistical and geometric properties of neural act...

Learning Capacity: A Measure of the Effective Dimensionality of a Model (05/27/2023)
We exploit a formal correspondence between thermodynamics and inference,...

Neural Network Layer Algebra: A Framework to Measure Capacity and Compression in Deep Learning (07/02/2021)
We present a new framework to measure the intrinsic properties of (deep)...
