Complexity of Feed-Forward Neural Networks from the Perspective of Functional Equivalence

05/19/2023
by Guohao Shen, et al.

In this paper, we investigate the complexity of feed-forward neural networks by examining the concept of functional equivalence, which suggests that different network parameterizations can lead to the same function. We utilize the permutation invariance property to derive a novel covering number bound for the class of feed-forward neural networks, which reveals that the complexity of a neural network can be reduced by exploiting this property. Furthermore, based on the symmetric structure of the parameter space, we demonstrate that an appropriate random parameter initialization strategy can increase the probability that optimization converges. We also find that overparameterized networks tend to be easier to train, in the sense that increasing the width of a neural network causes the volume of its effective parameter space to vanish. Our findings offer new insights into overparameterization and have significant implications for understanding generalization and optimization in deep learning.
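The permutation invariance the abstract refers to can be seen concretely: permuting the hidden units of a layer, together with the corresponding rows and columns of the adjacent weight matrices, yields a different parameterization of the same function. Below is a minimal sketch (not taken from the paper) for a one-hidden-layer ReLU network; the dimensions and random parameters are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, width = 5, 8  # input dimension and hidden width (arbitrary)

# Original parameters of f(x) = W2 @ relu(W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(width, d)), rng.normal(size=width)
W2, b2 = rng.normal(size=(1, width)), rng.normal(size=1)

relu = lambda z: np.maximum(z, 0.0)
f = lambda x, W1, b1, W2, b2: W2 @ relu(W1 @ x + b1) + b2

# Apply the same random permutation of hidden units to both layers:
# permute the rows of (W1, b1) and the columns of W2.
perm = rng.permutation(width)
W1_p, b1_p = W1[perm], b1[perm]
W2_p = W2[:, perm]

# The permuted parameterization computes exactly the same function.
x = rng.normal(size=d)
assert np.allclose(f(x, W1, b1, W2, b2), f(x, W1_p, b1_p, W2_p, b2))
print("Permuted network agrees with the original on this input.")
```

Because every permutation of the hidden units gives a functionally equivalent parameter vector, counting networks up to this symmetry shrinks the effective parameter space, which is the intuition behind the covering number reduction described above.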


