Points of non-linearity of functions generated by random neural networks

04/19/2023
by David Holmes, et al.

We consider functions from the real numbers to the real numbers, output by a neural network with one hidden activation layer, arbitrary width, and ReLU activation function. We assume that the parameters of the neural network are chosen at random according to various probability distributions, and compute the resulting distribution of the points of non-linearity. We use these results to explain why the network may be biased towards outputting functions with simpler geometry, and why certain functions with low information-theoretic complexity are nonetheless hard for a neural network to approximate.
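To make the object of study concrete: a one-hidden-layer ReLU network computes f(x) = sum_j c_j * relu(w_j * x + b_j), which is piecewise linear, and each hidden unit j contributes a point of non-linearity (a "kink") at x = -b_j / w_j, where its pre-activation changes sign. The sketch below (an illustration under randomly drawn parameters, not the paper's construction; all names are ours) samples such a network and numerically verifies the slope jump of |c_j * w_j| at each kink.

```python
import numpy as np

rng = np.random.default_rng(0)
width = 8

# Random parameters of f(x) = sum_j c_j * relu(w_j * x + b_j).
w = rng.uniform(-1, 1, width)
b = rng.uniform(-1, 1, width)
c = rng.uniform(-1, 1, width)

def f(x):
    return np.sum(c * np.maximum(w * x + b, 0.0))

# Unit j is non-linear exactly where its pre-activation vanishes: x = -b_j / w_j.
kinks = -b / w
print(np.sort(kinks))

# Numerical check: across each kink, the slope of f jumps by |c_j * w_j|.
eps = 1e-6
for j in range(width):
    x0 = kinks[j]
    left = (f(x0) - f(x0 - eps)) / eps
    right = (f(x0 + eps) - f(x0)) / eps
    assert abs(abs(right - left) - abs(c[j] * w[j])) < 1e-3
```

The distribution of the kink locations -b_j / w_j is then determined by the distributions chosen for the weights and biases, which is the quantity the abstract refers to.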


Related Research

- 06/19/2020: No one-hidden-layer neural network can represent multivariable functions. "In a function approximation with a neural network, an input dataset is m..."
- 02/10/2020: On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks. "In this paper, we have extended the well-established universal approxima..."
- 01/11/2022: Deep Neural Network Approximation For Hölder Functions. "In this work, we explore the approximation capability of deep Rectified ..."
- 12/03/2022: Probabilistic Verification of ReLU Neural Networks via Characteristic Functions. "Verifying the input-output relationships of a neural network so as to ac..."
- 05/24/2022: Taming the sign problem of explicitly antisymmetrized neural networks via rough activation functions. "Explicit antisymmetrization of a two-layer neural network is a potential..."
- 09/30/2018: Deep, Skinny Neural Networks are not Universal Approximators. "In order to choose a neural network architecture that will be effective ..."
- 11/30/2021: Approximate Spectral Decomposition of Fisher Information Matrix for Simple ReLU Networks. "We investigate the Fisher information matrix (FIM) of one hidden layer n..."
