Neural network integral representations with the ReLU activation function

10/07/2019
by Anton Dereventsov, et al.

We derive a formula for neural network integral representations on the sphere with the ReLU activation function, under the assumption that the outer weights have finite L_1 norm with respect to the Lebesgue measure on the sphere. In the one-dimensional case, we further characterize all such representations via a closed-form formula. Additionally, in this case our formula allows one to explicitly compute the neural network representation of least L_1 norm for a given function.
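For orientation, the one-dimensional setting the abstract refers to admits a classical integral representation: for a twice continuously differentiable f on [0, 1], Taylor's theorem with integral remainder gives f(x) = f(0) + f'(0) x + \int_0^1 ReLU(x - t) f''(t) dt, so f'' plays the role of the outer-weight density and its L_1 norm is the quantity assumed finite. The Python sketch below checks this identity numerically; it is an illustration under these assumptions, not the paper's construction, and all names in it are illustrative.

```python
import numpy as np

# Sketch (not the paper's construction): the classical 1D ReLU integral
# representation, valid for twice continuously differentiable f on [0, 1]:
#     f(x) = f(0) + f'(0) * x + \int_0^1 ReLU(x - t) f''(t) dt,
# a consequence of Taylor's theorem with integral remainder. The outer
# weight is f''(t); its L_1 norm \int_0^1 |f''(t)| dt is the quantity
# the abstract assumes finite.

def relu(z):
    return np.maximum(z, 0.0)

f = np.sin                      # example target function
df0 = np.cos(0.0)               # f'(0)
d2f = lambda t: -np.sin(t)      # f''(t): the outer-weight density

ts = np.linspace(0.0, 1.0, 10_001)  # quadrature nodes ("neuron" biases)
dt = ts[1] - ts[0]

for x in np.linspace(0.0, 1.0, 6):
    integral = np.sum(relu(x - ts) * d2f(ts)) * dt  # Riemann sum
    approx = f(0.0) + df0 * x + integral
    print(f"x={x:.2f}  f(x)={f(x):+.6f}  representation={approx:+.6f}")

# L_1 norm of the outer weight:
print("||f''||_L1 ~", np.sum(np.abs(d2f(ts))) * dt)
```

Discretizing the quadrature nodes in this way is exactly how an integral representation degenerates into a finite-width shallow ReLU network, which is the connection between such formulas and network approximation.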


Related research

12/20/2021
Integral representations of shallow neural network with Rectified Power Unit activation function
In this effort, we derive a formula for the integral representation of a...

06/23/2022
Hausdorff Distance between Norm Balls and their Linear Maps
We consider the problem of computing the (two-sided) Hausdorff distance ...

06/25/2020
Q-NET: A Formula for Numerical Integration of a Shallow Feed-forward Neural Network
Numerical integration is a computational procedure that is widely encoun...

07/23/2020
Nonclosedness of the Set of Neural Networks in Sobolev Space
We examine the closedness of the set of realized neural networks of a fi...

11/15/2022
Characterizing the Spectrum of the NTK via a Power Series Expansion
Under mild conditions on the network initialization we derive a power se...

05/24/2019
Greedy Shallow Networks: A New Approach for Constructing and Training Neural Networks
We present a novel greedy approach to obtain a single layer neural netwo...

10/05/2022
Rediscovery of Numerical Lüscher's Formula from the Neural Network
We present that by predicting the spectrum in discrete space from the ph...
