On bounds for norms of reparameterized ReLU artificial neural network parameters: sums of fractional powers of the Lipschitz norm control the network parameter vector

06/27/2022
by Arnulf Jentzen et al.

It is an elementary fact in the scientific literature that the Lipschitz norm of the realization function of a feedforward fully-connected rectified linear unit (ReLU) artificial neural network (ANN) can, up to a multiplicative constant, be bounded from above by sums of powers of the norm of the ANN parameter vector. Roughly speaking, in this work we show that, in the case of shallow ANNs, the converse inequality also holds. More formally, we prove that the norm of the equivalence class of ANN parameter vectors with the same realization function is, up to a multiplicative constant, bounded from above by the sum of powers of the Lipschitz norm of the ANN realization function (with the exponents 1/2 and 1). Moreover, we prove that this upper bound holds only when employing the Lipschitz norm; it holds neither for Hölder norms nor for Sobolev-Slobodeckij norms. Furthermore, we prove that this upper bound requires the sum of the two powers of the Lipschitz norm with the exponents 1/2 and 1 and fails for the Lipschitz norm alone.
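To fix ideas, here is a schematic sketch of the two directions in assumed notation: the shallow architecture below, the constant C, and the convention that the Lipschitz norm combines |f(0)| with the Lipschitz seminorm are illustrative choices, not taken from the paper. For a shallow ReLU ANN

\[
  \mathcal{N}_\theta(x) \;=\; c + \sum_{i=1}^{h} v_i \max\{\langle w_i, x \rangle + b_i,\, 0\},
  \qquad \theta = (w_1, \dots, w_h,\, b_1, \dots, b_h,\, v_1, \dots, v_h,\, c),
\]

the elementary direction reads

\[
  \| \mathcal{N}_\theta \|_{\mathrm{Lip}} \;\le\; C \bigl( \|\theta\| + \|\theta\|^2 \bigr),
\]

while the converse established in this work bounds the minimal-norm representative of the equivalence class:

\[
  \inf_{\vartheta \colon\, \mathcal{N}_\vartheta = \mathcal{N}_\theta} \|\vartheta\|
  \;\le\; C \bigl( \| \mathcal{N}_\theta \|_{\mathrm{Lip}}^{1/2} + \| \mathcal{N}_\theta \|_{\mathrm{Lip}} \bigr).
\]

The exponent 1/2 is natural here because the ReLU is positively homogeneous, \(\max\{\lambda r, 0\} = \lambda \max\{r, 0\}\) for \(\lambda > 0\): replacing \((w_i, b_i, v_i)\) by \((\lambda_i w_i, \lambda_i b_i, v_i / \lambda_i)\) leaves the realization function unchanged, and balancing each neuron in this way reduces its contribution to the parameter norm to roughly the square root of a Lipschitz-type quantity.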


Related research

07/13/2022
Normalized gradient flow optimization in the training of ReLU artificial neural networks
The training of artificial neural networks (ANNs) is nowadays a highly r...

03/06/2023
Expressivity of Shallow and Deep Neural Networks for Polynomial Approximation
We analyze the number of neurons that a ReLU neural network needs to app...

03/02/2020
Exactly Computing the Local Lipschitz Constant of ReLU Networks
The Lipschitz constant of a neural network is a useful metric for provab...

08/14/2020
Analytical bounds on the local Lipschitz constants of affine-ReLU functions
In this paper, we determine analytical bounds on the local Lipschitz con...

07/20/2021
An Embedding of ReLU Networks and an Analysis of their Identifiability
Neural networks with the Rectified Linear Unit (ReLU) nonlinearity are d...

05/24/2021
Coercivity, essential norms, and the Galerkin method for second-kind integral equations on polyhedral and Lipschitz domains
It is well known that, with a particular choice of norm, the classical d...

12/27/2021
Sparsest Univariate Learning Models Under Lipschitz Constraint
Beside the minimization of the prediction error, two of the most desirab...
