A multivariate Riesz basis of ReLU neural networks

02/28/2023
by Cornelia Schneider, et al.

We consider the trigonometric-like system of piecewise linear functions introduced recently by Daubechies, DeVore, Foucart, Hanin, and Petrova. We provide an alternative proof, based on the Gershgorin circle theorem, that this system forms a Riesz basis of L_2([0,1]). We also generalize this system to higher dimensions d > 1 via a construction that avoids (tensor) products. As a consequence, the functions from the new Riesz basis of L_2([0,1]^d) can be easily represented by neural networks. Moreover, the Riesz constants of this system are independent of d, making it an attractive building block for future multivariate analysis of neural networks.
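For intuition: the univariate system is built from piecewise linear oscillations obtained by composing the standard hat function with itself, and each such composition is represented exactly by a ReLU network; the Riesz bounds then amount to two-sided spectral bounds on the Gram matrix of the system, which is where a tool like the Gershgorin circle theorem can be brought to bear. The sketch below is our own illustration, not the authors' construction: it assumes the standard hat function H (the helper names `hat` and `sawtooth` are hypothetical) and shows how a depth-k ReLU network realizes the k-fold composition H ∘ ... ∘ H exactly.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Exact one-hidden-layer ReLU representation of the hat function:
    # H(x) = 2x on [0, 1/2], 2(1 - x) on [1/2, 1], and 0 elsewhere.
    # Three ReLU neurons suffice (illustrative sketch, not the paper's
    # multivariate construction, which is univariate-free of tensor products).
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def sawtooth(x, k):
    # k-fold self-composition H o ... o H: a piecewise linear
    # "trigonometric-like" oscillation with 2**(k - 1) teeth on [0, 1],
    # realized by a ReLU network of depth k (one hat layer per composition).
    y = np.asarray(x, dtype=float)
    for _ in range(k):
        y = hat(y)
    return y

if __name__ == "__main__":
    xs = np.linspace(0.0, 1.0, 9)
    print(sawtooth(xs, 1))  # single hat: peak 1 at x = 1/2
    print(sawtooth(xs, 2))  # two teeth: peaks at x = 1/4 and x = 3/4
```

Running the script prints the single hat and the two-tooth sawtooth on a uniform grid; per the abstract, the multivariate basis functions of the paper admit analogous explicit ReLU representations without resorting to tensor products.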


Related research

- 01/24/2023: How Jellyfish Characterise Alternating Group Equivariant Neural Networks. "We provide a full characterisation of all of the possible alternating gr..."
- 06/27/2019: Error bounds for deep ReLU networks using the Kolmogorov--Arnold superposition theorem. "We prove a theorem concerning the approximation of multivariate continuo..."
- 03/10/2022: Stable Parametrization of Continuous and Piecewise-Linear Functions. "Rectified-linear-unit (ReLU) neural networks, which play a prominent rol..."
- 05/23/2022: Decoupling multivariate functions using a nonparametric filtered tensor decomposition. "Multivariate functions emerge naturally in a wide variety of data-driven..."
- 07/15/2023: Graph Automorphism Group Equivariant Neural Networks. "For any graph G having n vertices and its automorphism group Aut(G), we ..."
- 05/16/2023: Unwrapping All ReLU Networks. "Deep ReLU Networks can be decomposed into a collection of linear models,..."
- 03/06/2019: Positively Scale-Invariant Flatness of ReLU Neural Networks. "It was empirically confirmed by Keskar et al. [SharpMinima] that flatter mi..."
