Approximation speed of quantized vs. unquantized ReLU neural networks and beyond

05/24/2022
by Antoine Gonon, et al.

We consider general approximation families that encompass ReLU neural networks. On the one hand, we introduce a new property, which we call ∞-encodability, providing a framework that we use (i) to guarantee that ReLU networks can be uniformly quantized while retaining an approximation speed comparable to that of unquantized ones, and (ii) to prove that ReLU networks share a common limitation with many other approximation families: the approximation speed of a set C is bounded from above by an encoding complexity of C (a complexity that is well known for many sets C). The property of ∞-encodability allows us to unify and generalize known results in which it was used implicitly. On the other hand, we give lower and upper bounds on the Lipschitz constant of the mapping that associates the weights of a network with the function it represents in L^p. These bounds are expressed in terms of the width and depth of the network and a bound on the norm of its weights, and they build on well-known upper bounds on the Lipschitz constants of the functions represented by ReLU networks. This allows us to recover known results, establish new bounds on covering numbers, and characterize the accuracy of naive uniform quantization of ReLU networks.
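As a concrete illustration of the last point, the sketch below is a minimal example, not the paper's implementation: the network, the helper names relu_net and quantize, and all hyperparameters are assumptions made for illustration. It applies naive uniform quantization to the weights of a small fully connected ReLU network and checks empirically that the function-space error shrinks with the quantization step, the behavior a Lipschitz bound on the weights-to-function map would predict.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu_net(params, x):
    # Evaluate a fully connected ReLU network on a batch of column vectors x.
    h = x
    for W, b in params[:-1]:
        h = np.maximum(W @ h + b[:, None], 0.0)  # hidden layers: affine then ReLU
    W, b = params[-1]
    return W @ h + b[:, None]                    # linear output layer

def quantize(params, step):
    # Naive uniform quantization: round every weight and bias to the grid step * Z.
    return [(np.round(W / step) * step, np.round(b / step) * step)
            for W, b in params]

# Hypothetical small network: input dimension 4, two hidden layers of width 32.
widths = [4, 32, 32, 1]
params = [(rng.standard_normal((m, n)) / np.sqrt(n), rng.standard_normal(m))
          for n, m in zip(widths[:-1], widths[1:])]

# Sample points from [-1, 1]^4 to estimate the sup-norm error on the domain.
x = rng.uniform(-1.0, 1.0, size=(4, 10_000))

for step in (1e-1, 1e-2, 1e-3):
    err = np.max(np.abs(relu_net(params, x) - relu_net(quantize(params, step), x)))
    print(f"step {step:g}: empirical sup-norm error {err:.2e}")
```

Shrinking the step by a factor of ten should shrink the printed error by roughly the same factor: a uniform Lipschitz bound on the weights-to-function map turns a sup-norm perturbation of the weights directly into a proportional perturbation of the realized function.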


Related research

02/10/2018 · On the Universal Approximability of Quantized ReLU Neural Networks
Compression is a key step to deploy large neural networks on resource-co...

05/25/2023 · Data Topology-Dependent Upper Bounds of Neural Network Widths
This paper investigates the relationship between the universal approxima...

02/24/2023 · Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes
We prove that the set of functions representable by ReLU neural networks...

04/29/2021 · Analytical bounds on the local Lipschitz constants of ReLU networks
In this paper, we determine analytical upper bounds on the local Lipschi...

08/14/2020 · Analytical bounds on the local Lipschitz constants of affine-ReLU functions
In this paper, we determine analytical bounds on the local Lipschitz con...

07/01/2020 · The Restricted Isometry of ReLU Networks: Generalization through Norm Concentration
While regression tasks aim at interpolating a relation on the entire inp...

02/21/2021 · Deep ReLU Networks Preserve Expected Length
Assessing the complexity of functions computed by a neural network helps...
