Approximation Rates for Neural Networks with Encodable Weights in Smoothness Spaces

06/30/2020
by   Ingo Gühring, et al.

We examine the necessary and sufficient complexity of neural networks to approximate functions from different smoothness spaces under the restriction of encodable network weights. Based on an entropy argument, we start by proving lower bounds on the number of nonzero encodable weights needed for neural network approximation in Besov spaces, Sobolev spaces, and others. These results hold for most practically used (and sufficiently smooth) activation functions. Afterwards, we derive almost optimal upper bounds for ELU neural networks in Sobolev norms up to order two. This work advances the theory of approximating solutions of partial differential equations by neural networks.
