Function approximation by deep networks

05/30/2019
by H. N. Mhaskar et al.

We show that deep networks are better than shallow networks at approximating functions that can be expressed as a composition of functions described by a directed acyclic graph, because the deep networks can be designed to have the same compositional structure, while a shallow network cannot exploit this knowledge. Thus, the blessing of compositionality mitigates the curse of dimensionality. On the other hand, a theorem called good propagation of errors allows one to 'lift' theorems about shallow networks to theorems about deep networks with an appropriate choice of norms, smoothness, etc. We illustrate this in three contexts, where each channel in the deep network calculates a spherical polynomial, a non-smooth ReLU network, or another zonal function network closely related to the ReLU network.
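To make the compositionality argument concrete, here is a minimal sketch (the toy target, the random-feature channels, and all names below are our own illustration, not constructions from the paper): a target function on R^4 built as a binary-tree composition of bivariate functions, and a deep network that mirrors that tree, so each channel only ever has to solve a 2-dimensional approximation problem. A shallow network approximating the same target must take all four variables at once, which is where the curse of dimensionality enters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy compositional target on R^4, structured as a binary tree (a DAG):
#     f(x) = h( g1(x1, x2), g2(x3, x4) )
# Every constituent function is bivariate, so a deep network that mirrors
# the tree only ever has to approximate 2-dimensional functions.
g1 = lambda a, b: np.tanh(a + b)
g2 = lambda a, b: a * b
h  = lambda u, v: np.sin(u) + v

def make_channel(width, in_dim):
    """One 'channel': a shallow random-feature net x -> c . tanh(W x + b),
    with the readout coefficients c fitted by least squares."""
    W = rng.normal(size=(width, in_dim))
    b = rng.normal(size=width)
    feats = lambda X: np.tanh(X @ W.T + b)          # shape (n, width)
    def fit(X, y):
        c, *_ = np.linalg.lstsq(feats(X), y, rcond=None)
        return lambda Z: feats(Z) @ c
    return fit

# Fit each channel on samples of *its own* 2-D constituent function.
n, width = 2000, 64
U = rng.uniform(-1, 1, size=(n, 2))
c1 = make_channel(width, 2)(U, g1(U[:, 0], U[:, 1]))
c2 = make_channel(width, 2)(U, g2(U[:, 0], U[:, 1]))
V = np.column_stack([g1(U[:, 0], U[:, 1]), g2(U[:, 0], U[:, 1])])
c3 = make_channel(width, 2)(V, h(V[:, 0], V[:, 1]))

def deep_net(X):
    """Deep network wired like the DAG of the target function."""
    u = c1(X[:, 0:2])                    # channel approximating g1(x1, x2)
    v = c2(X[:, 2:4])                    # channel approximating g2(x3, x4)
    return c3(np.column_stack([u, v]))   # channel approximating h(u, v)

# Check on fresh 4-D inputs: each channel solved only a 2-D problem, yet
# the composed network approximates the full 4-D target.
X = rng.uniform(-1, 1, size=(500, 4))
y = h(g1(X[:, 0], X[:, 1]), g2(X[:, 2], X[:, 3]))
print("max abs error:", np.max(np.abs(deep_net(X) - y)))
```

The least-squares readouts here merely stand in for whatever construction fits each channel; the point is the wiring. The error of the composed network is governed by the low-dimensional channel errors, which is in the spirit of the 'good propagation of errors' theorem mentioned above.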


