Deep neural network approximation of composite functions without the curse of dimensionality

04/12/2023
by Adrian Riekert, et al.

In this article we identify a general class of high-dimensional continuous functions that can be approximated by deep neural networks (DNNs) with the rectified linear unit (ReLU) activation without the curse of dimensionality. In other words, the number of DNN parameters grows at most polynomially in both the input dimension and the reciprocal of the approximation accuracy. The functions in our class can be expressed as a potentially unbounded number of compositions of special functions, which include products, maxima, and certain parallelized Lipschitz continuous functions.
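To illustrate why maxima are a natural building block here, a standard identity (not specific to this paper's construction) shows that the maximum of two numbers is exactly representable by a shallow ReLU network, via max(x, y) = (x + y)/2 + |x − y|/2 with |z| = ReLU(z) + ReLU(−z). A minimal sketch, with the function name `max_via_relu` chosen for illustration:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def max_via_relu(x, y):
    # Exact two-layer ReLU representation of the maximum:
    #   max(x, y) = (x + y)/2 + |x - y|/2,  where |z| = relu(z) + relu(-z).
    return 0.5 * (x + y) + 0.5 * (relu(x - y) + relu(y - x))

# Quick check on random inputs
x, y = np.random.randn(5), np.random.randn(5)
assert np.allclose(max_via_relu(x, y), np.maximum(x, y))
```

Products, by contrast, cannot be represented exactly by ReLU networks and must be approximated, which is where the size/accuracy trade-off quantified in the paper enters.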
