On the Optimal Expressive Power of ReLU DNNs and Its Application in Approximation with Kolmogorov Superposition Theorem

08/10/2023
by Juncai He, et al.

This paper studies the optimal expressive power of ReLU deep neural networks (DNNs) and its application to approximation via the Kolmogorov Superposition Theorem. We first constructively prove that any continuous piecewise linear function on [0,1] comprising O(N^2 L) segments can be represented by a ReLU DNN with L hidden layers and N neurons per layer. We then show that this construction is optimal with respect to the parameter count of the DNN, which we establish by investigating the shattering capacity of ReLU DNNs. Moreover, by invoking the Kolmogorov Superposition Theorem, we obtain an improved approximation rate for ReLU DNNs of arbitrary width and depth when approximating continuous functions in high-dimensional spaces.
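To make the representation claim concrete, the sketch below illustrates the classical shallow baseline that the paper's depth-dependent construction improves upon: a continuous piecewise linear (CPwL) function on [0,1] with N interior breakpoints is represented exactly by a single hidden layer of N+1 ReLU neurons. This is a standard fact, not the paper's O(N^2 L) construction; the function name shallow_relu_from_cpwl and the example breakpoints and slopes are illustrative choices.

```python
# Minimal sketch (assumption: standard shallow ReLU representation, not the paper's construction).
# A CPwL function f on [0, 1] with interior breakpoints b_1 < ... < b_N can be written as
#     f(x) = f(0) + sum_i c_i * ReLU(x - b_i),  with b_0 = 0,
# where the coefficients c_i are the slope increments between consecutive pieces.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def shallow_relu_from_cpwl(breakpoints, slopes, f0):
    """One-hidden-layer ReLU net that matches a CPwL function on [0, 1].

    breakpoints: interior breakpoints 0 < b_1 < ... < b_N < 1
    slopes:      slope of the target on each of the N+1 linear pieces
    f0:          value of the target at x = 0
    """
    b = np.concatenate(([0.0], np.asarray(breakpoints, dtype=float)))
    s = np.asarray(slopes, dtype=float)
    c = np.diff(np.concatenate(([0.0], s)))  # slope increment handled by each neuron
    def net(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        return f0 + relu(x[:, None] - b[None, :]) @ c
    return net

if __name__ == "__main__":
    # Example target: slopes 2, -1, 3 on [0, 0.3], [0.3, 0.7], [0.7, 1], with f(0) = 0.
    net = shallow_relu_from_cpwl([0.3, 0.7], [2.0, -1.0, 3.0], f0=0.0)
    xs = np.linspace(0.0, 1.0, 11)
    print(np.round(net(xs), 4))  # exact values of the CPwL target at the sample points
```

With width alone, N+1 neurons yield on the order of N segments; the paper's result is that depth raises this to O(N^2 L) segments for a width-N, depth-L network, which is what drives the improved approximation rate.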


