On the approximation of rough functions with deep neural networks

12/13/2019
by Tim De Ryck, et al.

Deep neural networks and the ENO procedure are both efficient frameworks for approximating rough functions. We prove that, at any order, the ENO interpolation procedure can be cast as a deep ReLU neural network. This surprising fact enables the transfer of several desirable properties of the ENO procedure to deep neural networks, including its high-order accuracy in approximating Lipschitz functions. Numerical tests for the resulting neural networks show excellent performance at approximating solutions of nonlinear conservation laws and at data compression.
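To make the connection concrete, below is a minimal illustrative sketch (ours, not the paper's construction) of the mechanism being recast: on each cell, the ENO interpolant is built from one of two candidate quadratic stencils, selected by comparing absolute second differences, and those absolute values are exactly representable with ReLU units via |x| = ReLU(x) + ReLU(-x). The function names, the uniform grid, and the second-degree setting are simplifying assumptions for illustration; the paper's result covers the full procedure at any order.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def abs_via_relu(x):
    # |x| = ReLU(x) + ReLU(-x): the absolute values entering the ENO
    # stencil-selection rule are exactly representable with ReLU units.
    return relu(x) + relu(-x)

def eno_quadratic_interpolate(x_nodes, f_nodes, x):
    """Quadratic ENO interpolation of samples (x_nodes, f_nodes) at points x.

    On each cell [x_i, x_{i+1}] two candidate quadratic stencils are
    considered, {i-1, i, i+1} and {i, i+1, i+2}; the one with the smaller
    absolute second difference is kept, so the interpolant avoids crossing
    sharp features where possible. Assumes a uniform grid.
    """
    x = np.atleast_1d(np.asarray(x, dtype=float))
    h = x_nodes[1] - x_nodes[0]

    # Cell index i such that x lies in [x_nodes[i], x_nodes[i+1]].
    i = np.searchsorted(x_nodes, x, side="right") - 1
    i = np.clip(i, 1, len(x_nodes) - 3)
    t = (x - x_nodes[i]) / h

    # Undivided second differences of the two candidate stencils.
    d_left = f_nodes[i + 1] - 2.0 * f_nodes[i] + f_nodes[i - 1]
    d_right = f_nodes[i + 2] - 2.0 * f_nodes[i + 1] + f_nodes[i]

    # ENO selection: keep the stencil with the smaller absolute difference.
    use_left = abs_via_relu(d_left) <= abs_via_relu(d_right)
    d = np.where(use_left, d_left, d_right)

    # Newton form of the selected quadratic interpolant on the cell.
    return f_nodes[i] + t * (f_nodes[i + 1] - f_nodes[i]) + 0.5 * t * (t - 1.0) * d

if __name__ == "__main__":
    xs = np.linspace(0.0, 1.0, 11)
    fs = np.where(xs > 0.55, 1.0, 0.0)   # a "rough" (discontinuous) function
    print(eno_quadratic_interpolate(xs, fs, [0.42, 0.63]))
```

Running the script evaluates the interpolant on both sides of a jump: the stencil choice switches sides across the discontinuity, so the values stay at 0 and 1 without the overshoot a fixed centered stencil would produce. The sketch writes the selection as an ordinary comparison; the paper shows that the whole data-dependent interpolation procedure can itself be cast as a deep ReLU network.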

Related research

07/18/2022 - wPINNs: Weak Physics informed neural networks for approximating entropy solutions of hyperbolic conservation laws
Physics informed neural networks (PINNs) require regularity of solutions...

01/19/2022 - Stability of Deep Neural Networks via discrete rough paths
Using rough path techniques, we provide a priori estimates for the outpu...

11/06/2019 - Neural Network Processing Neural Networks: An efficient way to learn higher order functions
Functions are rich in meaning and can be interpreted in a variety of way...

09/24/2015 - Provable approximation properties for deep neural networks
We discuss approximation of functions using deep neural nets. Given a fu...

07/19/2022 - Approximation Power of Deep Neural Networks: an explanatory mathematical survey
The goal of this survey is to present an explanatory review of the appro...

08/18/2023 - On the Approximation of Bi-Lipschitz Maps by Invertible Neural Networks
Invertible neural networks (INNs) represent an important class of deep n...
