Error analysis for deep neural network approximations of parametric hyperbolic conservation laws

07/15/2022
by Tim De Ryck, et al.

We derive rigorous bounds on the error incurred when approximating the solution of parametric hyperbolic scalar conservation laws with ReLU neural networks. We show that the approximation error can be made as small as desired with ReLU neural networks that overcome the curse of dimensionality. In addition, we provide an explicit upper bound on the generalization error in terms of the training error, the number of training samples, and the neural network size. The theoretical results are illustrated by numerical experiments.
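To make the setting concrete, the sketch below trains a plain ReLU network on a parametric Riemann problem for Burgers' equation and compares the training error with the error on fresh samples. This is an illustrative assumption, not the construction analyzed in the paper: the choice of equation, the architecture, and all training hyperparameters are made up for this example only.

# A minimal sketch (not the paper's exact construction): approximate the
# solution map of a parametric Riemann problem for Burgers' equation,
# u_t + (u^2/2)_x = 0, whose entropy solution for u_L > u_R is a shock
# moving with speed s = (u_L + u_R)/2. The network input is (x, t, u_L, u_R).
import torch
import torch.nn as nn

def exact_solution(x, t, uL, uR):
    """Entropy solution of the Riemann problem for Burgers (shock case uL > uR)."""
    s = 0.5 * (uL + uR)                 # Rankine-Hugoniot shock speed
    return torch.where(x < s * t, uL, uR)

def sample(n):
    x  = torch.rand(n, 1) * 2 - 1       # x in [-1, 1]
    t  = torch.rand(n, 1)               # t in [0, 1]
    uL = torch.rand(n, 1) * 0.5 + 0.5   # uL in [0.5, 1.0]
    uR = torch.rand(n, 1) * 0.5 - 0.5   # uR in [-0.5, 0.0], so uL > uR holds
    inputs = torch.cat([x, t, uL, uR], dim=1)
    return inputs, exact_solution(x, t, uL, uR)

# A plain ReLU network, the class of approximants the paper studies;
# width and depth here are arbitrary illustrative choices.
model = nn.Sequential(
    nn.Linear(4, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

X_train, y_train = sample(4096)
X_test,  y_test  = sample(4096)         # fresh samples to estimate generalization
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(X_train), y_train)
    loss.backward()
    opt.step()

with torch.no_grad():
    train_err = loss_fn(model(X_train), y_train).item()
    test_err  = loss_fn(model(X_test),  y_test).item()
print(f"training error: {train_err:.4e}, held-out error: {test_err:.4e}")

In this setup the gap between the held-out error and the training error plays the role of the generalization error: bounds of the kind stated in the abstract control this gap in terms of the training error, the number of training samples, and the network size.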


