A Theoretical Analysis of Deep Neural Networks and Parametric PDEs

03/31/2019
by Gitta Kutyniok, et al.

We derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric partial differential equations. In particular, without requiring any knowledge of its concrete shape, we exploit the inherent low-dimensionality of the solution manifold to obtain approximation rates significantly superior to those provided by classical approximation results. This low-dimensionality guarantees the existence of a reduced basis. Then, for a large variety of parametric partial differential equations, we construct neural networks that approximate the parametric maps without suffering from the curse of dimensionality, with complexity depending essentially only on the size of the reduced basis.
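The low-dimensionality argument can be made concrete with a small numerical sketch. This is not the paper's construction: the parametric family u(x; mu) = exp(-mu*x), the grid sizes, and the use of proper orthogonal decomposition (SVD of a snapshot matrix) are illustrative assumptions. The point is only that the singular values of the snapshot matrix decay rapidly, so a few basis vectors span the solution manifold well, which is what makes a small reduced basis possible.

```python
import numpy as np

# Hypothetical parametric family (illustrative only, not from the paper):
# u(x; mu) = exp(-mu * x), with mu the PDE parameter.
x = np.linspace(0.0, 1.0, 200)          # spatial grid
mus = np.linspace(1.0, 2.0, 50)         # parameter samples
snapshots = np.exp(-np.outer(x, mus))   # column j is the solution u(.; mus[j])

# Proper orthogonal decomposition: the left singular vectors of the
# snapshot matrix form an orthonormal reduced basis.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)

# Rapid singular value decay => a handful of basis vectors capture the
# solution manifold, i.e. the manifold is effectively low-dimensional.
print(s[:6] / s[0])
reduced_basis = U[:, :4]                # a 4-dimensional reduced basis
```

A neural network approximating the parametric map then only needs to output the handful of reduced-basis coefficients rather than the full high-dimensional discretized solution, which is the source of the dimension-independent rates described above.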
