Approximation properties of Residual Neural Networks for Kolmogorov PDEs

10/30/2021
by Jonas Baggenstos, et al.

In recent years, residual neural networks (ResNets), as introduced in [He, K., Zhang, X., Ren, S., and Sun, J., Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2016), 770-778], have become very popular in a large number of applications, including image classification and segmentation. They provide a new perspective on training very deep neural networks without suffering from the vanishing gradient problem. In this article we show that ResNets are able to approximate solutions of Kolmogorov partial differential equations (PDEs) with constant diffusion and possibly nonlinear drift coefficients without suffering the curse of dimensionality, which is to say that the number of parameters of the approximating ResNets grows at most polynomially in the reciprocal of the approximation accuracy ε > 0 and in the dimension of the considered PDE d ∈ ℕ. We adapt a proof in [Jentzen, A., Salimova, D., and Welti, T., Commun. Math. Sci. 19, 5 (2021), 1167-1205], which showed a similar result for feedforward neural networks (FNNs), to ResNets. In contrast to the FNN setting, the Euler-Maruyama approximation structure of ResNets substantially simplifies the construction of the approximating ResNets. Moreover, contrary to the above work, our proof for ResNets does not require the existence of an FNN (or a ResNet) representing the identity map, which enlarges the set of applicable activation functions.
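
To make the structural analogy mentioned in the abstract concrete, the following NumPy sketch (not taken from the paper; the function names, the drift choice, and the step size are illustrative assumptions) shows that one Euler-Maruyama step of the SDE underlying a Kolmogorov PDE with constant diffusion has the same "x plus update" form as a residual block, which is why no identity-representing subnetwork is needed.

    import numpy as np

    # Illustrative sketch: Euler-Maruyama step of dX_t = mu(X_t) dt + sigma dW_t
    # (constant diffusion sigma, possibly nonlinear drift mu) vs. a residual block x + f(x).

    def euler_maruyama_step(x, mu, sigma, dt, rng):
        """One step: X_{k+1} = X_k + mu(X_k)*dt + sigma*sqrt(dt)*xi, xi ~ N(0, I)."""
        xi = rng.standard_normal(x.shape)
        return x + mu(x) * dt + sigma * np.sqrt(dt) * xi

    def residual_block(x, f):
        """A ResNet block computes x + f(x); choosing f to emulate the drift/noise
        increment reproduces the Euler-Maruyama update via the skip connection alone."""
        return x + f(x)

    # Example with an illustrative drift mu(x) = -x and constant diffusion sigma = 1.
    rng = np.random.default_rng(0)
    d, dt, sigma = 5, 0.01, 1.0
    mu = lambda x: -x
    x = rng.standard_normal(d)

    x_em = euler_maruyama_step(x, mu, sigma, dt, rng)

    # The same update written as a residual block, for one fixed noise sample xi:
    xi = rng.standard_normal(d)
    x_res = residual_block(x, lambda y: mu(y) * dt + sigma * np.sqrt(dt) * xi)

The point of the sketch is only the shared additive structure: stacking such blocks corresponds to iterating the Euler-Maruyama scheme, whereas an FNN would first have to build (or approximate) the identity map to pass the state forward.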


research 03/03/2021: Parametric Complexity Bounds for Approximating PDEs with Neural Networks
Recent empirical results show that deep networks can approximate solutio...

research 10/21/2022: Neural Network Approximations of PDEs Beyond Linearity: Representational Perspective
A burgeoning line of research has developed deep neural networks capable...

research 09/22/2022: Vanilla feedforward neural networks as a discretization of dynamic systems
Deep learning has made significant applications in the field of data sci...

research 07/30/2021: Connections between Numerical Algorithms for PDEs and Neural Networks
We investigate numerous structural connections between numerical algorit...

research 04/13/2018: Representing smooth functions as compositions of near-identity functions with implications for deep network optimization
We show that any smooth bi-Lipschitz h can be represented exactly as a c...
