Universal flow approximation with deep residual networks

10/21/2019 ∙ by Johannes Müller, et al.

Residual networks (ResNets) are a deep learning architecture with the recursive structure x_{k+1} = x_k + R_k(x_k), where R_k is a neural network and the copying of the input x_k is called a skip connection. This structure can be seen as the explicit Euler discretisation of an associated ordinary differential equation. We use this interpretation to show that, by simultaneously increasing the number of skip connections and the expressivity of the networks R_k, the flow of an arbitrary right-hand side f ∈ L^1(I; C_b^{0,1}(R^d; R^d)) can be approximated uniformly by deep ReLU ResNets on compact sets. Further, we derive estimates on the number of parameters needed to achieve a prescribed accuracy under temporal regularity assumptions. Finally, we discuss the possibility of using ResNets for diffeomorphic matching problems and propose next steps in the theoretical foundation of this approach.
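The correspondence between the ResNet recursion and the explicit Euler scheme can be made concrete with a small sketch. The right-hand side `f` below is a hypothetical example (not from the paper), and `resnet_forward` is an illustrative toy implementation: each residual block is a one-hidden-layer ReLU network, so one block plays the role of one Euler step h·f(t_k, ·).

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical right-hand side f(t, x) of the ODE x'(t) = f(t, x(t)).
def f(t, x):
    return -x  # linear decay, chosen only for illustration

# Explicit Euler discretisation: x_{k+1} = x_k + h * f(t_k, x_k).
# Structurally identical to a ResNet pass x_{k+1} = x_k + R_k(x_k),
# with the residual branch R_k playing the role of h * f(t_k, ·).
def euler_flow(x0, f, t0=0.0, t1=1.0, n_steps=10):
    x = np.asarray(x0, dtype=float)
    h = (t1 - t0) / n_steps
    for k in range(n_steps):
        x = x + h * f(t0 + k * h, x)  # one "skip connection" step
    return x

# Toy ReLU ResNet: one (W1, b1, W2, b2) tuple per skip connection,
# each defining a residual branch R_k(x) = W2 @ relu(W1 @ x + b1) + b2.
def resnet_forward(x0, weights):
    x = np.asarray(x0, dtype=float)
    for W1, b1, W2, b2 in weights:
        x = x + W2 @ relu(W1 @ x + b1) + b2
    return x
```

With `f(t, x) = -x`, `euler_flow` converges to the exact flow x(1) = e^{-1}·x(0) as the number of steps grows, mirroring the paper's regime in which both the number of skip connections and the expressivity of the networks R_k are increased simultaneously.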
