
Universal flow approximation with deep residual networks
Residual networks (ResNets) are a deep learning architecture with the recursive structure x_{k+1} = x_k + R_k(x_k), where R_k is a neural network and the copying of the input x_k is called a skip connection. This structure can be seen as the explicit Euler discretisation of an associated ordinary differential equation. We use this interpretation to show that, by simultaneously increasing the number of skip connections and the expressivity of the networks R_k, the flow of an arbitrary right-hand side f ∈ L^1(I; C_b^{0,1}(R^d; R^d)) can be approximated uniformly by deep ReLU ResNets on compact sets. Further, we derive estimates on the number of parameters needed to achieve a prescribed accuracy under temporal regularity assumptions. Finally, we discuss the possibility of using ResNets for diffeomorphic matching problems and propose some next steps in the theoretical foundation of this approach.
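The correspondence between residual blocks and the explicit Euler scheme can be illustrated with a minimal numerical sketch. Here each "residual block" applies the update x_{k+1} = x_k + h·f(x_k) for the toy right-hand side f(x) = −x, whose exact flow at time T is x(T) = e^{−T}·x_0; the step size h = T/K and the function `euler_flow` are illustrative choices, not the paper's construction (the abstract's R_k are learned ReLU networks, whereas here each block is fixed to h·f).

```python
import math

def euler_flow(x0, f, T, K):
    """Approximate the flow of dx/dt = f(x) at time T by K residual
    blocks, each performing one explicit Euler step x <- x + h*f(x)."""
    x, h = x0, T / K
    for _ in range(K):
        x = x + h * f(x)  # residual block: x_{k+1} = x_k + R_k(x_k), R_k = h*f
    return x

f = lambda x: -x          # toy right-hand side with known flow e^{-T} * x0
exact = math.exp(-1.0)    # flow of x0 = 1 at time T = 1

err_10 = abs(euler_flow(1.0, f, 1.0, 10) - exact)
err_100 = abs(euler_flow(1.0, f, 1.0, 100) - exact)
print(err_10, err_100)    # error shrinks as the number of blocks K grows
```

Increasing K (more skip connections) reduces the discretisation error, mirroring the abstract's statement that approximation quality improves as depth and per-block expressivity grow together.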