
Approximation Capabilities of Neural Ordinary Differential Equations

07/30/2019
by Han Zhang et al.
Virginia Commonwealth University

Neural Ordinary Differential Equations have recently been proposed as an infinite-depth generalization of residual networks. Neural ODEs provide out-of-the-box invertibility of the mapping realized by the neural network, and can lead to networks that are more efficient in terms of computational time and parameter space. Here, we show that a Neural ODE operating on a space whose dimensionality is increased by one compared to the input dimension is a universal approximator for the space of continuous functions, at the cost of losing invertibility. We then turn our focus to invertible mappings, and we prove that any homeomorphism on a p-dimensional Euclidean space can be approximated by a Neural ODE operating on a (2p+1)-dimensional Euclidean space.
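The first result can be illustrated with a minimal sketch: lift an input from R^p to R^(p+1) by appending a zero coordinate, flow it through an ODE whose vector field is a small neural network, then project back to the first p coordinates. The network weights, the Euler integrator, and all sizes below are illustrative assumptions, not the paper's construction; in practice the weights would be trained and a higher-order solver used.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 2       # input dimension
d = p + 1   # augmented dimension, matching the paper's "+1" result

# A randomly initialized one-hidden-layer MLP defines the vector field
# f(z); this is a hypothetical stand-in for a trained network.
W1 = rng.normal(scale=0.5, size=(16, d))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(d, 16))
b2 = np.zeros(d)

def f(z):
    """Vector field dz/dt = f(z) on the augmented space R^(p+1)."""
    return W2 @ np.tanh(W1 @ z + b1) + b2

def augmented_node(x, steps=100, T=1.0):
    """Map x in R^p through the ODE flow on R^(p+1).

    The input is augmented with a zero coordinate, integrated with
    forward Euler from t=0 to t=T, and only the first p coordinates
    are returned -- the projection that sacrifices invertibility
    in exchange for universal approximation.
    """
    z = np.concatenate([x, np.zeros(1)])   # augment: R^p -> R^(p+1)
    h = T / steps
    for _ in range(steps):
        z = z + h * f(z)                   # one Euler step of the flow
    return z[:p]                           # project back to R^p

y = augmented_node(np.array([0.5, -0.3]))
```

Note that the flow on R^(p+1) itself is still a homeomorphism; only the final projection makes the overall map non-invertible, which is exactly the trade-off the abstract describes.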

