TorchDyn: A Neural Differential Equations Library

09/20/2020
by Michael Poli et al.

Continuous-depth learning has recently emerged as a novel perspective on deep learning, improving performance in tasks related to dynamical systems and density estimation. Core to these approaches is the neural differential equation, whose forward passes are the solutions of an initial value problem parametrized by a neural network. Unlocking the full potential of continuous-depth models requires a different set of software tools, due to inherent differences from standard discrete neural networks; e.g., inference must be carried out via numerical solvers. We introduce TorchDyn, a PyTorch library dedicated to continuous-depth learning, designed to make neural differential equations as accessible as regular plug-and-play deep learning primitives. This objective is achieved by identifying and subdividing the different variants into common essential components, which can be combined and freely repurposed to obtain complex compositional architectures. TorchDyn further offers step-by-step tutorials and benchmarks designed to guide researchers and contributors.
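
The abstract's central idea, that a forward pass is the solution of an initial value problem parametrized by a neural network, can be sketched in a few lines of plain PyTorch. The snippet below is a concept illustration only, not TorchDyn's actual API: the class name NeuralODEBlock, the fixed-step RK4 loop, the n_steps parameter, and the time-independent vector field are all assumptions made for brevity.

```python
# Concept sketch (plain PyTorch, not TorchDyn's API): the forward pass of a
# continuous-depth block is the numerical solution of an initial value problem
# dz/dt = f_theta(z), z(0) = x, integrated here with a fixed-step RK4 loop.
import torch
import torch.nn as nn


class NeuralODEBlock(nn.Module):
    """Continuous-depth block: forward(x) returns z(1), where dz/dt = f_theta(z)."""

    def __init__(self, vector_field: nn.Module, n_steps: int = 20):
        super().__init__()
        self.vector_field = vector_field  # parametrized dynamics f_theta
        self.n_steps = n_steps            # fixed number of solver steps on t in [0, 1]

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        h = 1.0 / self.n_steps
        for _ in range(self.n_steps):  # classic fourth-order Runge-Kutta steps
            k1 = self.vector_field(z)
            k2 = self.vector_field(z + 0.5 * h * k1)
            k3 = self.vector_field(z + 0.5 * h * k2)
            k4 = self.vector_field(z + h * k3)
            z = z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        return z


# The block composes with standard layers like any other plug-and-play primitive.
f = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 2))
model = nn.Sequential(NeuralODEBlock(f), nn.Linear(2, 1))
y = model(torch.randn(8, 2))  # output shape: (8, 1)
```

In practice, a library such as TorchDyn replaces this hand-rolled loop with proper numerical solvers and reusable components, which is exactly the accessibility gap the abstract describes.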

Related research

On Neural Differential Equations (02/04/2022)
The conjoining of dynamical systems and deep learning has become a topic...

Dissecting Neural ODEs (02/19/2020)
Continuous deep learning architectures have recently re-emerged as varia...

Hypersolvers: Toward Fast Continuous-Depth Models (07/19/2020)
The infinite-depth paradigm pioneered by Neural ODEs has launched a rena...

A Shooting Formulation of Deep Learning (06/18/2020)
Continuous-depth neural networks can be viewed as deep limits of discret...

Learning ODEs via Diffeomorphisms for Fast and Robust Integration (07/04/2021)
Advances in differentiable numerical integrators have enabled the use of...

Sparse Flows: Pruning Continuous-depth Models (06/24/2021)
Continuous deep learning architectures enable learning of flexible proba...

Local error quantification for efficient neural network dynamical system solvers (08/24/2020)
Neural Networks have been identified as potentially powerful tools for t...
