"Hey, that's not an ODE": Faster ODE Adjoints with 12 Lines of Code

09/20/2020
by Patrick Kidger et al.

Neural differential equations may be trained by backpropagating gradients via the adjoint method, which is itself another differential equation, typically solved using an adaptive-step-size numerical solver. A proposed step is accepted if its error, measured in some norm, is sufficiently small; otherwise it is rejected, the step size is shrunk, and the process repeats. Here, we demonstrate that the particular structure of the adjoint equations makes the usual choices of norm (such as L^2) unnecessarily stringent. By replacing such a norm with a more appropriate (semi)norm, fewer steps are unnecessarily rejected and backpropagation is made faster. This requires only minor code modifications. Experiments on a wide range of tasks (including time series, generative modeling, and physical control) demonstrate a median improvement of 40% fewer function evaluations, so that overall training time is roughly halved.
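The idea can be illustrated with a minimal sketch of the accept/reject test inside an adaptive solver. The function names, tolerance, and the toy error vector below are illustrative, not from the paper: a full RMS norm over the augmented adjoint state is compared against a seminorm that ignores the parameter-gradient components, so a step whose error is concentrated in those components is no longer rejected.

```python
import math

def rms_norm(x):
    # Root-mean-square norm over all components of the augmented state.
    return math.sqrt(sum(v * v for v in x) / len(x))

def make_seminorm(n_state):
    # Seminorm: measure error only on the first n_state components
    # (state and state-adjoint), ignoring parameter-gradient entries.
    def seminorm(x):
        head = x[:n_state]
        return math.sqrt(sum(v * v for v in head) / len(head))
    return seminorm

def accept_step(error_estimate, norm, tol=1e-3):
    # A proposed step is accepted when its error, measured in `norm`,
    # falls within the solver tolerance.
    return norm(error_estimate) <= tol

# Hypothetical local error estimate: small error in the 2 state/adjoint
# components, larger error in the 4 parameter-gradient components.
err = [1e-4, 2e-4, 5e-3, 4e-3, 6e-3, 5e-3]

print(accept_step(err, rms_norm))          # rejected under the full norm
print(accept_step(err, make_seminorm(2)))  # accepted under the seminorm
```

In practice this plugs into the solver's step-size controller; for instance, the torchdiffeq library exposes a similar option via `odeint_adjoint(..., adjoint_options=dict(norm="seminorm"))` (consult that library's documentation to confirm the exact interface for your version).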


