Locally Regularized Neural Differential Equations: Some Black Boxes Were Meant to Remain Closed!

03/03/2023
by Avik Pal, et al.

Implicit-layer deep learning techniques, like Neural Differential Equations, have become an important modeling framework due to their ability to adapt to new problems automatically. Training a neural differential equation is effectively a search over a space of plausible dynamical systems. However, controlling the computational cost of these models is difficult, since it depends on the number of steps the adaptive solver takes. Most prior work either uses higher-order methods, which reduce prediction time but greatly increase training time, or reduces both training and prediction time by relying on specific training algorithms that are hard to use as drop-in replacements because of strict requirements on automatic differentiation. In this manuscript, we use the internal cost heuristics of adaptive differential equation solvers, evaluated at stochastic time points, to guide training toward learning a dynamical system that is easier to integrate. We "close the black box" and allow our method to be used with any adjoint technique for computing gradients of the differential equation solution. Our experiments compare the method against global regularization and show that we attain similar performance without compromising implementation flexibility, on both ordinary differential equations (ODEs) and stochastic differential equations (SDEs). We develop two sampling strategies that trade off performance against training time. Our method reduces the number of function evaluations to 0.556-0.733x and accelerates predictions by 1.3-2x.
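The core idea in the abstract, reusing the solver's internal local-error estimate (normally used only for step-size control) as a regularizer sampled at a random time point, can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: it uses a toy linear vector field in place of a neural network, a fixed step size instead of a real adaptive controller, and hypothetical names (`heun_euler_step`, `solve_with_local_reg`).

```python
import numpy as np

def f(theta, t, y):
    # Stand-in for a neural-network vector field: a simple linear
    # decay dy/dt = -theta * y (hypothetical example dynamics).
    return -theta * y

def heun_euler_step(theta, t, y, h):
    # Embedded Runge-Kutta pair: Euler (1st order) vs. Heun (2nd order).
    # The difference between the two solutions is the solver's internal
    # local-error estimate, which adaptive solvers use for step control.
    k1 = f(theta, t, y)
    k2 = f(theta, t + h, y + h * k1)
    y_heun = y + 0.5 * h * (k1 + k2)   # accepted 2nd-order solution
    y_euler = y + h * k1               # embedded 1st-order solution
    err = abs(y_heun - y_euler)        # local error estimate
    return y_heun, err

def solve_with_local_reg(theta, y0, t0, t1, n_steps, rng):
    # Integrate while recording the embedded error estimate at every
    # step; the regularizer then samples it at ONE random step, in the
    # spirit of the paper's "stochastic time point" strategy.
    h = (t1 - t0) / n_steps
    y, t = y0, t0
    errs = []
    for _ in range(n_steps):
        y, err = heun_euler_step(theta, t, y, h)
        errs.append(float(err))
        t += h
    reg = errs[rng.integers(len(errs))]  # locally sampled penalty
    return y, reg

rng = np.random.default_rng(0)
y_final, reg = solve_with_local_reg(theta=1.0, y0=1.0, t0=0.0, t1=1.0,
                                    n_steps=100, rng=rng)
# Training loss = data-fitting term + weighted local regularizer;
# penalizing the error estimate pushes the learned dynamics toward
# systems the solver can integrate with fewer, larger steps.
loss = (y_final - np.exp(-1.0)) ** 2 + 0.1 * reg
```

In a real setting `theta` would parameterize a neural network and the gradient of `loss` would be taken with any adjoint method, which is the flexibility the local formulation preserves.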

Related research

05/09/2021 · Opening the Blackbox: Accelerating Neural Differential Equations by Regularizing Internal Solver Heuristics
Democratization of machine learning requires architectures that automati...

11/13/2022 · Experimental study of Neural ODE training with adaptive solver for dynamical systems modeling
Neural Ordinary Differential Equations (ODEs) was recently introduced as...

09/20/2020 · "Hey, that's not an ODE": Faster ODE Adjoints with 12 Lines of Code
Neural differential equations may be trained by backpropagating gradient...

05/04/2022 · Virtual Analog Modeling of Distortion Circuits Using Neural Ordinary Differential Equations
Recent research in deep learning has shown that neural networks can lear...

02/09/2021 · On Theory-training Neural Networks to Infer the Solution of Highly Coupled Differential Equations
Deep neural networks are transforming fields ranging from computer visio...

07/10/2022 · Automatic differentiation and the optimization of differential equation models in biology
A computational revolution unleashed the power of artificial neural netw...

06/18/2020 · STEER: Simple Temporal Regularization For Neural ODEs
Training Neural Ordinary Differential Equations (ODEs) is often computat...
