Scalable Gradients for Stochastic Differential Equations

01/05/2020
by Xuechen Li, et al.

The adjoint sensitivity method scalably computes gradients of solutions to ordinary differential equations. We generalize this method to stochastic differential equations, allowing time-efficient and constant-memory computation of gradients with high-order adaptive solvers. Specifically, we derive a stochastic differential equation whose solution is the gradient, a memory-efficient algorithm for caching noise, and conditions under which numerical solutions converge. In addition, we combine our method with gradient-based stochastic variational inference for latent stochastic differential equations. We use our method to fit stochastic dynamics defined by neural networks, achieving competitive performance on a 50-dimensional motion capture dataset.
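A key ingredient of the method is that the Brownian noise used on the forward pass can be regenerated deterministically on the backward (adjoint) pass instead of being stored, giving constant-memory gradients. The sketch below illustrates that idea in plain Python with a seeded generator and a simple Euler-Maruyama solver; it is a simplified illustration, not the paper's Brownian-tree algorithm or the torchsde implementation, and the function names are invented for this example.

```python
import random

def brownian_increments(seed, n, dt):
    """Regenerate identical Brownian increments from a seed.
    Because the path is a deterministic function of the seed,
    the adjoint pass can rebuild the noise instead of caching it."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, dt ** 0.5) for _ in range(n)]

def euler_maruyama(f, g, y0, t0, t1, n, seed):
    """Forward Euler-Maruyama solve of dy = f(y) dt + g(y) dW."""
    dt = (t1 - t0) / n
    dw = brownian_increments(seed, n, dt)
    y = y0
    for k in range(n):
        y = y + f(y) * dt + g(y) * dw[k]
    return y

# Ornstein-Uhlenbeck-style example: dy = -y dt + 0.1 dW on [0, 1].
y_T = euler_maruyama(lambda y: -y, lambda y: 0.1, 1.0, 0.0, 1.0, 100, seed=42)

# The backward pass can reconstruct the exact same noise path
# from the seed alone, so no per-step increments need to be stored.
assert brownian_increments(42, 100, 0.01) == brownian_increments(42, 100, 0.01)
```

The paper's actual algorithm uses a splittable-seed Brownian tree so that arbitrary sub-intervals of the path can be queried in logarithmic time by an adaptive solver; the seeded-replay trick above conveys only the memory-saving principle.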


Related Research

02/12/2021
Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations

02/06/2019
DiffEqFlux.jl - A Julia Library for Neural Differential Equations

05/17/2021
Adaptive Density Tracking by Quadrature for Stochastic Differential Equations

03/23/2018
From Random Differential Equations to Structural Causal Models: the stochastic case

06/17/2020
Order conditions for sampling the invariant measure of ergodic stochastic differential equations on manifolds

06/16/2021
Fréchet derivatives of expected functionals of solutions to stochastic differential equations

05/19/2017
Scalable Variational Inference for Dynamical Systems

Code Repositories

torchsde

Differentiable SDE solvers with GPU support and efficient sensitivity analysis.



bayesian-sde

Code for "Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations"

