Learning Stochastic Differential Equations With Gaussian Processes Without Gradient Matching

07/16/2018
by Cagatay Yildiz, et al.

We introduce a novel paradigm for learning non-parametric drift and diffusion functions of stochastic differential equations (SDEs): the functions are learnt such that the simulated trajectory distributions match observations at arbitrary spacings. This is in contrast to existing gradient matching and other approximation schemes that do not optimise the simulated responses. We demonstrate that our general stochastic distribution optimisation leads to robust and efficient learning of SDE systems.
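Since only the abstract is shown here, the snippet below is a minimal, hypothetical sketch of the general recipe it describes: simulate SDE trajectories under candidate drift and diffusion functions (here via Euler-Maruyama) and score them against observations taken at arbitrary spacings. The Gaussian-process parameterisation of drift and diffusion and the exact distribution-matching objective from the paper are replaced with plain Python callables and a simple mean-matching stand-in; the names euler_maruyama and simulation_loss are invented for illustration.

```python
# Hypothetical sketch, not the authors' implementation.
import numpy as np

def euler_maruyama(x0, drift, diffusion, t_grid, n_paths, rng):
    """Simulate n_paths trajectories of dx = drift(x) dt + diffusion(x) dW
    on the (possibly irregular) time grid t_grid."""
    dt = np.diff(t_grid)
    paths = np.empty((n_paths, len(t_grid)))
    paths[:, 0] = x0
    for k, h in enumerate(dt):
        x = paths[:, k]
        noise = rng.normal(scale=np.sqrt(h), size=n_paths)
        paths[:, k + 1] = x + drift(x) * h + diffusion(x) * noise
    return paths

def simulation_loss(obs_t, obs_x, drift, diffusion, n_paths=256, seed=0):
    """Score candidate drift/diffusion by simulating from the first
    observation and comparing the simulated mean to the observations at
    their recorded times (a stand-in for matching trajectory distributions)."""
    rng = np.random.default_rng(seed)
    paths = euler_maruyama(obs_x[0], drift, diffusion, obs_t, n_paths, rng)
    return float(np.mean((paths.mean(axis=0) - obs_x) ** 2))

# Toy usage: irregularly spaced observations scored under a candidate
# mean-reverting drift and constant diffusion.
obs_t = np.array([0.0, 0.3, 1.0, 1.4, 2.5])
obs_x = np.array([1.0, 0.7, 0.4, 0.3, 0.1])
loss = simulation_loss(obs_t, obs_x,
                       drift=lambda x: -0.8 * x,
                       diffusion=lambda x: 0.1 * np.ones_like(x))
print(loss)
```

In the paper's setting, the drift and diffusion callables would be non-parametric (Gaussian-process) functions and the loss would compare simulated and observed trajectory distributions rather than means; the point of the sketch is only that the objective is computed from simulated responses, not from matched gradients.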

04/14/2017 · Non-parametric Estimation of Stochastic Differential Equations with Sparse Gaussian Processes
The application of Stochastic Differential Equations (SDEs) to the analy...

10/29/2021 · Scalable Inference in SDEs by Direct Matching of the Fokker-Planck-Kolmogorov Equation
Simulation-based techniques such as variants of stochastic Runge-Kutta a...

12/01/2020 · New Algorithms And Fast Implementations To Approximate Stochastic Processes
We present new algorithms and fast implementations to find efficient app...

06/23/2022 · Stochastic Langevin Differential Inclusions with Applications to Machine Learning
Stochastic differential equations of Langevin-diffusion form have receiv...

02/22/2019 · AReS and MaRS - Adversarial and MMD-Minimizing Regression for SDEs
Stochastic differential equations are an important modeling class in man...

10/17/2022 · Parametric estimation of stochastic differential equations via online gradient descent
We propose an online parametric estimation method of stochastic differen...

06/16/2020 · Deterministic Inference of Neural Stochastic Differential Equations
Model noise is known to have detrimental effects on neural networks, suc...