Machine Learning of Linear Differential Equations using Gaussian Processes

01/10/2017
by   Maziar Raissi, et al.

This work leverages recent advances in probabilistic machine learning to discover conservation laws expressed by parametric linear equations. Such equations involve, but are not limited to, ordinary and partial differential, integro-differential, and fractional order operators. Here, Gaussian process priors are modified according to the particular form of such operators and are employed to infer parameters of the linear equations from scarce and possibly noisy observations. Such observations may come from experiments or "black-box" computer simulations.
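The construction behind this approach is that a Gaussian process prior u ~ GP(0, k) placed on the solution induces, through any linear operator L_phi, a second Gaussian process on f = L_phi u whose covariances follow from applying the operator to the kernel; the operator's parameters phi can then be inferred by maximizing the joint marginal likelihood of scarce observations of u and f. The sketch below is illustrative only, not the paper's implementation: it uses the simplest possible linear operator, a hypothetical scalar reaction term f = phi * u, so that cov(f, f) = phi^2 * k and cov(u, f) = phi * k with no kernel derivatives needed.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (illustrative, not from the paper): infer phi in f = phi * u
# from noisy/scarce observations, via the induced joint GP over [u; f].

def rbf(x, xp, l=1.0):
    """Squared-exponential kernel matrix for 1-D inputs."""
    return np.exp(-0.5 * (x[:, None] - xp[None, :]) ** 2 / l ** 2)

phi_true = 2.0
xu = np.linspace(0.0, 1.0, 8)       # scarce observation locations of u
xf = np.linspace(0.0, 1.0, 8)       # scarce observation locations of f
u = np.sin(xu)                      # "black-box" observations of the solution
f = phi_true * np.sin(xf)           # observations of the forcing f = phi * u
y = np.concatenate([u, f])
jitter = 1e-4                       # small noise / numerical jitter

def neg_log_marginal_likelihood(params):
    phi, log_l = params
    l = np.exp(log_l)
    # Covariance of the joint GP [u; f] induced by the operator L_phi = phi * I:
    Kuu = rbf(xu, xu, l)
    Kuf = phi * rbf(xu, xf, l)
    Kff = phi ** 2 * rbf(xf, xf, l)
    K = np.block([[Kuu, Kuf], [Kuf.T, Kff]]) + jitter * np.eye(len(y))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

res = minimize(neg_log_marginal_likelihood, x0=[1.0, 0.0], method="Nelder-Mead")
phi_hat = res.x[0]
print(phi_hat)  # estimate close to phi_true = 2.0
```

For differential operators the same recipe applies, but the cross-covariances require differentiating the kernel (e.g. cov(u(x), f(x')) = L_phi applied to k in the second argument), which is analytic for the squared-exponential kernel.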

Related Research

Numerical Gaussian Processes for Time-dependent and Non-linear Partial Differential Equations (03/29/2017)
We introduce the concept of numerical Gaussian processes, which we defin...

Algorithmic Linearly Constrained Gaussian Processes (01/28/2018)
We algorithmically construct multi-output Gaussian process priors which ...

Linearly Constrained Gaussian Processes with Boundary Conditions (02/03/2020)
One goal in Bayesian machine learning is to encode prior knowledge into ...

Probabilistic simulation of partial differential equations (10/13/2020)
Computer simulations of differential equations require a time discretiza...

Adjoint-aided inference of Gaussian process driven differential equations (02/09/2022)
Linear systems occur throughout engineering and the sciences, most notab...

Learning black- and gray-box chemotactic PDEs/closures from agent based Monte Carlo simulation data (05/26/2022)
We propose a machine learning framework for the data-driven discovery of...

Constraining Gaussian Processes to Systems of Linear Ordinary Differential Equations (08/26/2022)
Data in many applications follows systems of Ordinary Differential Equat...
