Algorithmic Linearly Constrained Gaussian Processes

01/28/2018
by Markus Lange-Hegermann, et al.

We algorithmically construct multi-output Gaussian process priors which satisfy linear differential equations. Our approach attempts to parametrize all solutions of the equations using Gröbner bases. If successful, a pushforward Gaussian process along the parametrization is the desired prior. We consider several examples, among them the full inhomogeneous system of Maxwell's equations. By bringing together stochastic learning and computer algebra in a novel way, we combine noisy observations with precise algebraic computations.
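The general, Gröbner-basis-driven construction is in the paper itself; the following is only a minimal sketch of the pushforward idea for one classical special case: divergence-free vector fields in 2D. Here the parametrization is f = (dphi/dx2, -dphi/dx1) for a scalar potential phi, and pushing a squared-exponential Gaussian process prior on phi through this operator yields a matrix-valued kernel whose samples satisfy div f = 0 by construction. All function names, parameter values, and the finite-difference check below are illustrative assumptions, not the authors' implementation.

```python
# Sketch: pushforward GP for divergence-free 2D fields (illustrative, not the paper's code).
import numpy as np

def k_rbf(x, y, ell=1.0, sigma=1.0):
    """Squared-exponential kernel on the scalar potential phi."""
    d = x - y
    return sigma**2 * np.exp(-0.5 * np.dot(d, d) / ell**2)

def k_divfree(x, y, ell=1.0, sigma=1.0):
    """2x2 covariance block of the pushforward GP f = (dphi/dx2, -dphi/dx1).

    Entries are mixed second derivatives of k_rbf: K[i, j] = cov(f_i(x), f_j(y)).
    """
    d = x - y
    k = k_rbf(x, y, ell, sigma)
    K = np.empty((2, 2))
    K[0, 0] = (1.0 / ell**2 - d[1]**2 / ell**4) * k   # d^2 k / dx2 dy2
    K[1, 1] = (1.0 / ell**2 - d[0]**2 / ell**4) * k   # d^2 k / dx1 dy1
    K[0, 1] = K[1, 0] = (d[0] * d[1] / ell**4) * k    # cross terms
    return K

def sample_field(points, ell=1.0, sigma=1.0, jitter=1e-10, seed=0):
    """Draw one joint sample of the constrained field at the given 2D points."""
    n = len(points)
    K = np.zeros((2 * n, 2 * n))
    for a in range(n):
        for b in range(n):
            K[2*a:2*a+2, 2*b:2*b+2] = k_divfree(points[a], points[b], ell, sigma)
    L = np.linalg.cholesky(K + jitter * np.eye(2 * n))
    rng = np.random.default_rng(seed)
    return (L @ rng.standard_normal(2 * n)).reshape(n, 2)

# Numerical check: a sample of the pushforward prior is (approximately) divergence-free.
h = 1e-2
x0 = np.array([0.3, -0.2])
pts = [x0, x0 + [h, 0], x0 - [h, 0], x0 + [0, h], x0 - [0, h]]
f = sample_field(pts, ell=0.7)
div = (f[1, 0] - f[2, 0]) / (2 * h) + (f[3, 1] - f[4, 1]) / (2 * h)
print("divergence at x0:", div)  # close to 0, up to finite-difference and jitter error
```

The same pattern carries over to the paper's setting: once a parametrization of all solutions of the linear system is found (there, by Gröbner-basis computations), the covariance of the constrained prior is obtained by applying the parametrizing operator to both arguments of a base kernel.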

Related research

- 02/03/2020, Linearly Constrained Gaussian Processes with Boundary Conditions: One goal in Bayesian machine learning is to encode prior knowledge into ...
- 04/23/2020, Coarsening in Algebraic Multigrid using Gaussian Processes: Multigrid methods have proven to be an invaluable tool to efficiently so...
- 01/10/2017, Machine Learning of Linear Differential Equations using Gaussian Processes: This work leverages recent advances in probabilistic machine learning to...
- 02/05/2020, Linearly Constrained Neural Networks: We present an approach to designing neural network based models that wil...
- 09/16/2020, Inference of dynamic systems from noisy and sparse data via manifold-constrained Gaussian processes: Parameter estimation for nonlinear dynamic system models, represented by...
- 05/29/2018, Iterative Statistical Linear Regression for Gaussian Smoothing in Continuous-Time Non-linear Stochastic Dynamic Systems: This paper considers approximate smoothing for discretely observed non-l...
- 06/08/2020, Multi-Fidelity High-Order Gaussian Processes for Physical Simulation: The key task of physical simulation is to solve partial differential equ...
