Incorporating Sum Constraints into Multitask Gaussian Processes

02/03/2022
by Philipp Pilar, et al.

Machine learning models can be improved by adapting them to respect existing background knowledge. In this paper we consider multitask Gaussian processes, with background knowledge in the form of constraints that require a specific sum of the outputs to be constant. This is achieved by conditioning the prior distribution on the constraint fulfillment. The approach allows for both linear and nonlinear constraints. We demonstrate that the constraints are fulfilled with high precision and that the construction can improve the overall prediction accuracy as compared to the standard Gaussian process.


Related research

07/13/2019
The Use of Gaussian Processes in System Identification
Gaussian processes are used in machine learning to learn input-output ma...

01/29/2020
On Constraint Definability in Tractable Probabilistic Models
Incorporating constraints is a major concern in probabilistic machine le...

01/29/2023
Sequential Estimation of Gaussian Process-based Deep State-Space Models
We consider the problem of sequential estimation of the unknowns of stat...

08/17/2018
Revisiting the proton-radius problem using constrained Gaussian processes
Background: The "proton radius puzzle" refers to an eight-year old probl...

10/13/2021
Using Multitask Gaussian Processes to estimate the effect of a targeted effort to remove firearms
Gun violence is a critical public safety concern in the United States. I...

09/02/2022
Log-Gaussian processes for AI-assisted TAS experiments
To understand the origins of materials properties, neutron scattering ex...

04/23/2020
Learning Constrained Dynamics with Gauss Principle adhering Gaussian Processes
The identification of the constrained dynamics of mechanical systems is ...
