Gaussian Process Regression and Classification under Mathematical Constraints with Learning Guarantees

04/21/2019
by   Jeremiah Zhe Liu, et al.

We introduce the constrained Gaussian process (CGP), a Gaussian process model for random functions that allows easy placement of mathematical constraints (e.g., non-negativity, monotonicity) on its sample functions. The CGP has a closed-form probability density function (PDF), and it has the attractive property that its posterior distributions for both regression and classification are again CGPs with closed-form expressions. Furthermore, we show that the CGP inherits the optimal theoretical properties of the Gaussian process, e.g., its rates of posterior contraction, because the CGP is a Gaussian process with a more efficient model space.
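To make the idea of constraining a Gaussian process's sample functions concrete, here is a minimal sketch of one simple (and crude) way to impose a non-negativity constraint: draw samples from an ordinary GP regression posterior and keep only the draws that satisfy the constraint on a test grid. This rejection-sampling illustration is not the CGP construction from the paper; all function and variable names (`rbf_kernel`, `gp_posterior`, `constrained_samples`, etc.) are hypothetical.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.5, variance=1.0):
    # Squared-exponential kernel matrix between two sets of 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Standard (unconstrained) GP regression posterior mean and covariance.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    return mean, cov

def constrained_samples(mean, cov, n_samples=2000, seed=0):
    # Rejection-sample posterior draws that are non-negative everywhere
    # on the test grid -- a stand-in for a constrained posterior.
    rng = np.random.default_rng(seed)
    jitter = 1e-8 * np.eye(len(mean))
    draws = rng.multivariate_normal(mean, cov + jitter, size=n_samples)
    return draws[(draws >= 0).all(axis=1)]

# Toy data from a non-negative function.
x_train = np.linspace(0, 1, 8)
y_train = np.abs(np.sin(2 * np.pi * x_train)) + 0.05
x_test = np.linspace(0, 1, 20)

mean, cov = gp_posterior(x_train, y_train, x_test)
samples = constrained_samples(mean, cov)
```

Rejection sampling scales poorly with grid size (most draws are rejected when the constraint binds), which is part of what motivates models like the CGP that build the constraint into the prior with closed-form posteriors.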


Related research

05/16/2022 On the inability of Gaussian process regression to optimally learn compositional functions
We rigorously prove that deep Gaussian process priors can outperform Gau...

12/15/2017 Safe Policy Search with Gaussian Process Models
We propose a method to optimise the parameters of a policy which will be...

10/28/2018 Bounded Regression with Gaussian Process Projection
Examples with bound information on the regression function and density a...

09/13/2018 Gaussian process classification using posterior linearisation
This paper proposes a new algorithm for Gaussian process classification ...

08/29/2023 Multi-Response Heteroscedastic Gaussian Process Models and Their Inference
Despite the widespread utilization of Gaussian process models for versat...

10/12/2019 Adaptive design for Gaussian process regression under censoring
A key objective in engineering problems is to predict an unknown experim...

05/25/2023 Privacy-aware Gaussian Process Regression
We propose the first theoretical and methodological framework for Gaussi...
