Mixtures of Gaussian Processes for regression under multiple prior distributions

by Sarem Seitz, et al.

When constructing a Bayesian Machine Learning model, we may be faced with multiple different prior distributions and must therefore incorporate them into the model in a sensible manner. While this situation is reasonably well explored in classical Bayesian statistics, it appears useful to develop a corresponding method for complex Machine Learning problems. Given their underlying Bayesian framework and widespread popularity, Gaussian Processes are a good candidate for this task. We therefore extend the idea of mixture models for Gaussian Process regression to work with multiple prior beliefs at once - both an analytical regression formula and a sparse variational approach are considered. In addition, we show how our approach can also account for prior misspecification in functional regression problems.
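To make the idea of combining multiple prior beliefs concrete, here is a minimal sketch of one standard way to mix exact GP posteriors: each candidate prior (here, RBF kernels with different lengthscales) yields its own posterior, and the posteriors are weighted by their normalised marginal likelihoods. This is plain Bayesian model averaging over GP priors in NumPy, not the authors' exact method; all function names and the choice of lengthscales are illustrative assumptions.

```python
import numpy as np

def rbf(x1, x2, lengthscale, variance=1.0):
    # Squared-exponential (RBF) kernel on 1-D inputs
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_and_evidence(x, y, xs, lengthscale, noise=0.1):
    # Exact GP regression: posterior mean at test points xs and
    # the log marginal likelihood (model evidence) of this prior.
    K = rbf(x, x, lengthscale) + noise**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = rbf(xs, x, lengthscale) @ alpha
    log_ev = (-0.5 * y @ alpha
              - np.sum(np.log(np.diag(L)))
              - 0.5 * len(x) * np.log(2 * np.pi))
    return mean, log_ev

def mixture_gp_predict(x, y, xs, lengthscales):
    # Mixture over priors: weight each posterior mean by the
    # softmax of its log evidence (Bayesian model averaging).
    means, log_evs = [], []
    for ell in lengthscales:
        m, le = gp_posterior_and_evidence(x, y, xs, ell)
        means.append(m)
        log_evs.append(le)
    log_evs = np.array(log_evs)
    w = np.exp(log_evs - log_evs.max())
    w /= w.sum()
    return sum(wi * mi for wi, mi in zip(w, means)), w

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)
xs = np.linspace(0, 5, 50)
mean, weights = mixture_gp_predict(x, y, xs, lengthscales=[0.1, 1.0, 5.0])
```

The mixture weights adapt to the data: priors whose lengthscale matches the underlying function accrue higher evidence and dominate the prediction. The paper's sparse variational variant would replace the exact Cholesky-based posterior with an inducing-point approximation to scale beyond small datasets.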
