Enriched Mixtures of Gaussian Process Experts

05/30/2019
by Charles W. L. Gadd, et al.

Mixtures of experts probabilistically divide the input space into regions, where the assumptions of each expert, or conditional model, need only hold locally. Combined with Gaussian process (GP) experts, this results in a powerful and highly flexible model. We focus on alternative mixtures of GP experts, which model the joint distribution of the inputs and targets explicitly. We highlight issues of this approach in multi-dimensional input spaces, namely, poor scalability and the need for an unnecessarily large number of experts, degrading the predictive performance and increasing uncertainty. We construct a novel model to address these issues through a nested partitioning scheme that automatically infers the number of components at both levels. Multiple response types are accommodated through a generalised GP framework, while multiple input types are included through a factorised exponential family structure. We show the effectiveness of our approach in estimating a parsimonious probabilistic description of both synthetic data of increasing dimension and an Alzheimer's challenge dataset.


