A conditional one-output likelihood formulation for multitask Gaussian processes

06/05/2020 · by Vanessa Gómez-Verdejo, et al.

Multitask Gaussian processes (MTGP) are the Gaussian process (GP) framework's solution for multioutput regression problems in which the T elements of the regressors cannot be considered conditionally independent given the observations. Standard MTGP models assume both a multitask covariance matrix, built as a function of an intertask matrix, and a noise covariance matrix. These matrices need to be approximated by a low-rank simplification of order P in order to reduce the number of parameters to be learnt from T^2 to TP. Here we introduce a novel approach that simplifies multitask learning by reducing it to a set of conditioned univariate GPs without any low-rank approximation, thereby completely eliminating the need to select an adequate value for the hyperparameter P. Moreover, by extending this approach with either a hierarchical or an approximate model, the proposed method can recover the multitask covariance and noise matrices after learning only 2T parameters, avoiding the validation of any model hyperparameter and reducing both the overall complexity of the model and the risk of overfitting. Experimental results on synthetic and real problems confirm the advantages of this inference approach: it accurately recovers the original noise and signal matrices, and it improves performance over other state-of-the-art MTGP approaches. We have also integrated the model with standard GP toolboxes, showing that it is computationally competitive with other state-of-the-art options.
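To make the parameter-count argument concrete, the following sketch (not the paper's code; all names and the fixed lengthscale are assumptions for illustration) builds the standard low-rank multitask covariance that the paper avoids: an intertask matrix B = W W^T + diag(kappa) with W of shape (T, P), combined with an input kernel via a Kronecker product. The low-rank model carries TP + T intertask parameters, while the proposed conditional formulation is reported to need only 2T.

```python
import numpy as np

rng = np.random.default_rng(0)
T, P, N = 4, 2, 10          # tasks, low-rank order, inputs per task

# Squared-exponential input kernel K_x (lengthscale fixed for this sketch)
X = np.linspace(0.0, 1.0, N)[:, None]
K_x = np.exp(-0.5 * (X - X.T) ** 2 / 0.2 ** 2)

# Standard low-rank intertask matrix: B = W W^T + diag(kappa), W is (T, P)
W = rng.standard_normal((T, P))
kappa = rng.uniform(0.1, 0.5, T)
B = W @ W.T + np.diag(kappa)

# Full multitask covariance over all tasks and inputs: (T*N) x (T*N)
K_multi = np.kron(B, K_x)

# Parameter counts: low-rank model vs. the proposed conditional model
n_lowrank_params = T * P + T     # W entries plus diagonal: TP + T
n_conditional_params = 2 * T     # figure reported in the abstract

print(K_multi.shape)             # (40, 40)
print(n_lowrank_params, n_conditional_params)
```

For small P the counts are close, but the low-rank model still requires validating P, whereas the conditional formulation removes that hyperparameter entirely.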

Related research

02/27/2017 · Embarrassingly parallel inference for Gaussian processes
04/29/2021 · MuyGPs: Scalable Gaussian Process Hyperparameter Estimation Using Local Cross-Validation
08/09/2014 · Parallel Gaussian Process Regression with Low-Rank Covariance Matrix Approximations
02/02/2016 · An analytic comparison of regularization methods for Gaussian Processes
09/26/2013 · Constrained Bayesian Inference for Low Rank Multitask Learning
02/11/2013 · The trace norm constrained matrix-variate Gaussian process for multitask bipartite ranking
