Generalized Multi-Output Gaussian Process Censored Regression

by Daniele Gammelli, et al.

When modelling censored observations, a typical approach in current regression methods is to use a censored-Gaussian (i.e. Tobit) model to describe the conditional output distribution. In this paper, as in the case of missing data, we argue that exploiting correlations between multiple outputs can enable models to better address the bias introduced by censored data. To do so, we introduce a heteroscedastic multi-output Gaussian process model which combines the non-parametric flexibility of GPs with the ability to leverage information from correlated outputs under input-dependent noise conditions. To address the resulting inference intractability, we further devise a variational bound to the marginal log-likelihood that is suitable for stochastic optimization. We empirically evaluate our model against other generative models for censored data on both synthetic and real-world tasks, and further show how it can be generalized to deal with arbitrary likelihood functions. Results show how the added flexibility allows our model to better estimate the underlying non-censored (i.e. true) process under potentially complex censoring dynamics.
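To make the censored-Gaussian (Tobit) likelihood mentioned above concrete, the sketch below shows how an upper-censored Tobit log-likelihood is typically computed: non-censored points contribute the usual Gaussian density, while censored points contribute the probability mass that the latent (true) value exceeds the threshold. This is a minimal illustration of the standard Tobit model, not the paper's multi-output GP implementation; the function name and array-based interface are assumptions for the example.

```python
import numpy as np
from scipy.stats import norm


def tobit_log_likelihood(y, mu, sigma, upper):
    """Log-likelihood of a Tobit (censored-Gaussian) model with an
    upper censoring threshold `upper`.

    y     : observed values (censored at `upper`)
    mu    : latent mean of the underlying Gaussian process
    sigma : latent (possibly input-dependent) noise scale
    """
    y, mu, sigma = map(np.asarray, (y, mu, sigma))
    censored = y >= upper
    ll = np.where(
        censored,
        norm.logsf(upper, loc=mu, scale=sigma),  # log P(y* >= upper)
        norm.logpdf(y, loc=mu, scale=sigma),     # Gaussian density term
    )
    return ll.sum()
```

Because the censored term only retains the information "the true value is at least `upper`", a model fit with this likelihood is pulled less strongly toward the censoring threshold than a plain Gaussian regression would be, which is the bias the paper aims to correct further by sharing information across correlated outputs.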


Collaborative Nonstationary Multivariate Gaussian Process Model

Currently, multi-output Gaussian process regression models either do not...

Enhanced Particle Swarm Optimization Algorithms for Multiple-Input Multiple-Output System Modelling using Convolved Gaussian Process Models

Convolved Gaussian Process (CGP) is able to capture the correlations not...

Gaussian Process Regression Networks

We introduce a new regression framework, Gaussian process regression net...

Large Linear Multi-output Gaussian Process Learning

Gaussian processes (GPs), or distributions over arbitrary functions in a...

Recurrent Flow Networks: A Recurrent Latent Variable Model for Spatio-Temporal Density Modelling

When modelling real-valued sequences, a typical approach in current RNN ...

Learning Multi-Task Gaussian Process Over Heterogeneous Input Domains

Multi-task Gaussian process (MTGP) is a well-known non-parametric Bayesi...

Practical Conditional Neural Processes Via Tractable Dependent Predictions

Conditional Neural Processes (CNPs; Garnelo et al., 2018a) are meta-lear...