Large-scale Heteroscedastic Regression via Gaussian Process

11/03/2018
by Haitao Liu, et al.

Heteroscedastic regression, which models noise that varies across the input domain, has many applications in fields such as machine learning and statistics. Here we focus on heteroscedastic Gaussian process (HGP) regression, which integrates the latent function and the input-dependent noise in a unified non-parametric Bayesian framework. Despite its flexibility and strong performance, HGP suffers from cubic time complexity, which severely limits its applicability to big data. To improve the scalability of HGP, we first develop a variational sparse inference algorithm, named VSHGP, to handle large-scale datasets. Furthermore, to enhance the model's ability to capture quickly varying features, we follow the Bayesian committee machine (BCM) formalism to distribute the learning over multiple local VSHGP experts with many inducing points and to aggregate their predictive distributions. The proposed distributed VSHGP (DVSHGP) (i) enables large-scale HGP regression via distributed computations, and (ii) achieves high model capability via localized experts and many inducing points. The superiority of DVSHGP over existing large-scale heteroscedastic and homoscedastic GPs is then verified on a synthetic dataset and three real-world datasets.
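To make the model and the aggregation step concrete: an HGP typically assumes y(x) = f(x) + ε(x) with ε(x) ~ N(0, exp(g(x))), placing GP priors on both the latent function f and the log-noise function g. The sketch below is illustrative rather than the paper's exact rule: it combines Gaussian predictions from local experts in the robust-BCM style (cf. Deisenroth & Ng, 2015), assuming each expert returns a predictive mean and variance at a test point. The function name rbcm_aggregate and the differential-entropy weights are conventional choices; DVSHGP's actual aggregation may differ in detail.

```python
import numpy as np

def rbcm_aggregate(mu, var, prior_var, beta=None):
    """Aggregate Gaussian predictions from M local experts at one test point
    using a robust-BCM-style rule (hypothetical helper, not the paper's code).

    mu, var   : arrays of shape (M,), expert predictive means and variances
    prior_var : scalar prior variance of the latent function at the test point
    beta      : optional expert weights; defaults to the differential-entropy
                weights used by the robust BCM
    """
    mu = np.asarray(mu, dtype=float)
    var = np.asarray(var, dtype=float)
    if beta is None:
        # Weight each expert by its information gain over the prior
        beta = 0.5 * (np.log(prior_var) - np.log(var))
    # Aggregated precision: weighted expert precisions plus a prior correction
    # that keeps the combined prediction consistent when experts are uninformative
    prec = np.sum(beta / var) + (1.0 - np.sum(beta)) / prior_var
    agg_var = 1.0 / prec
    # Aggregated mean: precision-weighted combination of expert means
    agg_mu = agg_var * np.sum(beta * mu / var)
    return agg_mu, agg_var

# Example: three local experts predicting at a single test input
mu_experts = [0.9, 1.1, 1.0]
var_experts = [0.20, 0.10, 0.15]
print(rbcm_aggregate(mu_experts, var_experts, prior_var=1.0))
```

In this scheme, confident experts (small predictive variance relative to the prior) dominate the combination, while uninformative experts are smoothly discounted toward the prior, which is what allows the distributed experts to be trained independently and merged only at prediction time.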
