Fully Scalable Gaussian Processes using Subspace Inducing Inputs

07/06/2018
by Aristeidis Panos et al.

We introduce fully scalable Gaussian processes, an implementation scheme that tackles the problem of jointly handling a large number of training instances and high-dimensional input data. Our key idea is a representation trick over the inducing variables, called subspace inducing inputs. This is combined with matrix-preconditioning-based parametrizations of the variational distributions that lead to simplified and numerically stable variational lower bounds. Our illustrative applications are challenging extreme multi-label classification problems, which carry the extra burden of a very large number of class labels. We demonstrate the usefulness of our approach by reporting predictive performance and low computational times on datasets with extremely large numbers of instances and input dimensions.
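The abstract does not spell out the exact parametrization, but the general idea of constraining inducing inputs to a subspace of the data can be sketched as follows. In a standard sparse GP, the M inducing inputs Z live in the full D-dimensional input space (M·D free parameters). The sketch below, a hypothetical minimal illustration, instead fixes K training rows as a basis and learns only an M×K mixing matrix W, so Z = W·X_basis stays in the span of the data and the parameter count drops from M·D to M·K. All names and sizes here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: many instances, high-dimensional inputs.
N, D = 500, 10_000   # training instances, input dimensionality
M, K = 20, 50        # M inducing points, K anchor rows spanning the subspace

X = rng.standard_normal((N, D))       # training inputs

# Subspace inducing inputs (as sketched here, not the paper's exact scheme):
# fix K rows of X as a basis and learn only mixing weights W in R^{M x K},
# so Z = W @ X_basis lies in the span of the data while the free-parameter
# count drops from M*D to M*K.
idx = rng.choice(N, size=K, replace=False)
X_basis = X[idx]                      # (K, D) fixed anchor inputs
W = rng.standard_normal((M, K)) * 0.1 # (M, K) learned variational parameters

Z = W @ X_basis                       # (M, D) subspace inducing inputs

def rbf(A, B, lengthscale=100.0):
    """Squared-exponential kernel between the rows of A and B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * sq / lengthscale**2)

# The quantities a sparse variational bound needs are still cheap to form:
Kzz = rbf(Z, Z)   # (M, M) inducing-point covariance
Kzx = rbf(Z, X)   # (M, N) cross-covariance with the data

print("free parameters:", W.size, "vs direct Z:", M * D)
```

Gradient-based training would then optimize W (together with kernel hyperparameters and the variational distribution) instead of Z itself, which is what makes the scheme tractable when D is very large.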


