Gauss-Legendre Features for Gaussian Process Regression

01/04/2021
by Paz Fink Shustin, et al.

Gaussian processes provide a powerful probabilistic kernel learning framework, which allows learning high-quality nonparametric regression models via methods such as Gaussian process regression. Nevertheless, the learning phase of Gaussian process regression requires massive computations which are not feasible for large datasets. In this paper, we present a Gauss-Legendre quadrature-based approach for scaling up Gaussian process regression via a low-rank approximation of the kernel matrix. We utilize the structure of the low-rank approximation to achieve effective hyperparameter learning, training and prediction. Our method is inspired by the well-known random Fourier features approach, which also builds low-rank approximations via numerical integration. However, our method is capable of generating a high-quality approximation to the kernel using a number of features that is poly-logarithmic in the number of training points, whereas similar guarantees require a number of features that is at least linear in the number of training points when using random Fourier features. Furthermore, the structure of the low-rank approximation that our method builds is subtly different from the one generated by random Fourier features, and this enables much more efficient hyperparameter learning. The utility of our method for learning with low-dimensional datasets is demonstrated using numerical experiments.
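To illustrate the idea behind quadrature-based features: by Bochner's theorem, a shift-invariant kernel is the Fourier transform of its spectral density, so the kernel can be written as an integral that deterministic quadrature can approximate. The sketch below (not the paper's exact construction; the lengthscale `ell`, the truncation point `a`, and the feature map are illustrative assumptions) builds Gauss-Legendre features for a one-dimensional Gaussian kernel and checks that their Gram matrix matches the exact kernel:

```python
import numpy as np

def gauss_legendre_features(x, m=40, ell=1.0):
    """Map 1-D inputs to 2m features whose inner product approximates
    the Gaussian kernel exp(-(x - y)^2 / (2 ell^2)).

    Illustrative sketch only; the paper's construction may differ.
    """
    # Spectral density of the Gaussian kernel (Bochner's theorem):
    # k(x - y) = integral of p(w) * cos(w (x - y)) dw over the real line.
    p = lambda w: (ell / np.sqrt(2 * np.pi)) * np.exp(-(ell * w) ** 2 / 2)
    # Gauss-Legendre nodes/weights on [-1, 1], mapped to [-a, a];
    # the density is negligible beyond a = 6 / ell (an assumed cutoff).
    a = 6.0 / ell
    nodes, weights = np.polynomial.legendre.leggauss(m)
    w_nodes = a * nodes
    w_weights = a * weights
    # Fold quadrature weight and density into the feature scaling so that
    # phi(x) . phi(y) = sum_j w_j p(w_j) cos(w_j (x - y)) ~ k(x, y).
    scale = np.sqrt(w_weights * p(w_nodes))          # shape (m,)
    x = np.asarray(x, dtype=float)[:, None]          # shape (n, 1)
    return np.hstack([scale * np.cos(x * w_nodes),
                      scale * np.sin(x * w_nodes)])  # shape (n, 2m)

# The feature Gram matrix approximates the exact kernel matrix:
x = np.linspace(-2.0, 2.0, 50)
Z = gauss_legendre_features(x, m=40, ell=1.0)
K_approx = Z @ Z.T
K_exact = np.exp(-(x[:, None] - x[None, :]) ** 2 / 2)
print(np.max(np.abs(K_approx - K_exact)))  # approximation error
```

Because the nodes and weights are deterministic and the integrand is smooth, the quadrature error decays rapidly in the number of features, which is the intuition behind the poly-logarithmic feature counts claimed above; random Fourier features instead sample the frequencies from p(w), so their error decays only at the Monte Carlo rate.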
