Multiresolution Kernel Approximation for Gaussian Process Regression

08/07/2017
by Yi Ding, et al.

Gaussian process regression generally does not scale beyond a few thousand data points without some form of kernel approximation. Most approximations focus on the high-eigenvalue part of the spectrum of the kernel matrix K, which leads to poor performance when the length scale of the kernel is small. In this paper we introduce Multiresolution Kernel Approximation (MKA), the first true broad-bandwidth kernel approximation algorithm. MKA is memory efficient, and because it is a direct method it also makes it easy to approximate K^-1 and det(K).
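The paper's MKA algorithm is not reproduced here, but as a rough illustration of why a cheap, direct handle on K^-1 matters for GP regression, the sketch below compares exact GP prediction against a simple rank-m Nyström approximation of K combined with the Woodbury identity. This is not MKA; all function names and parameters (rbf_kernel, nystrom_predict, m, noise, length_scale) are illustrative assumptions.

```python
# Minimal sketch (not the paper's MKA algorithm): GP regression with an
# exact kernel solve versus a Nystrom low-rank approximation of K.
import numpy as np

def rbf_kernel(A, B, length_scale=0.2):
    """Squared-exponential kernel; smaller length scales make K harder to approximate."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

def exact_predict(X, y, Xs, noise=1e-2, length_scale=0.2):
    """Exact GP predictive mean: k(Xs, X) (K + noise*I)^{-1} y, O(n^3)."""
    K = rbf_kernel(X, X, length_scale) + noise * np.eye(len(X))
    return rbf_kernel(Xs, X, length_scale) @ np.linalg.solve(K, y)

def nystrom_predict(X, y, Xs, m=50, noise=1e-2, length_scale=0.2, seed=0):
    """Predictive mean with a rank-m Nystrom approximation K ~ C W^{-1} C^T.
    The Woodbury identity turns (K_approx + noise*I)^{-1} y into an
    m x m solve, i.e. roughly O(n m^2) instead of O(n^3)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), m, replace=False)
    C = rbf_kernel(X, X[idx], length_scale)       # n x m cross-kernel
    W = rbf_kernel(X[idx], X[idx], length_scale)  # m x m landmark kernel
    # Woodbury: (s I + C W^{-1} C^T)^{-1} y = (y - C (s W + C^T C)^{-1} C^T y) / s
    inner = noise * W + C.T @ C
    alpha = (y - C @ np.linalg.solve(inner, C.T @ y)) / noise
    return rbf_kernel(Xs, X, length_scale) @ alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, (500, 1))
    y = np.sin(8 * X[:, 0]) + 0.05 * rng.standard_normal(500)
    Xs = np.linspace(0, 1, 200)[:, None]
    gap = np.max(np.abs(exact_predict(X, y, Xs) - nystrom_predict(X, y, Xs)))
    print("max |exact - Nystrom| prediction gap:", gap)
```

A low-rank scheme like this captures only the high-eigenvalue part of the spectrum, which is exactly the regime the abstract points out breaks down for small length scales; MKA is proposed as a broad-bandwidth alternative.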


Related research

01/04/2021 - Gauss-Legendre Features for Gaussian Process Regression
  Gaussian processes provide a powerful probabilistic kernel learning fram...

03/27/2018 - Distributed Adaptive Sampling for Kernel Matrix Approximation
  Most kernel-based methods, such as kernel or Gaussian process regression...

08/26/2016 - A Randomized Approach to Efficient Kernel Clustering
  Kernel-based K-means clustering has gained popularity due to its simplic...

07/21/2021 - Online structural kernel selection for mobile health
  Motivated by the need for efficient and personalized learning in mobile ...

04/12/2022 - Local Random Feature Approximations of the Gaussian Kernel
  A fundamental drawback of kernel-based statistical models is their limit...

03/11/2018 - Improved Asymptotics for Zeros of Kernel Estimates via a Reformulation of the Leadbetter-Cryer Integral
  The expected number of false inflection points of kernel smoothers is ev...

01/28/2019 - On Random Subsampling of Gaussian Process Regression: A Graphon-Based Analysis
  In this paper, we study random subsampling of Gaussian process regressio...
