Improved Convergence Rates for Sparse Approximation Methods in Kernel-Based Learning

02/08/2022
by Sattar Vakili, et al.

Kernel-based models such as kernel ridge regression and Gaussian processes are ubiquitous in machine learning applications for regression and optimization. A well-known downside of kernel-based models is their high computational cost: given a dataset of n samples, the cost grows as 𝒪(n^3). Existing sparse approximation methods can reduce this cost substantially, in certain cases effectively bringing the practical cost down to as low as 𝒪(n). Despite this remarkable empirical success, significant gaps remain in the existing analytical confidence bounds on the error due to approximation. In this work, we provide novel confidence intervals for the Nyström method and the sparse variational Gaussian process approximation method. Our confidence intervals lead to improved error bounds in both regression and optimization. We establish these confidence intervals using novel interpretations of the approximate (surrogate) posterior variance of the models.

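To make the computational trade-off concrete, below is a minimal, self-contained NumPy sketch of a Nyström-style (DTC) sparse approximation to kernel regression with Gaussian noise. It is an illustration under standard textbook formulas, not the authors' implementation; the RBF kernel, the inducing-point set Z, the noise level, and all other parameter values are assumptions made for the example. With m inducing points, fitting costs 𝒪(nm^2 + m^3) rather than the 𝒪(n^3) of an exact solve, and the var function returns the approximate (surrogate) posterior variance of the kind the paper's confidence intervals concern.

import numpy as np

def rbf_kernel(X, Z, lengthscale=1.0):
    # Squared-exponential kernel k(x, z) = exp(-||x - z||^2 / (2 * lengthscale^2)).
    sqdist = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sqdist / lengthscale ** 2)

def nystrom_gp(X, y, Z, noise=0.1, lengthscale=1.0):
    # Nystrom / DTC-style sparse approximation (illustrative, not the paper's code).
    # X: (n, d) training inputs, y: (n,) targets, Z: (m, d) inducing inputs.
    # Cost: O(n m^2 + m^3) instead of the O(n^3) exact solve.
    Knm = rbf_kernel(X, Z, lengthscale)                 # (n, m) cross-kernel
    Kmm = rbf_kernel(Z, Z, lengthscale) + 1e-8 * np.eye(len(Z))  # (m, m), with jitter
    A = noise ** 2 * Kmm + Knm.T @ Knm                  # (m, m) system matrix
    alpha = np.linalg.solve(A, Knm.T @ y)               # weights for the approximate mean

    def mean(Xs):
        # Approximate posterior mean at test inputs Xs.
        return rbf_kernel(Xs, Z, lengthscale) @ alpha

    def var(Xs):
        # Approximate (surrogate) posterior variance at test inputs Xs:
        # k(x, x) - k_s^T Kmm^{-1} k_s + noise^2 * k_s^T A^{-1} k_s.
        Ksm = rbf_kernel(Xs, Z, lengthscale)            # (s, m)
        kss = np.ones(len(Xs))                          # k(x, x) = 1 for this RBF kernel
        qss = np.sum(Ksm * np.linalg.solve(Kmm, Ksm.T).T, axis=1)
        css = np.sum(Ksm * np.linalg.solve(A, Ksm.T).T, axis=1)
        return kss - qss + noise ** 2 * css

    return mean, var

# Toy usage: m = 20 inducing points summarise n = 2000 noisy samples of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
Z = np.linspace(-3.0, 3.0, 20)[:, None]
mean, var = nystrom_gp(X, y, Z)
Xs = np.linspace(-3.0, 3.0, 5)[:, None]
print(mean(Xs))
print(var(Xs))

In this toy setting, increasing the number of inducing points m trades computation for a tighter surrogate posterior; bounds of the kind developed in the paper quantify how far such an approximate posterior variance can deviate from the exact one.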