Improved Convergence Rates for Sparse Approximation Methods in Kernel-Based Learning

02/08/2022
by Sattar Vakili, et al.

Kernel-based models such as kernel ridge regression and Gaussian processes are ubiquitous in machine learning applications for regression and optimization. A serious downside of kernel-based models, however, is their high computational cost: given a dataset of n samples, the cost grows as 𝒪(n^3). Existing sparse approximation methods can reduce this cost substantially, bringing the real-world cost down to as low as 𝒪(n) in certain cases. Despite this remarkable empirical success, significant gaps remain in the existing analytical confidence bounds on the error due to approximation. In this work, we provide novel confidence intervals for the Nyström method and the sparse variational Gaussian process approximation method. Our confidence intervals lead to improved error bounds in both regression and optimization. We establish these confidence intervals using novel interpretations of the approximate (surrogate) posterior variance of the models.
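To illustrate the kind of cost reduction the abstract refers to, below is a minimal NumPy sketch of Nyström-approximate kernel ridge regression. It is not the authors' implementation; the kernel choice, the helper names (rbf_kernel, nystrom_krr_fit, nystrom_krr_predict), the inducing-point count m = 50, and the toy data are all illustrative assumptions. The point is only that the linear system being solved is m x m, so the dominant cost is roughly 𝒪(n m^2) rather than the 𝒪(n^3) of exact kernel ridge regression.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * sq / lengthscale**2)

def nystrom_krr_fit(X, y, Z, lam=1e-2, jitter=1e-8):
    """Nystrom (subset-of-regressors) kernel ridge regression.

    Solves beta = (K_mn K_nm + lam * K_mm)^{-1} K_mn y, an m x m system,
    so the dominant cost is O(n m^2) instead of the O(n^3) of exact KRR.
    """
    K_nm = rbf_kernel(X, Z)                              # n x m cross-kernel
    K_mm = rbf_kernel(Z, Z) + jitter * np.eye(len(Z))    # m x m inducing kernel
    A = K_nm.T @ K_nm + lam * K_mm
    return np.linalg.solve(A, K_nm.T @ y)

def nystrom_krr_predict(X_new, Z, beta):
    """Predict at new inputs using the fitted inducing-point weights."""
    return rbf_kernel(X_new, Z) @ beta

# Toy usage: n = 2000 samples, m = 50 inducing points chosen as a random subset.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
Z = X[rng.choice(len(X), size=50, replace=False)]
beta = nystrom_krr_fit(X, y, Z)
print(nystrom_krr_predict(np.array([[0.0], [1.5]]), Z, beta))
```

The choice of inducing points here (a uniform random subset) is the simplest option; the approximation guarantees studied in the paper concern how the resulting surrogate posterior variance controls the error introduced by such approximations.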
