Data-dependent compression of random features for large-scale kernel approximation

10/09/2018
by Raj Agrawal, et al.

Kernel methods offer the flexibility to learn complex relationships in modern, large data sets while enjoying strong theoretical guarantees on quality. Unfortunately, these methods typically require running time cubic in the data set size, a prohibitive cost in the large-data setting. Random feature maps (RFMs) and the Nyström method both consider low-rank approximations to the kernel matrix as a potential solution. But to achieve desirable theoretical guarantees, the former may require a prohibitively large number of features J+, and the latter may be prohibitively expensive for high-dimensional problems. We propose to combine the simplicity and generality of RFMs with a data-dependent feature selection scheme to achieve the desirable theoretical approximation properties of Nyström with just O(log J+) features. Our key insight is to begin with a large set of random features, then reduce them to a small number of weighted features in a data-dependent, computationally efficient way, while preserving the statistical guarantees of using the original large set of features. We demonstrate the efficacy of our method with theory and experiments, including on a data set with over 50 million observations. In particular, we show that our method achieves small kernel matrix approximation error and better test set accuracy with provably fewer random features than state-of-the-art methods.
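The abstract sketches a two-stage pipeline: draw a large pool of J+ random features, then compress them to a small weighted subset in a data-dependent way. Below is a minimal sketch of that idea, using random Fourier features for an RBF kernel. Since the abstract does not spell out the compression algorithm, the sketch substitutes column-pivoted QR for the feature selection and nonnegative least squares for the reweighting; these stand-ins, the function names (random_fourier_features, compress_features), and all parameter values are illustrative assumptions, not the paper's method.

```python
import numpy as np
from scipy.linalg import qr
from scipy.optimize import nnls

def random_fourier_features(X, n_features, gamma, rng):
    """Random Fourier feature map Z with E[Z Z^T] ~= K for the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2) (Rahimi & Recht, 2007)."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def compress_features(Z, n_keep):
    """Keep a small weighted subset of the columns of Z so that the induced
    kernel approximation sum_j w_j z_j z_j^T stays close to the full Z Z^T.
    NOTE: QR pivoting + NNLS is a stand-in for the paper's (unspecified
    here) data-dependent compression scheme, used only for illustration."""
    _, _, piv = qr(Z, mode='economic', pivoting=True)
    keep = piv[:n_keep]
    # Target: the full low-rank kernel approximation, flattened.
    target = (Z @ Z.T).ravel()
    # Design matrix: one flattened rank-one outer product per kept feature.
    A = np.stack([np.outer(Z[:, j], Z[:, j]).ravel() for j in keep], axis=1)
    w, _ = nnls(A, target)
    return keep, w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Z = random_fourier_features(X, n_features=500, gamma=0.5, rng=rng)
keep, w = compress_features(Z, n_keep=25)

K_full = Z @ Z.T
K_comp = (Z[:, keep] * w) @ Z[:, keep].T
print(np.linalg.norm(K_full - K_comp) / np.linalg.norm(K_full))
```

The reweighting step is what distinguishes compression from naive subsampling: the surviving features are rescaled so that their weighted sum of rank-one terms still tracks the kernel approximation built from the full pool of J+ features.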


Related research

09/23/2016
Efficient Feature Selection With Large and High-dimensional Data
Driven by the advances in technology, large and high-dimensional data ha...

11/20/2019
Random Fourier Features via Fast Surrogate Leverage Weighted Sampling
In this paper, we propose a fast surrogate leverage weighted sampling st...

05/17/2019
LR-GLM: High-Dimensional Bayesian Inference Using Low-Rank Data Approximations
Due to the ease of modern data collection, applied statisticians often h...

08/11/2020
Random Projections and Dimension Reduction
This paper, broadly speaking, covers the use of randomness in two main a...

01/23/2023
Sampling-based Nyström Approximation and Kernel Quadrature
We analyze the Nyström approximation of a positive definite kernel assoc...

09/14/2018
Revisiting Random Binning Features: Fast Convergence and Strong Parallelizability
Kernel method has been developed as one of the standard approaches for n...

03/12/2016
Laplacian Eigenmaps from Sparse, Noisy Similarity Measurements
Manifold learning and dimensionality reduction techniques are ubiquitous...
