Low-Precision Random Fourier Features for Memory-Constrained Kernel Approximation

10/31/2018, by Jian Zhang, et al.

We investigate how to train kernel approximation methods that generalize well under a memory budget. Building on recent theoretical work, we define a measure of kernel approximation error which we find to be much more predictive of the empirical generalization performance of kernel approximation methods than conventional metrics. An important consequence of this definition is that a kernel approximation matrix must be high-rank to attain close approximation. Because storing a high-rank approximation is memory-intensive, we propose using a low-precision quantization of random Fourier features (LP-RFFs) to build a high-rank approximation under a memory budget. Theoretically, we show quantization has a negligible effect on generalization performance in important settings. Empirically, we demonstrate across four benchmark datasets that LP-RFFs can match the performance of full-precision RFFs and the Nyström method, with 3x-10x and 50x-460x less memory, respectively.
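The core idea in the abstract can be sketched in a few lines: build standard random Fourier features for the Gaussian kernel, then store them at low precision using stochastic rounding so a high-rank approximation fits in a small memory budget. The sketch below is a simplified illustration under assumptions, not the authors' released implementation: the function names, the per-matrix min/max quantization range, and all parameter values are hypothetical choices for this example.

```python
import numpy as np

def rff_features(X, D, sigma, rng):
    # Random Fourier features approximating the Gaussian kernel
    # k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)) via
    # z(x) = sqrt(2/D) * cos(W x + b), W ~ N(0, 1/sigma^2), b ~ U[0, 2*pi).
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def quantize_stochastic(Z, bits, rng):
    # Map each feature to one of 2^bits evenly spaced levels spanning the
    # observed feature range, with stochastic rounding so the quantized
    # value is unbiased in expectation (assumed quantizer, for illustration).
    lo, hi = Z.min(), Z.max()
    levels = 2 ** bits - 1
    scaled = (Z - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    q = floor + (rng.random(Z.shape) < (scaled - floor))
    return q / levels * (hi - lo) + lo
```

With enough features D, the Gram matrix of the quantized features still tracks the exact kernel closely, while each stored feature needs only `bits` bits instead of 32 or 64.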


Related research:

- Sigma-Delta and Distributed Noise-Shaping Quantization Methods for Random Fourier Features (06/04/2021): We propose the use of low bit-depth Sigma-Delta and distributed noise-sh...
- Gauss-Legendre Features for Gaussian Process Regression (01/04/2021): Gaussian processes provide a powerful probabilistic kernel learning fram...
- Data-driven Random Fourier Features using Stein Effect (05/23/2017): Large-scale kernel approximation is an important problem in machine lear...
- Learning Random Fourier Features by Hybrid Constrained Optimization (12/07/2017): The kernel embedding algorithm is an important component for adapting ke...
- Orthogonal Random Features (10/28/2016): We present an intriguing discovery related to Random Fourier Features: i...
- Large-scale Kernel-based Feature Extraction via Budgeted Nonlinear Subspace Tracking (01/28/2016): Kernel-based methods enjoy powerful generalization capabilities in handl...
- Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features (02/11/2020): Despite their success, kernel methods suffer from a massive computationa...
