Data-driven Random Fourier Features using Stein Effect

05/23/2017
by   Wei-Cheng Chang, et al.

Large-scale kernel approximation is an important problem in machine learning research. Approaches using random Fourier features have become increasingly popular [Rahimi and Recht, 2007], where kernel approximation is treated as empirical mean estimation via Monte Carlo (MC) or Quasi-Monte Carlo (QMC) integration [Yang et al., 2014]. A limitation of current approaches is that all features receive equal weights that sum to 1. In this paper, we propose a novel shrinkage estimator based on the Stein effect, which provides a data-driven weighting strategy for random features and enjoys theoretical justification in terms of lowering the empirical risk. We further present an efficient randomized algorithm for large-scale applications of the proposed method. Our empirical results on six benchmark data sets demonstrate the advantageous performance of this approach over representative baselines in both kernel approximation and supervised learning tasks.
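To make the equal-weight baseline concrete, here is a minimal sketch of standard random Fourier features for the RBF (Gaussian) kernel, the setting the abstract contrasts against. This is the classic Rahimi–Recht construction, not the paper's Stein-effect estimator; the function name and parameters (`n_features`, `sigma`) are illustrative. Note that every feature carries the same weight `sqrt(2 / n_features)`, so the inner product of feature maps is an equally weighted Monte Carlo average.

```python
import numpy as np

def rff_features(X, n_features=512, sigma=1.0, rng=None):
    """Map X of shape (n_samples, d) to random Fourier features z(x) so that
    z(x) @ z(y) approximates the RBF kernel exp(-||x - y||^2 / (2 * sigma^2))."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Sample frequencies from the kernel's spectral density: w ~ N(0, sigma^-2 I),
    # and phases b ~ Uniform[0, 2*pi).
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    # Equal weighting: every feature is scaled by the same sqrt(2 / n_features),
    # so z(x) @ z(y) is an unweighted Monte Carlo mean over the features.
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the Monte Carlo approximation against the exact RBF kernel matrix.
X = np.random.default_rng(0).normal(size=(5, 3))
Z = rff_features(X, n_features=20000, sigma=1.0, rng=1)
K_approx = Z @ Z.T
K_exact = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / 2.0)
print(np.abs(K_approx - K_exact).max())  # small Monte Carlo error
```

The paper's proposal replaces this uniform `sqrt(2 / n_features)` scaling with data-driven shrinkage weights; the sketch above only illustrates the baseline being improved upon.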


