A Unified Analysis of Random Fourier Features

06/24/2018
by Zhu Li, et al.

We provide the first unified theoretical analysis of supervised learning with random Fourier features, covering the different types of loss functions characteristic of kernel methods developed for this setting. More specifically, we investigate learning with the squared error and with Lipschitz continuous loss functions, and give the sharpest expected risk convergence rates for problems in which random Fourier features are sampled using either the spectral measure of a shift-invariant kernel or the ridge leverage score function proposed by Avron et al. (2017). The trade-off between the number of features and the expected risk convergence rate is expressed in terms of the regularization parameter and the effective dimension of the problem. While the former captures the complexity of the target hypothesis, the latter is known to express the fine structure of the kernel with respect to the marginal distribution of the data-generating process (Caponnetto and De Vito, 2007). In addition to our theoretical results, we propose an approximate leverage score sampler for large-scale problems and show that it can be significantly more effective than the spectral measure sampler.
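To make the two sampling schemes concrete, below is a minimal NumPy sketch (not the authors' implementation) of random Fourier features for the Gaussian kernel, with frequencies drawn from the kernel's spectral measure, together with empirical ridge leverage scores for the sampled features and a simple resample-and-reweight step in the spirit of an approximate leverage score sampler. The function names, the lengthscale and regularization values, and the oversample-then-resample scheme are illustrative assumptions, not the paper's algorithm.

```python
# Minimal sketch, assuming a Gaussian (RBF) kernel; illustrative only.
import numpy as np


def rff_features(X, num_features, lengthscale=1.0, seed=None):
    """Random Fourier features approximating the Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 * lengthscale**2)).

    Frequencies are drawn from the kernel's spectral measure, which for
    the Gaussian kernel is N(0, lengthscale^{-2} I).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)


def feature_ridge_leverage_scores(Z, reg):
    """Empirical ridge leverage score of each feature (column of Z).

    The score of column j is the j-th diagonal entry of
    Z^T (Z Z^T + n*reg*I)^{-1} Z, computed here via the push-through
    identity as diag((Z^T Z + n*reg*I)^{-1} Z^T Z).
    """
    n, m = Z.shape
    G = Z.T @ Z + n * reg * np.eye(m)
    return np.sum(np.linalg.solve(G, Z.T) * Z.T, axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))

    # Spectral-measure sampler: plain random Fourier features.
    m_pool = 2000
    Z = rff_features(X, num_features=m_pool, lengthscale=1.0, seed=1)

    # Sanity check against the exact Gaussian kernel.
    sq = np.sum(X ** 2, axis=1)
    K_exact = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / 2.0)
    print("error with", m_pool, "spectral features:",
          np.abs(Z @ Z.T - K_exact).max())

    # Illustrative approximate leverage score sampler: oversample a pool of
    # frequencies, score them, then resample a smaller set with probability
    # proportional to the scores and reweight so the estimate stays unbiased.
    m_keep = 200
    scores = feature_ridge_leverage_scores(Z, reg=1e-3)
    probs = scores / scores.sum()
    keep = rng.choice(m_pool, size=m_keep, replace=True, p=probs)
    Z_small = Z[:, keep] / np.sqrt(m_keep * probs[keep])
    print("error with", m_keep, "leverage-resampled features:",
          np.abs(Z_small @ Z_small.T - K_exact).max())
```

On problems where the leverage scores are far from uniform, resampling features in proportion to them tends to do better than drawing the same number of features directly from the spectral measure; this is the kind of gain the abstract refers to when it says the approximate leverage score sampler can be significantly more effective than the spectral measure sampler.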


Related research

11/13/2019 | Exponential Convergence Rates of Classification Errors on Learning with SGD and Random Features
Although kernel methods are widely used in many learning problems, they ...

10/27/2017 | The Error Probability of Random Fourier Features is Dimensionality Independent
We show that the error probability of reconstructing kernel matrices fro...

09/22/2021 | Sharp Analysis of Random Fourier Features in Classification
We study the theoretical properties of random Fourier features classific...

03/09/2020 | Theoretical Analysis of Divide-and-Conquer ERM: Beyond Square Loss and RKHS
Theoretical analysis of the divide-and-conquer based distributed learnin...

09/03/2021 | Large-Scale Learning with Fourier Features and Tensor Decompositions
Random Fourier features provide a way to tackle large-scale machine lear...

05/23/2023 | On the Size and Approximation Error of Distilled Sets
Dataset Distillation is the task of synthesizing small datasets from lar...

06/06/2015 | Optimal Rates for Random Fourier Features
Kernel methods represent one of the most powerful tools in machine learn...
