Explicit Approximations of the Gaussian Kernel

09/21/2011
by Andrew Cotter, et al.

We investigate training and using Gaussian kernel SVMs by approximating the kernel with an explicit finite-dimensional polynomial feature representation based on the Taylor expansion of the exponential. Although not as efficient as the recently-proposed random Fourier features [Rahimi and Recht, 2007] in terms of the number of features, we show how this polynomial representation can provide a better approximation in terms of the computational cost involved. This makes our "Taylor features" especially attractive for use on very large data sets, in conjunction with online or stochastic training.
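For intuition, here is a minimal sketch (not the paper's implementation) of how a truncated Taylor feature map for the Gaussian kernel can be built. It uses the standard factorization k(x, y) = exp(-||x||^2 / (2 sigma^2)) exp(-||y||^2 / (2 sigma^2)) exp(<x, y> / sigma^2) and Taylor-expands the last factor up to a chosen degree, so that each power <x, y>^k becomes an inner product of flattened k-fold tensor powers. The function name, the degree cutoff, and the test values below are illustrative assumptions, not details taken from the paper.

    # Sketch: truncated "Taylor features" for the Gaussian kernel,
    # so that phi(x) . phi(y) ~ exp(-||x - y||^2 / (2 sigma^2)).
    import math
    import numpy as np

    def taylor_features(x, sigma=1.0, degree=6):
        """Explicit feature map based on the Taylor expansion of exp(<x, y> / sigma^2)."""
        blocks = [np.ones(1)]                  # degree-0 term: the constant 1
        t = np.ones(1)
        for k in range(1, degree + 1):
            t = np.outer(t, x).ravel()         # flattened k-fold tensor power of x
            blocks.append(t / math.sqrt(math.factorial(k) * sigma ** (2 * k)))
        # shared prefactor exp(-||x||^2 / (2 sigma^2)) from the kernel factorization
        return np.exp(-x @ x / (2 * sigma ** 2)) * np.concatenate(blocks)

    rng = np.random.default_rng(0)
    x, y = rng.normal(size=3), rng.normal(size=3)
    exact = np.exp(-np.sum((x - y) ** 2) / 2.0)           # Gaussian kernel, sigma = 1
    approx = taylor_features(x) @ taylor_features(y)
    print(exact, approx)  # agree up to the degree-6 truncation error

Note that the feature count grows geometrically in the degree (here 1 + d + d^2 + ... + d^r for d-dimensional inputs), which matches the abstract's point that these features are less compact than random Fourier features even when the overall computational cost compares favorably.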


Related research

Local Random Feature Approximations of the Gaussian Kernel (04/12/2022)
A fundamental drawback of kernel-based statistical models is their limit...

Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features (02/11/2020)
Despite their success, kernel methods suffer from a massive computationa...

Fourier Sparse Leverage Scores and Approximate Kernel Learning (06/12/2020)
We prove new explicit upper bounds on the leverage scores of Fourier spa...

Optimal Rates for Random Fourier Features (06/06/2015)
Kernel methods represent one of the most powerful tools in machine learn...

Exponential inequalities for dependent V-statistics via random Fourier features (01/05/2020)
We establish exponential inequalities for a class of V-statistics under ...

Fast Sketching of Polynomial Kernels of Polynomial Degree (08/21/2021)
Kernel methods are fundamental in machine learning, and faster algorithm...

Supervising Nyström Methods via Negative Margin Support Vector Selection (05/10/2018)
Pattern recognition on big data can be challenging for kernel machines a...