Random Gegenbauer Features for Scalable Kernel Methods

02/07/2022
by Insu Han, et al.

We propose efficient random features for approximating a new and rich class of kernel functions that we refer to as Generalized Zonal Kernels (GZK). The GZK family generalizes zonal kernels (i.e., dot-product kernels on the unit sphere) by introducing radial factors in their Gegenbauer series expansion, and it includes a wide range of ubiquitous kernel functions, such as the entirety of dot-product kernels as well as the Gaussian kernel and the recently introduced Neural Tangent Kernels. Interestingly, by exploiting the reproducing property of Gegenbauer polynomials, we can construct efficient random features for the GZK family based on randomly oriented Gegenbauer kernels. We prove subspace embedding guarantees for our Gegenbauer features, which ensure that they can be used to approximately solve learning problems such as kernel k-means clustering and kernel ridge regression. Empirical results show that our proposed features outperform recent kernel approximation methods.
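The abstract does not spell out the construction, but the general random-features recipe it builds on can be illustrated with the classic random Fourier features of Rahimi and Recht for the Gaussian kernel: sample random directions from the kernel's spectral density and map each input to a low-dimensional feature vector whose inner products approximate the kernel. The sketch below is a generic illustration of that idea, not the paper's Gegenbauer construction; all function and variable names are our own.

```python
import numpy as np

def random_fourier_features(X, D=2000, gamma=1.0, seed=0):
    """Map X of shape (n, d) to features Z of shape (n, D) such that
    Z @ Z.T approximates the Gaussian kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frequencies drawn from the Gaussian kernel's spectral density.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
    # Random phases make a single cosine per frequency an unbiased estimator.
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the feature-based approximation against the exact Gaussian kernel.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))
Z = random_fourier_features(X, D=5000, gamma=0.5, seed=2)
K_approx = Z @ Z.T
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq)
err = np.abs(K_approx - K_exact).max()
print(err)
```

Because `Z` has only `D` columns, downstream solvers (ridge regression, k-means) run on an explicit `n x D` feature matrix instead of the full `n x n` kernel matrix, which is the scalability win that both this sketch and the paper's Gegenbauer features target.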


Related research

- 02/13/2019 | On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes
  We study the expressive power of kernel methods and the algorithmic feas...
- 05/06/2020 | Strictly positive definite kernels on the 2-sphere: beyond radial symmetry
  The paper introduces a new characterisation of strictly positive definit...
- 09/02/2020 | Generalized vec trick for fast learning of pairwise kernel models
  Pairwise learning corresponds to the supervised learning setting where t...
- 02/18/2014 | The Random Forest Kernel and other kernels for big data from random partitions
  We present Random Partition Kernels, a new class of kernels derived by d...
- 03/21/2020 | Scaling up Kernel Ridge Regression via Locality Sensitive Hashing
  Random binning features, introduced in the seminal paper of Rahimi and R...
- 10/11/2020 | A kernel-independent sum-of-Gaussians method by de la Vallée-Poussin sums
  Approximation of interacting kernels by sum of Gaussians (SOG) is freque...
- 01/21/2022 | Improved Random Features for Dot Product Kernels
  Dot product kernels, such as polynomial and exponential (softmax) kernel...
