Quadrature-based features for kernel approximation

02/11/2018
by Marina Munkhoeva, et al.

We consider the problem of improving kernel approximation via randomized feature maps. These maps arise as Monte Carlo approximations to integral representations of kernel functions and scale up kernel methods to larger datasets. We propose to use a more efficient numerical integration technique to obtain better estimates of the integrals than state-of-the-art methods provide. Our approach allows the use of information about the integrand to enhance the approximation and facilitates fast computations. We analyze the convergence behavior and conduct an extensive empirical study that supports our hypothesis.
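To make the idea concrete, here is a minimal one-dimensional sketch of the contrast the abstract draws: a shift-invariant kernel is an expectation over random frequencies, which can be estimated either by Monte Carlo sampling (as in standard random Fourier features) or by a deterministic quadrature rule. This is an illustration only, not the paper's method; the paper uses more sophisticated spherical-radial quadrature rules in high dimensions, and the function names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)


def rbf_kernel(x, y):
    # Exact Gaussian kernel k(x, y) = exp(-(x - y)^2 / 2) for scalars.
    return np.exp(-0.5 * (x - y) ** 2)


def mc_kernel(x, y, n_samples=100):
    # Monte Carlo estimate of k(x, y) = E_{w ~ N(0, 1)}[cos(w * (x - y))],
    # the 1-D analogue of random Fourier features.
    w = rng.standard_normal(n_samples)
    return np.mean(np.cos(w * (x - y)))


def gh_kernel(x, y, n_nodes=10):
    # Gauss-Hermite quadrature estimate of the same expectation.
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # Change of variables w = sqrt(2) * t maps the Hermite weight
    # e^{-t^2} onto the standard normal density (up to 1 / sqrt(pi)).
    return np.sum(weights * np.cos(np.sqrt(2.0) * nodes * (x - y))) / np.sqrt(np.pi)


x, y = 0.3, 1.1
print("exact     :", rbf_kernel(x, y))
print("Monte Carlo:", mc_kernel(x, y))
print("quadrature :", gh_kernel(x, y))
```

With only 10 quadrature nodes the Gauss-Hermite estimate matches the exact kernel to near machine precision, while the 100-sample Monte Carlo estimate still carries O(1/sqrt(100)) noise, which is the kind of gain quadrature-based features aim to exploit.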

Related research

12/29/2014 · Quasi-Monte Carlo Feature Maps for Shift-Invariant Kernels
We consider the problem of improving the efficiency of randomized Fourie...

05/23/2017 · Data-driven Random Fourier Features using Stein Effect
Large-scale kernel approximation is an important problem in machine lear...

12/17/2013 · Compact Random Feature Maps
Kernel approximation using randomized feature maps has recently gained a...

03/12/2015 · Compact Nonlinear Maps and Circulant Extensions
Kernel approximation via nonlinear random feature maps is widely used in...

02/02/2019 · Numerical Integration Method for Training Neural Network
We propose a new numerical integration method for training a shallow neu...

09/23/2019 · Scalable Kernel Learning via the Discriminant Information
Kernel approximation methods have been popular techniques for scalable k...

02/11/2020 · Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features
Despite their success, kernel methods suffer from a massive computationa...
