Oblivious Sketching of High-Degree Polynomial Kernels

09/03/2019
by Michael Kapralov et al.

Kernel methods are fundamental tools in machine learning that allow detection of non-linear dependencies in data without explicitly constructing feature vectors in high-dimensional spaces. A major disadvantage of kernel methods is their poor scalability: primitives such as kernel PCA or kernel ridge regression generally take prohibitively large quadratic space and (at least) quadratic time, as kernel matrices are usually dense. Some methods for speeding up kernel linear algebra are known, but they invariably take time exponential either in the dimension of the input point set (e.g., fast multipole methods suffer from the curse of dimensionality) or in the degree of the kernel function. Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters. Our main contribution is a general method for applying sketching solutions developed in numerical linear algebra over the past decade to a tensoring of data points without forming the tensoring explicitly. This yields the first oblivious sketch for the polynomial kernel whose target dimension depends only polynomially on the degree of the kernel function, as well as the first oblivious sketch for the Gaussian kernel on bounded datasets that does not suffer from an exponential dependence on the dimensionality of the input data points.
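To make the "sketch the tensoring without forming it" idea concrete, here is a minimal NumPy sketch of the earlier TensorSketch of Pham and Pagh, a CountSketch of the degree-p tensor power x ⊗ ... ⊗ x computed via FFTs, never materializing the d^p-dimensional vector. This is an illustration of the general approach, not the paper's construction (whose guarantees avoid the exponential-in-p blow-up that TensorSketch's analysis incurs); all function names and parameters below are illustrative.

```python
import numpy as np

def make_tensorsketch(d, p, m, rng):
    """Return a map sketching the degree-p tensor power of x in R^d into R^m.

    Classic TensorSketch (Pham & Pagh, 2013), shown for intuition only.
    The hash/sign tables are drawn once and reused for every data point,
    so the map is oblivious (data-independent)."""
    hashes = rng.integers(0, m, size=(p, d))      # one CountSketch per tensor factor
    signs = rng.choice([-1.0, 1.0], size=(p, d))

    def sketch(x):
        prod = np.ones(m, dtype=complex)
        for h, s in zip(hashes, signs):
            cs = np.zeros(m)
            np.add.at(cs, h, s * x)               # CountSketch one copy of x
            prod *= np.fft.fft(cs)                # multiplying FFTs convolves the
        return np.real(np.fft.ifft(prod))         # sketches, sketching x ⊗ ... ⊗ x

    return sketch

# <sketch(x), sketch(y)> is an unbiased estimate of the polynomial kernel <x, y>^p.
rng = np.random.default_rng(0)
sk = make_tensorsketch(d=100, p=3, m=4096, rng=rng)
x, y = rng.standard_normal(100), rng.standard_normal(100)
print(np.dot(sk(x), sk(y)), np.dot(x, y) ** 3)
```

The FFT step is the key trick: a CountSketch of a tensor product equals the circular convolution of the factors' CountSketches, so p sketches of length m combine in O(pm log m) time rather than touching all d^p coordinates.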


Related research:

- Near Input Sparsity Time Kernel Embeddings via Adaptive Sampling (07/08/2020)
- Fast Sketching of Polynomial Kernels of Polynomial Degree (08/21/2021)
- Spectral Norm of Random Kernel Matrices with Applications to Privacy (04/22/2015)
- A Study of Clustering Techniques and Hierarchical Matrix Formats for Kernel Ridge Regression (03/27/2018)
- Construction of 'Support Vector' Machine Feature Spaces via Deformed Weyl-Heisenberg Algebra (06/02/2020)
- Spatial Analysis Made Easy with Linear Regression and Kernels (02/22/2019)
- Sufficient Conditions for a Linear Estimator to be a Local Polynomial Regression (03/16/2018)
