Recycling Randomness with Structure for Sublinear time Kernel Expansions

05/29/2016
by   Krzysztof Choromanski, et al.

We propose a scheme for recycling Gaussian random vectors into structured matrices to approximate various kernel functions in sublinear time via random embeddings. Our framework includes the Fastfood construction as a special case, but also extends to circulant, Toeplitz and Hankel matrices, and to the broader family of structured matrices characterized by low displacement rank. We introduce notions of coherence and graph-theoretic structural constants that control the approximation quality, and prove unbiasedness and low-variance properties of the random feature maps that arise within our framework. For low-displacement-rank matrices, we show how the degree of structure and randomness can be tuned to reduce statistical variance at the cost of increased computation and storage. Empirical results strongly support our theory and justify the use of this broader family of structured matrices for scaling up kernel methods with random features.
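The basic idea behind such constructions can be sketched in a few lines: a single Gaussian vector is recycled into a circulant matrix, whose matrix-vector product is computed in O(n log n) via the FFT, and the result feeds a standard random Fourier feature map for the Gaussian kernel. This is a minimal illustrative sketch, not the paper's exact construction; the function name, the single Rademacher sign diagonal, and the parameterization are assumptions made for the example.

```python
import numpy as np

def circulant_random_features(x, g, signs, sigma=1.0):
    """Sketch of a structured random feature map (illustrative, not the
    paper's exact scheme).

    x:     input vector of length n
    g:     a single recycled Gaussian vector of length n
    signs: Rademacher (+/-1) diagonal, decorrelates the circulant rows
    """
    n = len(g)
    z = signs * x  # apply the random sign diagonal D
    # circulant(g) @ z is a circular convolution, computed via FFT in
    # O(n log n) instead of materializing an n x n Gaussian matrix
    cz = np.fft.ifft(np.fft.fft(g) * np.fft.fft(z)).real / sigma
    # random Fourier features approximating the Gaussian kernel
    return np.concatenate([np.cos(cz), np.sin(cz)]) / np.sqrt(n)
```

Because each feature pair contributes cos^2 + sin^2 = 1, the embedding has unit norm by construction, and inner products of embeddings give unbiased kernel estimates while storing only O(n) random numbers.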


Related research

- Orthogonal Random Features (10/28/2016)
- Compact Random Feature Maps (12/17/2013)
- Fast nonlinear embeddings via structured matrices (04/25/2016)
- The Unreasonable Effectiveness of Structured Random Orthogonal Embeddings (03/02/2017)
- Sublinear Time Low-Rank Approximation of Distance Matrices (09/19/2018)
- Sublinear Time Approximation of Text Similarity Matrices (12/17/2021)
- TripleSpin - a generic compact paradigm for fast machine learning computations (05/29/2016)
