A la Carte - Learning Fast Kernels

12/19/2014
by Zichao Yang, et al.

Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, lightly parametrized, general-purpose kernel learning methods derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, requiring only O(m log d) time and O(m) memory for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.
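The expansions referenced in the abstract build on random Fourier features, which approximate a shift-invariant kernel by an explicit finite-dimensional feature map. As a minimal sketch (not the authors' Fastfood construction, which replaces the dense Gaussian projection below with structured Hadamard-based matrices to cut the cost from O(md) to O(m log d) per input), here is a plain random Fourier feature approximation of an RBF kernel; all function and variable names are illustrative:

```python
import numpy as np

def random_fourier_features(X, m, gamma, rng):
    """Map inputs X (n x d) to m random Fourier features whose inner
    products approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).

    This uses a dense Gaussian projection W, costing O(md) per input;
    Fastfood achieves the same approximation in O(m log d) by replacing W
    with products of diagonal and Walsh-Hadamard matrices.
    """
    d = X.shape[1]
    # Spectral density of the RBF kernel with this parametrization is
    # Gaussian with variance 2 * gamma per coordinate.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, m))
    b = rng.uniform(0.0, 2.0 * np.pi, size=m)
    return np.sqrt(2.0 / m) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, m=4096, gamma=0.5, rng=rng)

# Inner products of the features approximate the exact kernel matrix.
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
```

With enough basis functions m, `K_approx` converges to `K_exact`; the kernel learning methods in the paper go further by learning the distribution of the spectral frequencies in `W` rather than fixing it in advance.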


Related research

- On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes (02/13/2019)
- Generalized Spectral Kernels (06/07/2015)
- Approximation by non-symmetric networks for cross-domain learning (05/06/2023)
- Deep Kernel Learning via Random Fourier Features (10/07/2019)
- Performance portability through machine learning guided kernel selection in SYCL libraries (08/30/2020)
- Constant Memory Attention Block (06/21/2023)
- Data-driven Construction of Hierarchical Matrices with Nested Bases (06/04/2022)
