A Simple Algorithm For Scaling Up Kernel Methods

01/26/2023
by Teng Andrea Xu, et al.

The recent discovery of the equivalence between infinitely wide neural networks (NNs) in the lazy training regime and Neural Tangent Kernels (NTKs) (Jacot et al., 2018) has revived interest in kernel methods. However, conventional wisdom suggests kernel methods are unsuitable for large samples due to their computational complexity and memory requirements. We introduce a novel random feature regression algorithm that allows us (when necessary) to scale to virtually infinite numbers of random features. We illustrate the performance of our method on the CIFAR-10 dataset.
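The abstract describes a random feature regression algorithm that can scale to virtually infinite numbers of random features. The paper does not spell out its algorithm here, so the following is only a minimal sketch of one standard way such scaling can work: draw random Fourier feature blocks on the fly and accumulate an n x n kernel estimate block by block, so the total feature count grows with the number of blocks while memory stays O(n^2). The function names (rff_block, fit_blockwise, predict_blockwise), the RBF/Fourier feature choice, and all hyperparameters are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch (not the paper's algorithm): blockwise random Fourier
# feature regression. Each block of features is generated from a fixed seed,
# used to update an n x n kernel estimate, and then discarded, so the total
# number of random features (n_blocks * block_size) can be made very large.
import numpy as np

def rff_block(X, n_features, gamma, seed):
    """One block of random Fourier features approximating exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def fit_blockwise(X, y, n_blocks=100, block_size=1024, gamma=1e-2, reg=1e-3):
    """Accumulate K = (1/B) * sum_b Z_b Z_b^T, then solve (K + reg*I) alpha = y."""
    n = X.shape[0]
    K = np.zeros((n, n))
    for b in range(n_blocks):
        Z = rff_block(X, block_size, gamma, seed=b)  # never store all blocks at once
        K += (Z @ Z.T) / n_blocks
    alpha = np.linalg.solve(K + reg * np.eye(n), y)
    return alpha

def predict_blockwise(X_test, X_train, alpha, n_blocks=100, block_size=1024, gamma=1e-2):
    """Re-draw the same feature blocks (same seeds) to estimate k(X_test, X_train) @ alpha."""
    pred = np.zeros((X_test.shape[0],) + alpha.shape[1:])
    for b in range(n_blocks):
        Z_te = rff_block(X_test, block_size, gamma, seed=b)
        Z_tr = rff_block(X_train, block_size, gamma, seed=b)
        pred += (Z_te @ (Z_tr.T @ alpha)) / n_blocks
    return pred
```

In this sketch the memory cost is dominated by the n x n matrix K, independent of how many random features are used, which is one plausible reading of "virtually infinite numbers of random features"; whether this matches the paper's actual construction would need to be checked against the full text.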



Related research

06/15/2021 - Scaling Neural Tangent Kernels via Sketching and Random Features
The Neural Tangent Kernel (NTK) characterizes the behavior of infinitely...

03/09/2023 - Kernel Regression with Infinite-Width Neural Networks on Millions of Examples
Neural kernels have drastically increased performance on diverse and non...

02/06/2023 - Toward Large Kernel Models
Recent studies indicate that kernel machines can often perform similarly...

02/19/2020 - Deep regularization and direct training of the inner layers of Neural Networks with Kernel Flows
We introduce a new regularization method for Artificial Neural Networks ...

12/09/2019 - Temporal Factorization of 3D Convolutional Kernels
3D convolutional neural networks are difficult to train because they are...

07/02/2019 - Isolation Kernel: The X Factor in Efficient and Effective Large Scale Online Kernel Learning
Large scale online kernel learning aims to build an efficient and scalab...

05/25/2021 - Structured Convolutional Kernel Networks for Airline Crew Scheduling
Motivated by the needs from an airline crew scheduling application, we i...
