A Simple Algorithm For Scaling Up Kernel Methods

01/26/2023
by Teng Andrea Xu, et al.

The recent discovery of the equivalence between infinitely wide neural networks (NNs) in the lazy training regime and Neural Tangent Kernels (NTKs) (Jacot et al., 2018) has revived interest in kernel methods. However, conventional wisdom suggests kernel methods are unsuitable for large datasets due to their memory and computational requirements. We introduce a novel random feature regression algorithm that allows us (when necessary) to scale to virtually infinite numbers of random features. We illustrate the performance of our method on the CIFAR-10 dataset.
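For context, a minimal sketch of standard random-feature ridge regression (random Fourier features approximating an RBF kernel) is shown below. This is not the paper's algorithm; it only illustrates the baseline setup that such methods aim to scale up, and all names and parameter choices (n_features, bandwidth, reg) are illustrative assumptions.

```python
import numpy as np

def random_fourier_features(X, n_features=1024, bandwidth=1.0, seed=0):
    """Map X (n_samples, d) to random Fourier features approximating an RBF kernel."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / bandwidth, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def fit_ridge(Z, y, reg=1e-3):
    """Solve the regularized least-squares problem in the random-feature space."""
    n_features = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + reg * np.eye(n_features), Z.T @ y)

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
Z = random_fourier_features(X, n_features=1024)
w = fit_ridge(Z, y)
print("train MSE:", np.mean((Z @ w - y) ** 2))
```

The cost of the solve above grows with the number of random features, which is exactly the bottleneck the paper's algorithm is designed to avoid when scaling to very large feature counts.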
