Learning with Neural Tangent Kernels in Near Input Sparsity Time

04/01/2021
by Amir Zandieh, et al.

The Neural Tangent Kernel (NTK) characterizes the behavior of infinitely wide neural nets trained under least squares loss by gradient descent (Jacot et al., 2018). However, despite its importance, the super-quadratic runtime of kernel methods limits the use of the NTK in large-scale learning tasks. To accelerate kernel machines with the NTK, we propose a near input sparsity time algorithm that maps the input data to a randomized low-dimensional feature space so that the inner products of the transformed data approximate their NTK evaluations. Furthermore, we propose a feature map for approximating the convolutional counterpart of the NTK (Arora et al., 2019), which can transform any image in time that is only linear in the number of pixels. We show that on standard large-scale regression and classification tasks, a linear regressor trained on our features outperforms trained neural networks and the Nyström method with the NTK kernel.
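The paper's contribution is a sketching algorithm that builds such a feature map in near input sparsity time; as background for the "inner product approximates the kernel" idea, the classical baseline it improves on can be sketched with naive Monte Carlo gradient features of a finite-width one-hidden-layer ReLU net. The function and variable names below are illustrative, not from the paper:

```python
import numpy as np

def ntk_random_features(X, m=8192, seed=0):
    """Monte Carlo NTK features for a one-hidden-layer ReLU net (illustrative).

    Stacks the gradients of f(x) = (1/sqrt(m)) * sum_i a_i * relu(w_i . x)
    with respect to (W, a); inner products of these features converge to
    the two-layer NTK as the width m grows.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, m))      # first-layer weights, w_i ~ N(0, I)
    a = rng.choice([-1.0, 1.0], size=m)  # second-layer signs
    Z = X @ W                            # pre-activations, shape (n, m)
    gate = (Z > 0).astype(X.dtype)       # ReLU derivative 1{w_i . x > 0}
    # Gradient w.r.t. W: a_i * 1{w_i . x > 0} * x  (d*m features per point)
    grad_W = ((a * gate)[:, None, :] * X[:, :, None]).reshape(n, -1)
    # Gradient w.r.t. a: relu(w_i . x)  (m features per point)
    grad_a = np.maximum(Z, 0.0)
    return np.concatenate([grad_W, grad_a], axis=1) / np.sqrt(m)
```

Materializing these d·m-dimensional gradient features is exactly the blow-up that the paper's low-dimensional sketch avoids; the snippet only illustrates the property that explicit random features can approximate the NTK.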


research · 06/15/2021 · Scaling Neural Tangent Kernels via Sketching and Random Features
The Neural Tangent Kernel (NTK) characterizes the behavior of infinitely...

research · 07/08/2020 · Near Input Sparsity Time Kernel Embeddings via Adaptive Sampling
To accelerate kernel methods, we propose a near input sparsity time algo...

research · 10/03/2019 · Harnessing the Power of Infinitely Wide Deep Nets on Small-data Tasks
Recent research shows that the following two models are equivalent: (a) ...

research · 09/09/2022 · Fast Neural Kernel Embeddings for General Activations
Infinite width limit has shed light on generalization and optimization a...

research · 02/09/2022 · Leverage Score Sampling for Tensor Product Matrices in Input Sparsity Time
We give an input sparsity time sampling algorithm for spectrally approxi...

research · 05/25/2023 · Fast Online Node Labeling for Very Large Graphs
This paper studies the online node classification problem under a transd...

research · 04/30/2018 · Learning Explicit Deep Representations from Deep Kernel Networks
Deep kernel learning aims at designing nonlinear combinations of multipl...
