An Empirical Analysis of the Laplace and Neural Tangent Kernels

The neural tangent kernel is a kernel function defined over the parameter distribution of an infinite-width neural network. Despite the impracticality of this limit, the neural tangent kernel has enabled a more direct study of neural networks and a glimpse behind the veil of their black box. More recently, it has been shown theoretically that the Laplace kernel and the neural tangent kernel share the same reproducing kernel Hilbert space on the sphere 𝕊^{d-1}, suggesting their equivalence. In this work, we analyze the practical equivalence of the two kernels. We do so first by matching the kernels exactly and then by matching the posteriors of a Gaussian process. Moreover, we analyze the kernels in ℝ^d and experiment with them on regression tasks.
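To make the comparison concrete, here is a minimal numerical sketch (not the paper's code). It assumes the standard closed form of the NTK of an infinitely wide two-layer ReLU network on the unit sphere, where both kernels reduce to functions of the cosine similarity u = ⟨x, x'⟩; the Laplace bandwidth c, the toy regression target, and the noise level are hypothetical choices made purely for illustration.

```python
import numpy as np

# Closed-form NTK of an infinitely wide two-layer ReLU network on the
# unit sphere, as a function of u = <x, x'> (a standard result from the
# NTK literature; the paper may use a deeper architecture).
def kappa0(u):
    return (np.pi - np.arccos(u)) / np.pi

def kappa1(u):
    return (np.sqrt(1.0 - u**2) + u * (np.pi - np.arccos(u))) / np.pi

def ntk(u):
    return u * kappa0(u) + kappa1(u)

# Laplace kernel restricted to the sphere: for unit vectors,
# ||x - x'|| = sqrt(2 - 2u), so it too depends only on u. The
# bandwidth c is a hypothetical choice, not a value from the paper.
def laplace(u, c=1.0):
    return np.exp(-c * np.sqrt(np.maximum(2.0 - 2.0 * u, 0.0)))

# Pointwise comparison; the NTK is normalized so both kernels equal 1 at u = 1.
for u in np.linspace(-1.0, 1.0, 9):
    print(f"u={u:+.2f}  NTK={ntk(u) / ntk(1.0):.4f}  Laplace={laplace(u):.4f}")

# Toy GP-regression posterior means with each kernel on random points of
# the sphere S^{d-1} (illustrative only; the paper's matching procedure
# may differ).
rng = np.random.default_rng(0)
d, n, m = 3, 30, 5
X = rng.standard_normal((n, d)); X /= np.linalg.norm(X, axis=1, keepdims=True)
Xs = rng.standard_normal((m, d)); Xs /= np.linalg.norm(Xs, axis=1, keepdims=True)
y = np.sin(3.0 * X[:, 0])   # arbitrary smooth target
noise = 1e-2                # hypothetical observation-noise variance

def posterior_mean(kfun):
    # Gram matrices of cosine similarities, clipped for numerical safety.
    K = kfun(np.clip(X @ X.T, -1.0, 1.0))
    Ks = kfun(np.clip(Xs @ X.T, -1.0, 1.0))
    return Ks @ np.linalg.solve(K + noise * np.eye(n), y)

print("NTK posterior mean:    ", posterior_mean(lambda u: ntk(u) / ntk(1.0)))
print("Laplace posterior mean:", posterior_mean(laplace))
```

With a suitably tuned bandwidth the two posterior means track each other closely, which is the kind of practical, rather than purely RKHS-level, agreement the abstract sets out to measure.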


