Gradient Kernel Regression

04/13/2021
by Matt Calder, et al.

This article demonstrates a surprising result involving the neural tangent kernel (NTK). The kernel is defined by the inner products of the gradients of an underlying model's output with respect to its parameters, evaluated at pairs of training points, and it is then used to perform kernel regression. The surprising finding is that the accuracy of this regression is independent of the accuracy of the underlying network.
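As a concrete illustration, here is a minimal sketch of gradient kernel regression in PyTorch. The toy data, the small MLP architecture, and the ridge term are assumptions made for demonstration, not details taken from the paper; the core steps are computing per-example parameter gradients, forming the Gram matrix of their inner products, and solving a kernel ridge regression on it.

```python
import torch

# Hypothetical toy regression data (stands in for any training set).
torch.manual_seed(0)
X_train = torch.randn(20, 3)
y_train = torch.sin(X_train.sum(dim=1))
X_test = torch.randn(5, 3)

# A small, untrained MLP plays the role of the "underlying model";
# per the paper's result, its own accuracy should not matter.
model = torch.nn.Sequential(
    torch.nn.Linear(3, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
)
params = list(model.parameters())

def grad_vector(x):
    """Flattened gradient of the scalar model output w.r.t. all parameters."""
    out = model(x.unsqueeze(0)).squeeze()
    grads = torch.autograd.grad(out, params)
    return torch.cat([g.reshape(-1) for g in grads])

# Empirical tangent kernel: K[i, j] = <grad f(x_i), grad f(x_j)>.
G_train = torch.stack([grad_vector(x) for x in X_train])
G_test = torch.stack([grad_vector(x) for x in X_test])
K = G_train @ G_train.T

# Kernel ridge regression on the tangent kernel (small ridge for stability;
# the value 1e-3 is an arbitrary choice for this sketch).
ridge = 1e-3
alpha = torch.linalg.solve(K + ridge * torch.eye(len(X_train)), y_train)
y_pred = (G_test @ G_train.T) @ alpha
print(y_pred)
```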


Related Research

Optimal Kernel for Kernel-Based Modal Statistical Methods (04/20/2023)
Kernel-based modal statistical methods include mode estimation, regressi...

Optimal Rate of Kernel Regression in Large Dimensions (09/08/2023)
We perform a study on kernel regression for large-dimensional data (wher...

Deep regularization and direct training of the inner layers of Neural Networks with Kernel Flows (02/19/2020)
We introduce a new regularization method for Artificial Neural Networks ...

Properties of the After Kernel (05/21/2021)
The Neural Tangent Kernel (NTK) is the wide-network limit of a kernel de...

On an Unknown Ancestor of Burrows' Delta Measure (12/09/2020)
This article points out some surprising similarities between a 1944 stud...

Consistent regression of biophysical parameters with kernel methods (12/09/2020)
This paper introduces a novel statistical regression framework that allo...

A brief TOGAF description using SEMAT Essence Kernel (09/15/2019)
This work aims to explore the possibility of describing the enterprise a...
