Gradient Kernel Regression

04/13/2021
by Matt Calder, et al.

In this article a surprising result is demonstrated using the neural tangent kernel. This kernel is defined by the inner products of the gradients of an underlying model's output with respect to its parameters, evaluated at pairs of training points. The kernel is then used to perform kernel regression. The surprising finding is that the accuracy of that regression is independent of the accuracy of the underlying network.
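The sketch below illustrates the construction described in the abstract, under stated assumptions: a small scalar-output MLP stands in for the underlying model, the kernel entry K[i, j] is the inner product of the parameter gradients at x_i and x_j, and kernel ridge regression with a small regularizer is used to fit and predict. The network architecture, the toy data, and the regularization constant are all illustrative choices, not details taken from the paper.

```python
# Minimal sketch of gradient kernel regression in JAX (illustrative, not the paper's code).
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Simple two-layer network with a scalar output.
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return (h @ w2 + b2).squeeze()

def init_params(key, d_in, d_hidden):
    k1, k2 = jax.random.split(key)
    w1 = jax.random.normal(k1, (d_in, d_hidden)) / jnp.sqrt(d_in)
    b1 = jnp.zeros(d_hidden)
    w2 = jax.random.normal(k2, (d_hidden, 1)) / jnp.sqrt(d_hidden)
    b2 = jnp.zeros(1)
    return (w1, b1, w2, b2)

def flat_grad(params, x):
    # Gradient of the scalar model output with respect to all parameters,
    # flattened into a single vector.
    grads = jax.grad(lambda p: mlp(p, x))(params)
    return jnp.concatenate([g.ravel() for g in jax.tree_util.tree_leaves(grads)])

def gradient_kernel(params, X1, X2):
    # K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>
    G1 = jax.vmap(lambda x: flat_grad(params, x))(X1)
    G2 = jax.vmap(lambda x: flat_grad(params, x))(X2)
    return G1 @ G2.T

# Toy regression data (illustrative only).
X_train = jax.random.normal(jax.random.PRNGKey(0), (32, 3))
y_train = jnp.sin(X_train[:, 0])
X_test = jax.random.normal(jax.random.PRNGKey(1), (8, 3))

# Note: the parameters here are untrained; the abstract's point is that the
# regression accuracy does not depend on how well the network itself fits the data.
params = init_params(jax.random.PRNGKey(2), d_in=3, d_hidden=16)

# Kernel ridge regression with the gradient kernel.
lam = 1e-3
K = gradient_kernel(params, X_train, X_train)
alpha = jnp.linalg.solve(K + lam * jnp.eye(K.shape[0]), y_train)
y_pred = gradient_kernel(params, X_test, X_train) @ alpha
print(y_pred.shape)  # (8,)
```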
