Is Solving Graph Neural Tangent Kernel Equivalent to Training Graph Neural Network?

09/14/2023
by   Lianke Qin, et al.

A rising trend in theoretical deep learning is to understand why deep learning works through the Neural Tangent Kernel (NTK) [jgh18], a kernel method equivalent to training an infinitely wide multi-layer neural network with gradient descent. The NTK is a major step forward in theoretical deep learning because it allows researchers to use traditional mathematical tools to analyze properties of deep neural networks and to explain various neural network techniques from a theoretical perspective. A natural extension of the NTK to graph learning is the Graph Neural Tangent Kernel (GNTK): researchers have already provided a GNTK formulation for graph-level regression and shown empirically that this kernel method can achieve accuracy similar to that of GNNs on various bioinformatics datasets [dhs+19]. The remaining question is whether solving GNTK regression is equivalent to training an infinitely wide multi-layer GNN with gradient descent. In this paper, we provide three new theoretical results. First, we formally prove this equivalence for graph-level regression. Second, we present the first GNTK formulation for node-level regression. Finally, we prove the equivalence for node-level regression.
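The kernel at the heart of this equivalence can be illustrated concretely. The sketch below is not the paper's GNTK construction; it is a minimal, hypothetical example of the *empirical* NTK of a one-hidden-layer ReLU network, K(x, x') = ⟨∂f(x)/∂θ, ∂f(x')/∂θ⟩, computed from the closed-form parameter gradients (width m, NTK-style 1/√m scaling are illustrative choices):

```python
import numpy as np

# Illustrative sketch (not the paper's GNTK): empirical NTK of the
# one-hidden-layer network f(x) = v^T relu(W x) / sqrt(m).
rng = np.random.default_rng(0)
m, d = 512, 3                      # hidden width, input dimension (assumed)
W = rng.normal(size=(m, d))        # hidden-layer weights
v = rng.normal(size=m)             # output-layer weights

def ntk(x1, x2):
    """Empirical NTK entry <grad_theta f(x1), grad_theta f(x2)>."""
    h1, h2 = W @ x1, W @ x2
    a1, a2 = np.maximum(h1, 0), np.maximum(h2, 0)
    # gradient w.r.t. v is relu(W x)/sqrt(m): contributes <a1, a2>/m
    k_v = (a1 @ a2) / m
    # gradient w.r.t. W_ij is v_i * 1[h_i > 0] * x_j / sqrt(m)
    g1 = v * (h1 > 0)
    g2 = v * (h2 > 0)
    k_W = (g1 @ g2) * (x1 @ x2) / m
    return k_v + k_W

x = rng.normal(size=d)
K_xx = ntk(x, x)                   # diagonal entry, non-negative by construction
print(K_xx)
```

As the width m grows, this empirical kernel concentrates around its infinite-width limit, which is what makes kernel regression with the NTK (and, for GNNs, the GNTK) equivalent to gradient-descent training in the limit.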


