Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective

06/24/2023
by   Wei Huang, et al.

Graph neural networks (GNNs) have pioneered advancements in graph representation learning, exhibiting superior feature learning and performance over multilayer perceptrons (MLPs) when handling graph inputs. However, understanding the feature learning behavior of GNNs is still in its initial stage. This study aims to bridge this gap by investigating the role of graph convolution within the context of feature learning theory for neural networks trained with gradient descent. We provide a distinct characterization of signal learning and noise memorization in two-layer graph convolutional networks (GCNs), contrasting them with two-layer convolutional neural networks (CNNs). Our findings reveal that graph convolution significantly enlarges the benign overfitting regime, where signal learning surpasses noise memorization, over the counterpart CNNs by a factor of approximately √D^(q−2), with D denoting a node's expected degree and q the power of the ReLU activation function, where q > 2. These findings highlight a substantial discrepancy between GNNs and MLPs in terms of feature learning and generalization capacity after gradient descent training, a conclusion further substantiated by our empirical simulations.
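The intuition behind the √D factor can be sketched numerically. The following is an illustrative toy experiment, not the paper's construction: mean aggregation over D neighbors (a simplified stand-in for graph convolution) preserves a signal direction shared across neighbors while shrinking independent per-node noise by roughly 1/√D. All dimensions and parameters below are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): mean aggregation over
# neighbors keeps a shared signal intact while averaging out independent noise.
rng = np.random.default_rng(0)

d = 500          # feature dimension (assumed for illustration)
D = 100          # number of neighbors averaged, i.e. the expected degree
mu = np.ones(d)  # shared class signal, identical across all neighbors

# Each neighbor's feature vector: shared signal plus independent Gaussian noise.
noise = rng.normal(size=(D, d))
features = mu + noise

# Graph convolution modeled as mean aggregation over the D neighbors.
aggregated = features.mean(axis=0)

signal_norm = np.linalg.norm(mu)                # unchanged by averaging
noise_before = np.linalg.norm(noise[0])         # noise norm at one raw node
noise_after = np.linalg.norm(aggregated - mu)   # residual noise after aggregation

print(f"signal norm:            {signal_norm:.1f}")
print(f"noise norm (raw node):  {noise_before:.1f}")
print(f"noise norm (after agg): {noise_after:.1f}")
print(f"reduction factor:       {noise_before / noise_after:.1f} (sqrt(D) = {np.sqrt(D):.1f})")
```

The reduction factor concentrates around √D, which is the mechanism the abstract's √D^(q−2) separation builds on once the power-q activation amplifies the enlarged signal-to-noise ratio.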

Related research:

- 12/07/2020 — Learning Graph Neural Networks with Approximate Gradient Descent
- 01/06/2021 — Node2Seq: Towards Trainable Convolutions in Graph Neural Networks
- 05/14/2023 — Towards Understanding the Generalization of Graph Neural Networks
- 02/15/2022 — Random Feature Amplification: Feature Learning and Generalization in Neural Networks
- 05/29/2023 — Learning Two-Layer Neural Networks, One (Giant) Step at a Time
- 05/18/2020 — Hybrid-DNNs: Hybrid Deep Neural Networks for Mixed Inputs
- 06/10/2021 — Separation Results between Fixed-Kernel and Feature-Learning Probability Metrics
