Graph Convolutional Neural Networks with Diverse Negative Samples via Decomposed Determinant Point Processes

12/05/2022
by Wei Duan, et al.

Graph convolutional networks (GCNs) have achieved great success in graph representation learning by extracting high-level features from nodes and their topology. Since GCNs generally follow a message-passing mechanism, each node aggregates information from its first-order neighbours to update its representation. As a result, the representations of nodes connected by an edge are positively correlated and can be regarded as positive samples. However, the whole graph contains far more non-neighbouring nodes, which provide diverse and useful information for the representation update. Two non-adjacent nodes usually have different representations and can therefore be regarded as negative samples. Besides the node representations, the structural information of the graph is also crucial for learning. In this paper, we use the quality-diversity decomposition of determinant point processes (DPPs) to obtain diverse negative samples. When defining the distribution over diverse subsets of all non-neighbouring nodes, we incorporate both graph structure information and node representations. Since the DPP sampling process requires matrix eigenvalue decomposition, we propose a new shortest-path-based method to improve computational efficiency. Finally, we incorporate the obtained negative samples into the graph convolution operation. The ideas are evaluated empirically in experiments on node classification tasks, which show that the proposed methods not only improve the overall performance of standard representation learning but also significantly alleviate the over-smoothing problem.
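
To make the pipeline concrete, below is a minimal NumPy sketch (not the authors' implementation) of the three steps the abstract describes for a single node update: build a quality-diversity L-kernel over the node's non-neighbours, pick a small diverse subset, and fold the chosen negative samples into the usual mean aggregation. The greedy log-determinant selection stands in for proper DPP sampling (the paper instead proposes a shortest-path-based sampler to avoid eigendecomposition), and the quality vector, the weight omega, and the subtraction rule are illustrative assumptions rather than the paper's exact formulation.

    # Minimal sketch of DPP-style diverse negative sampling in a GCN node update.
    # Assumptions: quality-diversity kernel L = diag(q) @ S @ diag(q); greedy
    # log-det selection as a stand-in for exact DPP sampling; hypothetical
    # combination rule pos - omega * neg.
    import numpy as np


    def quality_diversity_kernel(H, candidates, quality):
        """L = diag(q) @ S @ diag(q), with S the cosine similarity of embeddings."""
        X = H[candidates]
        X = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
        S = X @ X.T                        # diversity term: pairwise similarity
        q = quality[candidates]            # quality term, e.g. structure-based
        return q[:, None] * S * q[None, :]


    def greedy_dpp_subset(L, k):
        """Greedily pick k items maximising log det(L_Y) (stand-in for DPP sampling)."""
        selected = []
        for _ in range(k):
            best, best_val = None, -np.inf
            for i in range(L.shape[0]):
                if i in selected:
                    continue
                idx = selected + [i]
                sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
                val = logdet if sign > 0 else -np.inf
                if val > best_val:
                    best, best_val = i, val
            if best is None:
                break
            selected.append(best)
        return selected


    def update_node(H, A, node, quality, k_neg=3, omega=0.5):
        """Mean-aggregate neighbours, then push away from k diverse negative samples."""
        n = A.shape[0]
        neigh = np.where(A[node] > 0)[0]
        non_neigh = np.array([j for j in range(n) if j != node and A[node, j] == 0])

        pos = H[neigh].mean(axis=0) if len(neigh) else H[node]

        L = quality_diversity_kernel(H, non_neigh, quality)
        chosen = non_neigh[greedy_dpp_subset(L, min(k_neg, len(non_neigh)))]
        neg = H[chosen].mean(axis=0) if len(chosen) else np.zeros_like(H[node])

        return pos - omega * neg           # hypothetical combination rule


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = (rng.random((8, 8)) < 0.3).astype(float)
        A = np.triu(A, 1); A = A + A.T     # symmetric adjacency, no self-loops
        H = rng.normal(size=(8, 4))        # node representations
        quality = np.ones(8)               # could encode shortest-path distance
        print(update_node(H, A, node=0, quality=quality))

In a full model, the quality scores would be derived from graph structure (for example, shortest-path distances to the updated node) and the update would be applied layer-wise with learnable weights; the sketch only shows where the diverse negative samples enter the aggregation.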

