Graph Convolutional Neural Networks with Diverse Negative Samples via Decomposed Determinant Point Processes

by   Wei Duan, et al.

Graph convolutional networks (GCNs) have achieved great success in graph representation learning by extracting high-level features from nodes and their topology. Since GCNs generally follow a message-passing mechanism, each node aggregates information from its first-order neighbours to update its representation. As a result, the representations of nodes connected by an edge should be positively correlated, so such neighbours can be treated as positive samples. However, the many non-neighbouring nodes in the rest of the graph also provide diverse and useful information for the representation update: two non-adjacent nodes usually have different representations and can therefore be treated as negative samples. Beyond the node representations themselves, the structural information of the graph is also crucial for learning. In this paper, we use the quality-diversity decomposition of determinantal point processes (DPPs) to obtain diverse negative samples. When defining a distribution over diverse subsets of all non-neighbouring nodes, we incorporate both graph structure information and node representations. Since DPP sampling requires matrix eigenvalue decomposition, we also propose a new shortest-path-based method to improve computational efficiency. Finally, we incorporate the obtained negative samples into the graph convolution operation. The ideas are evaluated empirically on node classification tasks. These experiments show that the newly proposed methods not only improve the overall performance of standard representation learning but also significantly alleviate over-smoothing problems.
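To make the quality-diversity idea concrete, the sketch below selects a diverse subset of candidate negative nodes with a greedy log-determinant rule on a DPP-style kernel L[i, j] = q_i * (phi_i . phi_j) * q_j, where the embedding norm q_i plays the role of quality and the unit-normalised embedding phi_i the role of diversity. This is a hypothetical, simplified illustration of DPP-based diverse sampling, not the paper's exact algorithm (which samples from the decomposed DPP and uses shortest-path information); the function name and the greedy MAP-style selection are assumptions for illustration.

```python
import numpy as np

def diverse_negative_samples(embeddings, candidates, k):
    """Greedily pick k candidates that maximise the log-determinant of a
    quality-diversity DPP kernel (a common MAP-style approximation to
    DPP sampling; illustrative, not the paper's exact procedure)."""
    X = embeddings[candidates]
    q = np.linalg.norm(X, axis=1) + 1e-12        # quality: embedding norm
    phi = X / q[:, None]                         # diversity: unit directions
    L = (q[:, None] * q[None, :]) * (phi @ phi.T)  # L = diag(q) S diag(q)

    selected, remaining = [], list(range(len(candidates)))
    for _ in range(min(k, len(remaining))):
        best, best_gain = None, -np.inf
        for j in remaining:
            idx = selected + [j]
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            gain = logdet if sign > 0 else -np.inf
            if gain > best_gain:
                best, best_gain = j, gain
        if best is None:                         # kernel became singular
            break
        selected.append(best)
        remaining.remove(best)
    return [candidates[i] for i in selected]
```

Because the determinant of a submatrix of L grows with the volume spanned by the chosen quality-weighted embeddings, the greedy rule prefers negative samples that are both high-quality and mutually dissimilar; the selected indices could then be aggregated with a negative sign in the convolution update.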


Related papers:

- Learning from the Dark: Boosting Graph Convolutional Neural Networks with Diverse Negative Samples
- Edge Representation Learning with Hypergraphs
- Building Shortcuts between Distant Nodes with Biaffine Mapping for Graph Convolutional Networks
- Graph Convolutional Neural Networks with Node Transition Probability-based Message Passing and DropNode Regularization
- Understanding Negative Sampling in Graph Representation Learning
- PathSAGE: Spatial Graph Attention Neural Networks With Random Path Sampling
- Relation-aware Graph Attention Model With Adaptive Self-adversarial Training
