Self-Contrastive Graph Diffusion Network

07/27/2023
by Yixian Ma, et al.

Augmentation techniques and sampling strategies are crucial in contrastive learning, but in most existing work, augmentations require careful design and the sampling strategies capture only a small amount of intrinsic supervision information. In addition, existing methods require complex designs to obtain two different representations of the data. To overcome these limitations, we propose a novel framework, the Self-Contrastive Graph Diffusion Network (SCGDN). Our framework consists of two main components: the Attentional Module (AttM) and the Diffusion Module (DiFM). AttM aggregates higher-order structural and feature information to obtain an expressive embedding, while DiFM balances the state of each node in the graph through Laplacian diffusion learning, allowing the adjacency and feature information of the graph to evolve cooperatively. Unlike existing methods, SCGDN is an augmentation-free approach that avoids "sampling bias" and semantic drift without requiring pre-training. We sample high-quality pairs based on structural and feature information: if two nodes are neighbors, they are treated as positive samples of each other; if two disconnected nodes are also unrelated on the kNN graph, they are treated as negative samples of each other. The contrastive objective exploits these sampling strategies, and a redundancy-reduction term minimizes redundant information in the embedding while retaining more of the discriminative information. Within this framework, the graph self-contrastive learning paradigm proves highly effective: SCGDN balances preserving high-order structural information against overfitting. The results show that SCGDN consistently outperforms both contrastive and classical methods.
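The sampling rule stated in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `sample_pairs` and the use of dense 0/1 adjacency matrices (one for the original graph, one for a kNN graph built from features) are assumptions made for clarity.

```python
import numpy as np

def sample_pairs(adj, knn_adj):
    """Label node pairs following the rule described in the abstract:
    - positives: pairs that are neighbors in the original graph;
    - negatives: pairs disconnected in the original graph AND
      unrelated on the kNN (feature) graph.
    Both inputs are symmetric 0/1 numpy arrays of shape (n, n).
    Returns two boolean masks over node pairs."""
    n = adj.shape[0]
    off_diag = ~np.eye(n, dtype=bool)          # exclude self-pairs
    positive = (adj == 1) & off_diag           # structural neighbors
    negative = (adj == 0) & (knn_adj == 0) & off_diag
    return positive, negative
```

Note that pairs which are disconnected in the graph but close in feature space (linked on the kNN graph) fall into neither mask, so ambiguous pairs are simply not contrasted.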


Related research:

- ARIEL: Adversarial Graph Contrastive Learning (08/15/2022)
- Subgraph Networks Based Contrastive Learning (06/06/2023)
- Coarse-to-Fine Contrastive Learning on Graphs (12/13/2022)
- Heterogeneous Graph Contrastive Multi-view Learning (10/01/2022)
- Spectral Augmentations for Graph Contrastive Learning (02/06/2023)
- Graph Soft-Contrastive Learning via Neighborhood Ranking (09/28/2022)
- Adversarial Lagrangian Integrated Contrastive Embedding for Limited Size Datasets (10/06/2022)
