SCE: Scalable Network Embedding from Sparsest Cut

by Shengzhong Zhang, et al.

Large-scale network embedding aims to learn a latent representation for each node in an unsupervised manner, capturing the inherent properties and structural information of the underlying graph. Many popular approaches in this field are influenced by the skip-gram model from natural language processing. Most of them use a contrastive objective to train an encoder that forces the embeddings of similar pairs to be close and the embeddings of negative samples to be far apart. A key to the success of such contrastive learning methods is how to draw positive and negative samples. While negative samples generated by straightforward random sampling are often satisfactory, how to draw positive samples remains a hot topic. In this paper, we propose SCE for unsupervised network embedding that uses only negative samples for training. Our method is based on a new contrastive objective inspired by the well-known sparsest cut problem. To solve the underlying optimization problem, we introduce a Laplacian smoothing trick, which uses graph convolutional operators as low-pass filters for smoothing node representations. The resulting model consists of a GCN-type structure as the encoder and a simple loss function. Notably, our model does not use positive samples, only negative samples, for training, which not only makes the implementation and tuning much easier but also reduces the training time significantly. Finally, we conduct extensive experimental studies on real-world data sets. The results clearly demonstrate the advantages of our new model in both accuracy and scalability over strong baselines such as GraphSAGE, G2G and DGI.
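The two ingredients the abstract describes are (1) Laplacian smoothing, realized as repeated application of a normalized graph convolutional operator acting as a low-pass filter, and (2) a loss computed only over negative samples. The sketch below illustrates both in plain numpy; it is not the authors' implementation, and the specific loss form (pushing apart randomly sampled negative pairs of smoothed embeddings) is an assumption based on the abstract, not the exact sparsest-cut-derived objective from the paper.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops,
    D^{-1/2} (A + I) D^{-1/2}, the standard GCN low-pass filter."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def laplacian_smooth(X, A_hat, k=2):
    """Apply k rounds of graph propagation: each round averages a node's
    representation with its neighbors', smoothing over the graph."""
    for _ in range(k):
        X = A_hat @ X
    return X

def negative_only_loss(Z, neg_pairs):
    """Loss over negative pairs only (no positive samples): minimizing it
    pushes the embeddings of each sampled negative pair apart.
    `neg_pairs` is an (m, 2) array of node-index pairs drawn at random."""
    diffs = Z[neg_pairs[:, 0]] - Z[neg_pairs[:, 1]]
    return -np.mean(np.sum(diffs ** 2, axis=1))

# Toy usage on a 3-node path graph with one-hot features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)
Z = laplacian_smooth(X, normalized_adjacency(A), k=2)
neg_pairs = np.array([[0, 2]])   # a randomly sampled non-adjacent pair
loss = negative_only_loss(Z, neg_pairs)
```

In a full model, `laplacian_smooth` would be interleaved with learned weight matrices (the GCN-type encoder), and the loss would be minimized by gradient descent over those weights; the point here is only that no positive-pair term appears anywhere in the objective.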




