Understanding Negative Sampling in Graph Representation Learning

05/20/2020
by Zhen Yang, et al.

Graph representation learning has been extensively studied in recent years. Despite its potential for generating continuous embeddings for various networks, effectively and efficiently inferring high-quality representations for large corpora of nodes remains challenging. Sampling is critical to achieving these performance goals. Prior work usually focuses on sampling positive node pairs, while the strategy for negative sampling is left insufficiently explored. To bridge this gap, we systematically analyze the role of negative sampling from the perspectives of both objective and risk, theoretically demonstrating that negative sampling is as important as positive sampling in determining both the optimization objective and the resulting variance. To the best of our knowledge, we are the first to derive the theory and quantify that the negative sampling distribution should be positively but sub-linearly correlated with the positive sampling distribution. Guided by this theory, we propose MCNS, which approximates the positive distribution with self-contrast approximation and accelerates negative sampling with Metropolis-Hastings. We evaluate our method on 5 datasets covering a wide range of downstream graph learning tasks, including link prediction, node classification, and personalized recommendation, across a total of 19 experimental settings. These comprehensive experimental results demonstrate the robustness and superiority of our method.
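
The core prescription, a negative distribution that is positively but sub-linearly correlated with the positive distribution, can be illustrated with a small Metropolis-Hastings sampler. The sketch below is a hypothetical illustration, not the authors' MCNS implementation: `positive_score` stands in for the self-contrast estimate of the positive distribution, and the power `alpha`, the `restart` probability, and the neighbour/uniform mixture proposal are all illustrative assumptions.

```python
import itertools
import numpy as np

def mh_negative_sampler(neighbors, positive_score, num_nodes,
                        alpha=0.75, restart=0.1, burn_in=100, seed=0):
    """Generate negative samples from a Metropolis-Hastings chain whose
    stationary distribution is proportional to positive_score(v) ** alpha,
    i.e. positively but sub-linearly correlated with an estimate of the
    positive sampling distribution. Illustrative sketch only; the names
    and defaults here are assumptions, not the paper's exact design."""
    rng = np.random.default_rng(seed)

    def proposal_prob(src, dst):
        # Density of proposing `dst` from `src` under a mixture proposal:
        # with probability `restart` jump to a uniformly random node,
        # otherwise walk to a uniformly chosen neighbour of `src`.
        if not neighbors[src]:
            return 1.0 / num_nodes
        p = restart / num_nodes
        if dst in neighbors[src]:
            p += (1.0 - restart) / len(neighbors[src])
        return p

    def propose(src):
        if not neighbors[src] or rng.random() < restart:
            return int(rng.integers(num_nodes))
        return int(rng.choice(list(neighbors[src])))

    current = int(rng.integers(num_nodes))
    for step in itertools.count():
        candidate = propose(current)
        # Unnormalised target: sub-linear power of the approximate positive
        # probability, corrected for the asymmetric proposal.
        ratio = (positive_score(candidate) ** alpha * proposal_prob(candidate, current)) / \
                (positive_score(current) ** alpha * proposal_prob(current, candidate) + 1e-12)
        if rng.random() < min(1.0, ratio):
            current = candidate
        if step >= burn_in:
            yield current


# Toy usage on a 4-node path graph, with node degree standing in for the
# (unknown) positive sampling distribution.
neighbors = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
degree = {v: len(ns) for v, ns in neighbors.items()}
sampler = mh_negative_sampler(neighbors, lambda v: degree[v], num_nodes=4)
print([next(sampler) for _ in range(10)])
```

Sampling negatives this way keeps the acceptance test local: only the current and the proposed node are scored, which is what makes a Metropolis-Hastings chain attractive when recomputing a normalizer over all nodes would be too expensive.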

Related research

07/02/2020 · Maximizing Cohesion and Separation in Graph Representation Learning: A Distance-aware Negative Sampling Approach
The objective of unsupervised graph representation learning (GRL) is to ...

01/27/2021 · Improving Graph Representation Learning by Contrastive Regularization
Graph representation learning is an important task with applications in ...

03/09/2022 · Language Model-driven Negative Sampling
Knowledge Graph Embeddings (KGEs) encode the entities and relations of a...

12/05/2022 · Graph Convolutional Neural Networks with Diverse Negative Samples via Decomposed Determinant Point Processes
Graph convolutional networks (GCNs) have achieved great success in graph...

10/21/2019 · Improving Word Representations: A Sub-sampled Unigram Distribution for Negative Sampling
Word2Vec is the most popular model for word representation and has been ...

07/04/2019 · Dimensional Reweighting Graph Convolutional Networks
Graph Convolution Networks (GCNs) are becoming more and more popular for...

02/03/2021 · Cleora: A Simple, Strong and Scalable Graph Embedding Scheme
The area of graph embeddings is currently dominated by contrastive learn...
