Improving Negative Sampling for Word Representation using Self-embedded Features

10/26/2017
by Long Chen, et al.

Although the word-popularity-based negative sampler has shown strong performance in the skip-gram model, the theoretical motivation for oversampling popular (non-observed) words as negative samples remains poorly understood. In this paper, we begin by investigating the gradient-vanishing issue that arises in the skip-gram model without a proper negative sampler. Analyzing the model from the stochastic gradient descent (SGD) learning perspective, we demonstrate, both theoretically and intuitively, that negative samples with larger inner-product scores are more informative than those with lower scores for the SGD learner, in terms of both convergence rate and accuracy. Motivated by this insight, we propose an alternative sampling algorithm that dynamically selects informative negative samples during each SGD update. More importantly, the proposed sampler accounts for multi-dimensional self-embedded features during sampling, which makes it more effective than the original popularity-based (one-dimensional) sampler. Empirical experiments verify our observations and show that our fine-grained samplers yield significant improvements over existing samplers without increasing computational complexity.
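The core idea of score-based dynamic negative sampling can be illustrated with a small sketch. This is not the paper's exact algorithm: the embedding tables, pool size, and update rule below are illustrative assumptions. At each SGD step, a candidate pool of non-observed words is drawn, each candidate is scored by its inner product with the current center-word embedding, and the highest-scoring candidates are kept as the (most informative) negatives before a standard skip-gram-with-negative-sampling update is applied.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy embedding tables (vocab_size x dim); in SGNS these correspond to the
# input ("word") and output ("context") embedding matrices.
vocab_size, dim = 1000, 50
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))

def dynamic_negative_samples(center_id, num_neg=5, pool_size=50):
    """Draw a random candidate pool, then keep the candidates whose
    inner-product scores against the center-word embedding are largest."""
    pool = rng.integers(0, vocab_size, size=pool_size)
    scores = W_out[pool] @ W_in[center_id]        # inner-product scores
    return pool[np.argsort(scores)[-num_neg:]]    # most informative negatives

def sgd_step(center_id, context_id, lr=0.025, num_neg=5):
    """One SGNS update using dynamically selected negatives."""
    negs = dynamic_negative_samples(center_id, num_neg=num_neg)
    v = W_in[center_id].copy()
    grad_v = np.zeros(dim)
    # Positive pair has label 1, negatives have label 0.
    for cid, label in [(context_id, 1.0)] + [(int(n), 0.0) for n in negs]:
        u = W_out[cid].copy()
        g = 1.0 / (1.0 + np.exp(-u @ v)) - label  # sigmoid(u.v) - label
        grad_v += g * u
        W_out[cid] -= lr * g * v
    W_in[center_id] -= lr * grad_v

sgd_step(center_id=3, context_id=7)
```

Because each update only scores a small candidate pool rather than the full vocabulary, the per-step cost stays comparable to popularity-based sampling while the chosen negatives track the current state of the embeddings.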


Related research

04/13/2017 · Incremental Skip-gram Model with Negative Sampling
This paper explores an incremental training strategy for the skip-gram m...

04/01/2018 · Revisiting Skip-Gram Negative Sampling Model with Regularization
We revisit skip-gram negative sampling (SGNS), a popular neural-network ...

04/26/2017 · Riemannian Optimization for Skip-Gram Negative Sampling
Skip-Gram Negative Sampling (SGNS) word embedding model, well known by i...

10/24/2020 · Efficient, Simple and Automated Negative Sampling for Knowledge Graph Embedding
Negative sampling, which samples negative triplets from non-observed one...

03/22/2022 · Provable Constrained Stochastic Convex Optimization with XOR-Projected Gradient Descent
Provably solving stochastic convex optimization problems with constraint...

06/06/2019 · Second-order Co-occurrence Sensitivity of Skip-Gram with Negative Sampling
We simulate first- and second-order context overlap and show that Skip-G...
