Second-order Co-occurrence Sensitivity of Skip-Gram with Negative Sampling

06/06/2019
by Dominik Schlechtweg, et al.

We simulate first- and second-order context overlap and show that Skip-Gram with Negative Sampling is similar to Singular Value Decomposition in capturing second-order co-occurrence information, whereas Pointwise Mutual Information is agnostic to it. We corroborate these results with an empirical study showing that the models react differently when provided with additional second-order information. Our findings reveal a basic property of Skip-Gram with Negative Sampling and point towards an explanation of its success on a variety of tasks.
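To make the notion of second-order co-occurrence concrete, here is a minimal sketch (not from the paper; the toy vocabulary, counts, and helper names are illustrative assumptions) of the PPMI + truncated-SVD pipeline the abstract compares against SGNS. Two words that never co-occur with each other but share contexts are related only at second order; reducing the PPMI matrix with SVD places them close together in the embedding space.

```python
import numpy as np

# Toy co-occurrence counts (illustrative): rows = target words, cols = contexts.
# "apple" and "pear" never co-occur with each other, but share the contexts
# "eat" and "fruit" -- a purely second-order relationship.
words = ["apple", "pear", "car"]
contexts = ["eat", "fruit", "drive"]
C = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, 0.0],
              [0.0, 0.0, 5.0]])

# Positive PMI: log p(w,c) / (p(w) p(c)), clipped at zero.
total = C.sum()
p_w = C.sum(axis=1, keepdims=True) / total
p_c = C.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore"):
    pmi = np.log((C / total) / (p_w * p_c))
ppmi = np.maximum(pmi, 0.0)

# Truncated SVD turns the sparse PPMI matrix into dense embeddings.
U, S, Vt = np.linalg.svd(ppmi)
k = 2
emb = U[:, :k] * S[:k]

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# apple/pear share contexts, so their SVD embeddings end up close,
# while apple/car share nothing and stay orthogonal.
print(cos(emb[0], emb[1]))
print(cos(emb[0], emb[2]))
```

The design point: the raw PPMI rows of "apple" and "car" have no overlapping nonzero entries, so PMI itself carries no signal relating them; any second-order similarity emerges only after dimensionality reduction, which is the behavior the paper attributes to SVD and SGNS but not to PMI.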

Related research

09/05/2023 · A Unification Algorithm for Second-Order Linear Terms
We give an algorithm for the class of second order unification problems ...

05/27/2017 · word2vec Skip-Gram with Negative Sampling is a Weighted Logistic PCA
We show that the skip-gram formulation of word2vec trained with negative...

04/01/2018 · Revisiting Skip-Gram Negative Sampling Model with Regularization
We revisit skip-gram negative sampling (SGNS), a popular neural-network ...

04/13/2017 · Incremental Skip-gram Model with Negative Sampling
This paper explores an incremental training strategy for the skip-gram m...

11/20/2014 · Linking GloVe with word2vec
The Global Vectors for word representation (GloVe), introduced by Jeffre...

10/24/2020 · Efficient, Simple and Automated Negative Sampling for Knowledge Graph Embedding
Negative sampling, which samples negative triplets from non-observed one...

10/26/2017 · Improving Negative Sampling for Word Representation using Self-embedded Features
Although the word-popularity based negative sampler has shown superb per...
