Word2vec Skip-gram Dimensionality Selection via Sequential Normalized Maximum Likelihood

08/18/2020
by Pham Thuc Hung, et al.

In this paper, we propose a novel information-criterion-based approach to selecting the dimensionality of the word2vec Skip-gram (SG) model. From the perspective of probability theory, SG can be viewed as an implicit estimator of a probability distribution, under the assumption that there exists a true contextual distribution among words. We therefore apply information criteria with the aim of selecting the dimensionality whose corresponding model is as close as possible to the true distribution. We examine the following information criteria for the dimensionality selection problem: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the Sequential Normalized Maximum Likelihood (SNML) criterion. SNML is the total codelength required to sequentially encode a data sequence, on the basis of the minimum description length principle. The proposed approach is applied to both the original SG model and the SG Negative Sampling model to clarify the idea of using information criteria. Additionally, because the original SNML is computationally expensive, we introduce novel heuristics for its efficient computation. Moreover, we empirically demonstrate that SNML outperforms both BIC and AIC. In comparison with other evaluation methods for word embeddings, the dimensionality selected by SNML is significantly closer to the optimal dimensionality obtained on word analogy and word similarity tasks.
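To make the model-selection idea concrete, here is a minimal sketch (not the paper's method) of scoring candidate embedding dimensionalities with AIC and BIC. The log-likelihoods below are purely illustrative placeholders, and the parameter count assumes the standard SG parameterization of roughly d x 2V weights (input and output embeddings for a vocabulary of size V); SNML would instead accumulate the codelength of encoding each observation sequentially.

```python
import math

def aic(log_lik, k):
    # Akaike Information Criterion: AIC = 2k - 2 ln L
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    # Bayesian Information Criterion: BIC = k ln n - 2 ln L
    return k * math.log(n) - 2 * log_lik

# Hypothetical candidate dimensionalities with fitted log-likelihoods
# (the numbers are invented for illustration only).
V = 10_000                 # assumed vocabulary size
candidates = {
    50:  (-12_000.0, 50 * 2 * V),   # (log-likelihood, parameter count)
    100: (-11_500.0, 100 * 2 * V),
    200: (-11_400.0, 200 * 2 * V),
}
n = 1_000_000              # number of (word, context) training pairs

# Pick the dimensionality minimizing the criterion (lower is better).
best_dim = min(candidates,
               key=lambda d: bic(candidates[d][0], candidates[d][1], n))
```

Note how BIC's `k ln n` penalty dominates here: the small likelihood gains from larger dimensions cannot offset the cost of the extra parameters, so the smallest candidate wins under these toy numbers.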


