Redefining Context Windows for Word Embedding Models: An Experimental Study

04/19/2017, by Pierre Lison, et al.

Distributional semantic models learn vector representations of words through the contexts they occur in. Although the choice of context (which often takes the form of a sliding window) has a direct influence on the resulting embeddings, the exact role of this model component is still not fully understood. This paper presents a systematic analysis of context windows based on a set of four distinct hyper-parameters. We train continuous Skip-Gram models on two English-language corpora for various combinations of these hyper-parameters, and evaluate them on both lexical similarity and analogy tasks. Notable experimental results are the positive impact of cross-sentential contexts and the surprisingly good performance of right-context windows.
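
The abstract describes training continuous Skip-Gram models under varying context-window settings. As a rough illustration of what such an experiment looks like in practice, below is a minimal sketch using gensim's Word2Vec API (an assumption; the paper's own training code is not reproduced here). Off-the-shelf gensim exposes only some of the hyper-parameters studied: the maximum window size and, via shrink_windows, a choice between fixed and dynamically sampled (distance-weighted) windows.

    from gensim.models import Word2Vec

    # Toy tokenized corpus; the paper trains on two large English-language
    # corpora. Each inner list is one context unit for windowing purposes.
    corpus = [
        ["distributional", "models", "learn", "vectors", "from", "context"],
        ["the", "context", "window", "shapes", "the", "embeddings"],
    ]

    model = Word2Vec(
        sentences=corpus,
        vector_size=100,      # embedding dimensionality
        window=5,             # maximum window size, one of the four hyper-parameters
        sg=1,                 # 1 = continuous Skip-Gram (as in the paper), 0 = CBOW
        shrink_windows=True,  # gensim >= 4.1: sample the effective window size per
                              # word, approximating distance-based context weighting
        min_count=1,          # keep all tokens in this toy example
        workers=4,
    )

    # Lexical-similarity style query (meaningless on a toy corpus).
    print(model.wv.most_similar("context", topn=3))

Two of the paper's notable settings do not map onto stock gensim: windows never cross the boundary between items of the sentences argument, so cross-sentential contexts can only be approximated by passing document-level token lists rather than single sentences, and right-only (asymmetric) windows require modifying the training code itself.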


Related research:

04/11/2018  Evaluating Word Embedding Hyper-Parameters for Similarity and Analogy Tasks
10/26/2020  Robust and Consistent Estimation of Word Embedding for Bangla Language by fine-tuning Word2Vec Model
04/01/2019  Syntactic Interchangeability in Word Embedding Models
02/27/2017  Dynamic Word Embeddings
09/30/2022  Synonym Detection Using Syntactic Dependency And Neural Embeddings
06/25/2021  Exploring the Representation of Word Meanings in Context: A Case Study on Homonymy and Synonymy
