Learning dynamic word embeddings with drift regularisation

07/22/2019
by Syrielle Montariol, et al.

Word usage, meaning and connotation change over time. Diachronic word embeddings are used to capture these changes in an unsupervised way. In this paper, we use variants of the Dynamic Bernoulli Embeddings model to learn dynamic word embeddings, in order to identify notable properties of the model. The comparison is made on the New York Times Annotated Corpus in English and a set of articles from the French newspaper Le Monde covering the same period. This allows us to define a pipeline for analysing the evolution of word usage across the two languages.
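The drift-regularisation idea can be summarised briefly. In the Dynamic Bernoulli Embeddings model (Rudolph and Blei; listed among the related papers below), each word gets one embedding vector per time slice, and a Gaussian random-walk prior rho_t ~ N(rho_{t-1}, sigma^2 I) ties consecutive slices together; in the training objective this amounts to an L2 penalty on the drift between adjacent time-slice embeddings. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the class name, the negative-sampling setup, and all hyperparameter values are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DriftRegularizedEmbeddings(nn.Module):
    """Minimal sketch of drift-regularised dynamic embeddings.

    One target-embedding matrix per time slice (rho); context vectors
    (alpha) are shared across slices. The random-walk prior on rho
    reduces to an L2 penalty on the drift between consecutive slices.
    """

    def __init__(self, vocab_size, dim, n_slices, drift_weight=1.0):
        super().__init__()
        self.rho = nn.Parameter(0.01 * torch.randn(n_slices, vocab_size, dim))
        self.alpha = nn.Parameter(0.01 * torch.randn(vocab_size, dim))
        self.drift_weight = drift_weight

    def forward(self, t, targets, contexts, labels):
        # Bernoulli likelihood of (target, context) pairs at slice t;
        # labels are 1 for observed pairs, 0 for negative samples.
        u = self.rho[t, targets]            # (batch, dim)
        v = self.alpha[contexts]            # (batch, dim)
        logits = (u * v).sum(-1)
        data_loss = F.binary_cross_entropy_with_logits(logits, labels)
        # Drift regularisation: successive slices stay close, so a word
        # only moves in embedding space when its contexts really change.
        drift = self.rho[1:] - self.rho[:-1]
        return data_loss + self.drift_weight * drift.pow(2).mean()


# Toy usage with random data, one gradient step per time slice.
model = DriftRegularizedEmbeddings(vocab_size=1000, dim=50, n_slices=4)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for t in range(4):
    targets = torch.randint(0, 1000, (64,))
    contexts = torch.randint(0, 1000, (64,))
    labels = torch.randint(0, 2, (64,)).float()
    loss = model(t, targets, contexts, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Sharing the context vectors across time slices, as in the original model, keeps the per-slice target matrices in a common space, so drift in rho can be read as usage change rather than an arbitrary rotation of the whole embedding space.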

Related research:

10/23/2020 · Dynamic Contextualized Word Embeddings
Static word embeddings that represent words by a single vector cannot ca...

03/23/2017 · Dynamic Bernoulli Embeddings for Language Evolution
Word embeddings are a powerful approach for unsupervised analysis of lan...

09/04/2019 · Empirical Study of Diachronic Word Embeddings for Scarce Data
Word meaning change can be inferred from drifts of time-varying word emb...

10/02/2020 · Enriching Word Embeddings with Temporal and Spatial Information
The meaning of a word is closely linked to sociocultural factors that ca...

11/13/2020 · Learning language variations in news corpora through differential embeddings
There is an increasing interest in the NLP community in capturing variat...

02/11/2023 · Dialectograms: Machine Learning Differences between Discursive Communities
Word embeddings provide an unsupervised way to understand differences in...

02/12/2015 · RAND-WALK: A Latent Variable Model Approach to Word Embeddings
Semantic word embeddings represent the meaning of a word via a vector, a...
