Enriching Word Embeddings with Temporal and Spatial Information

10/02/2020
by Hongyu Gong, et al.

The meaning of a word is closely tied to sociocultural factors, which vary over time and across locations, and word meanings shift accordingly. A global view of words and their meanings in a widely used language such as English may be too coarse for time-specific or location-aware applications, such as the study of cultural trends or language use. However, popular vector representations of words do not adequately encode temporal or spatial information. In this work, we present a model that learns word representations conditioned on time and location. Beyond capturing meaning changes over time and location, we require the resulting word embeddings to retain salient semantic and geometric properties. We train our model on time- and location-stamped corpora, and show through both quantitative and qualitative evaluations that it captures semantics across times and locations. Our model compares favorably with the state of the art for time-specific embeddings and serves as a new benchmark for location-specific embeddings.
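To make the time-specific setting concrete, here is a minimal numpy sketch of a common baseline the paper is compared against (not the paper's own conditioned model): train separate embeddings per time slice and align the slices with orthogonal Procrustes, so that per-word distances after alignment indicate semantic drift. All matrices and values below are illustrative assumptions, not data from the paper.

```python
import numpy as np

def procrustes_align(source, target):
    """Orthogonal Procrustes: rotation R minimizing ||source @ R - target||_F."""
    u, _, vt = np.linalg.svd(source.T @ target)
    return u @ vt

rng = np.random.default_rng(0)
# Hypothetical embeddings for a shared vocabulary of 50 words in one time slice.
slice_a = rng.standard_normal((50, 8))
# A second slice with the same semantics, expressed in a rotated coordinate
# system (independently trained embedding spaces are only defined up to rotation).
q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
slice_b = slice_a @ q

# Align slice_a onto slice_b; residual per-word distance measures semantic drift.
r = procrustes_align(slice_a, slice_b)
aligned = slice_a @ r
drift = np.linalg.norm(aligned - slice_b, axis=1)
```

Because the toy second slice is an exact rotation of the first, the recovered drift is near zero; with real diachronic corpora, words whose usage changed would show large residuals.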



Related research

- 07/22/2019, Learning dynamic word embeddings with drift regularisation: "Word usage, meaning and connotation change throughout time. Diachronic w..."
- 03/02/2017, Discovery of Evolving Semantics through Dynamic Word Embedding Learning: "During the course of human language evolution, the semantic meanings of ..."
- 04/13/2020, Compass-aligned Distributional Embeddings for Studying Semantic Differences across Corpora: "Word2vec is one of the most used algorithms to generate word embeddings ..."
- 10/15/2022, Temporal Word Meaning Disambiguation using TimeLMs: "Meaning of words constantly changes given the events in modern civilizat..."
- 06/05/2019, Training Temporal Word Embeddings with a Compass: "Temporal word embeddings have been proposed to support the analysis of w..."
- 11/13/2020, Learning language variations in news corpora through differential embeddings: "There is an increasing interest in the NLP community in capturing variat..."
- 05/15/2023, Unsupervised Semantic Variation Prediction using the Distribution of Sibling Embeddings: "Languages are dynamic entities, where the meanings associated with words..."
