Discovery of Evolving Semantics through Dynamic Word Embedding Learning

03/02/2017
by   Zijun Yao, et al.

During the course of human language evolution, the semantic meanings of words keep evolving with time. Understanding evolving semantics enables us to capture the true meaning of words in different usage contexts, and is thus critical for various applications, such as machine translation. While it is naturally promising to study word semantics in a time-aware manner, traditional methods for learning word vector representations do not adequately capture the change over time. To this end, in this paper, we aim at learning time-aware vector representations of words through dynamic word embedding modeling. Specifically, we first propose a method that captures time-specific semantics and across-time alignment simultaneously in a way that is robust to data sparsity. Then, we solve the resulting optimization problem using a scalable coordinate descent method. Finally, we perform an empirical study on New York Times data to learn the temporal embeddings and develop multiple evaluations that illustrate the semantic evolution of words, discovered from news media. Moreover, our qualitative and quantitative tests indicate that our method not only reliably captures the semantic evolution over time, but also consistently outperforms state-of-the-art temporal embedding approaches on both semantic accuracy and alignment quality.
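To make the idea concrete, the sketch below illustrates the kind of objective the abstract describes: each time slice gets its own embedding matrix fit to that slice's co-occurrence statistics, while an across-time alignment penalty keeps neighboring slices comparable. This is a minimal sketch and not the authors' implementation: it assumes time-sliced positive PMI matrices as input, uses plain gradient steps instead of the paper's block coordinate descent, and the function name and hyperparameter values are illustrative choices.

```python
# Minimal sketch (assumptions noted above, not the authors' code) of jointly
# learning time-sliced word embeddings with an across-time alignment penalty.
import numpy as np

def dynamic_embeddings(ppmi, dim=50, lam=10.0, tau=50.0, lr=1e-4, iters=200):
    """ppmi: list of T dense (V x V) positive PMI matrices, one per time slice."""
    T, V = len(ppmi), ppmi[0].shape[0]
    rng = np.random.default_rng(0)
    U = [rng.normal(scale=0.1, size=(V, dim)) for _ in range(T)]

    for _ in range(iters):
        for t in range(T):
            # Reconstruction term: the slice's PPMI matrix Y_t is
            # approximated by U_t U_t^T.
            grad = 4.0 * (U[t] @ U[t].T - ppmi[t]) @ U[t]
            # Ridge term keeps the factors small (helps with data sparsity).
            grad += 2.0 * lam * U[t]
            # Alignment term pulls neighboring slices toward each other,
            # so embeddings stay comparable across time.
            if t > 0:
                grad += 2.0 * tau * (U[t] - U[t - 1])
            if t < T - 1:
                grad += 2.0 * tau * (U[t] - U[t + 1])
            # Simple gradient step; the paper instead updates blocks of
            # variables with a scalable coordinate descent scheme.
            U[t] -= lr * grad
    return U
```

With tau set to zero, each slice is embedded independently and vectors from different years are not directly comparable; increasing tau trades per-slice fit for smoother, aligned trajectories of word meaning over time.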
