Tracking the Evolution of Words with Time-reflective Text Representations

07/12/2018
by   Roberto Camacho Barranco, et al.

More than 80% of today's data is unstructured and evolves over time. A large part of this evolving unstructured data consists of text documents generated by media outlets, scholarly articles in digital libraries, and social media. Vector space models have been developed to analyze text documents using data mining and machine learning algorithms. While ample vector space models exist for text data, the evolution aspect of evolving text corpora is still missing from vector-based representations. The advent of word embeddings has provided a way to create a contextual vector space, but these embeddings do not yet capture the temporal aspects of the feature space. Including the time aspect in the feature space yields a vector for every natural language element, such as a word or an entity, at every timestamp. Such temporal word vectors make it possible to track how the meaning of a word changes over time, in terms of the changes in its neighborhood. Moreover, a time-reflective text representation paves the way to a new set of text analytic abilities involving time series for text collections. In this paper, we present the potential benefits of a time-reflective vector space model for temporal text data that is able to capture short- and long-term changes in the meaning of words. We compare our approach with the limited literature on dynamic embeddings. We present qualitative and quantitative evaluations using semantic evolution tracking as the target application.
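The neighborhood-based view of semantic change described above can be illustrated with a minimal sketch: given per-timestamp word vectors (toy 2-D vectors here; in practice they would come from a time-reflective embedding model, not from this hypothetical helper code), we find a word's nearest neighbors at each timestamp and measure drift as the Jaccard distance between the two neighbor sets. All names (`nearest_neighbors`, `neighborhood_drift`) and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nearest_neighbors(embeddings, word, k=2):
    """Return the k words whose vectors are most cosine-similar to `word`."""
    words = list(embeddings)
    vecs = np.array([embeddings[w] for w in words])
    target = embeddings[word]
    sims = vecs @ target / (np.linalg.norm(vecs, axis=1) * np.linalg.norm(target))
    order = np.argsort(-sims)
    # Skip the query word itself (cosine similarity 1.0 with itself).
    return [words[i] for i in order if words[i] != word][:k]

def neighborhood_drift(emb_t1, emb_t2, word, k=2):
    """Jaccard distance between the word's neighbor sets at two timestamps."""
    n1 = set(nearest_neighbors(emb_t1, word, k))
    n2 = set(nearest_neighbors(emb_t2, word, k))
    return 1.0 - len(n1 & n2) / len(n1 | n2)

# Toy example: "apple" moves from fruit-space toward tech-space over time.
t1 = {"apple": np.array([1.0, 0.1]), "pear": np.array([0.9, 0.2]),
      "fruit": np.array([1.0, 0.0]), "phone": np.array([0.0, 1.0]),
      "computer": np.array([0.1, 1.0])}
t2 = dict(t1, apple=np.array([0.2, 1.0]))  # only "apple" has moved

print(nearest_neighbors(t1, "apple"))       # ['fruit', 'pear']
print(nearest_neighbors(t2, "apple"))       # ['computer', 'phone']
print(neighborhood_drift(t1, t2, "apple"))  # 1.0 (fully disjoint neighborhoods)
```

A drift of 1.0 signals a complete turnover of the word's neighborhood between the two timestamps; values near 0 indicate a stable meaning.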


