Learning Dynamic Author Representations with Temporal Language Models

by Edouard Delasalles, et al.

Language models are at the heart of numerous works, notably in the text mining and information retrieval communities. These statistical models aim to capture word distributions, ranging from simple unigram models to recurrent approaches with latent variables that model subtle dependencies in text. However, these models are learned from word sequences alone; authors' identities and publication dates are seldom considered. We propose a neural model, based on recurrent language modeling, that captures language diffusion tendencies in author communities over time. By conditioning language models on author and temporal vector states, we are able to leverage the latent dependencies between text contexts. This allows us to outperform several temporal and non-temporal language-model baselines on two real-world corpora, and to learn meaningful author representations that evolve through time.
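The core idea of conditioning a language model on author and temporal vector states can be illustrated with a toy sketch. The code below is not the paper's architecture (which is recurrent): it is a minimal next-word model in which a context vector is formed by combining a per-author and a per-time latent vector, so that the resulting word distribution depends on who is writing and when. All names (`author_vecs`, `time_vecs`, the vocabulary) are hypothetical placeholders.

```python
import math
import random

random.seed(0)

VOCAB = ["neural", "network", "probability", "corpus", "model"]
DIM = 8

def rand_vec(dim):
    """A small random vector standing in for a learned latent state."""
    return [random.gauss(0.0, 0.1) for _ in range(dim)]

# Hypothetical latent states: one vector per author and one per time step.
# In the paper these would be learned jointly with the language model.
author_vecs = {"author_a": rand_vec(DIM), "author_b": rand_vec(DIM)}
time_vecs = {2010: rand_vec(DIM), 2015: rand_vec(DIM)}
word_vecs = {w: rand_vec(DIM) for w in VOCAB}

def next_word_distribution(author, year):
    # Condition on author and time: the context state is the sum of the
    # author vector and the temporal vector (a stand-in for the recurrent
    # hidden state that a full temporal language model would condition on).
    ctx = [a + t for a, t in zip(author_vecs[author], time_vecs[year])]
    # Score each vocabulary word by its dot product with the context,
    # then normalize with a softmax to get a probability distribution.
    scores = [sum(c * w for c, w in zip(ctx, word_vecs[v])) for v in VOCAB]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return {v: e / z for v, e in zip(VOCAB, exps)}

dist_a = next_word_distribution("author_a", 2010)
dist_b = next_word_distribution("author_b", 2010)
```

Because the author and time vectors enter the score directly, two authors (or the same author at two dates) induce different word distributions, which is the mechanism that lets such a model track how an author's language drifts over time.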




