
Time Masking for Temporal Language Models

10/12/2021
by Guy D. Rosin, et al.

Our world is constantly evolving, and so is the content on the web. Consequently, our languages, often said to mirror the world, are dynamic in nature. However, most current contextual language models are static and cannot adapt to changes over time. In this work, we propose a temporal contextual language model called TempoBERT, which uses time as an additional context for texts. Our technique is based on modifying texts with temporal information and performing time masking: masking applied specifically to the supplementary time information. We leverage our approach for the tasks of semantic change detection and sentence time prediction, experimenting on diverse datasets in terms of time, size, genre, and language. Our extensive evaluation shows that both tasks benefit from exploiting time masking.
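To make the idea of time masking concrete, the sketch below prepends a special time token (e.g. "<1850>") to each sentence and masks it during masked language modeling, so the model learns to predict when a sentence was written. The token format, the masking probabilities, and the whitespace tokenization are illustrative assumptions rather than the paper's exact configuration; in the actual model the time token would presumably be added to the tokenizer vocabulary and predicted with the standard MLM head.

```python
import random

MASK_TOKEN = "[MASK]"

def add_time_token(text, year):
    """Prepend a special time token (e.g. "<1850>") to the text."""
    return f"<{year}> {text}"

def time_mask(tokens, time_mask_prob=0.5, word_mask_prob=0.15):
    """Randomly mask tokens for MLM training.

    The leading time token is masked with its own (higher) probability.
    Returns (masked_tokens, labels), where labels holds the original
    token at masked positions and None elsewhere.
    """
    masked, labels = [], []
    for i, tok in enumerate(tokens):
        # Position 0 is assumed to be the time token; probabilities are illustrative.
        prob = time_mask_prob if i == 0 else word_mask_prob
        if random.random() < prob:
            masked.append(MASK_TOKEN)
            labels.append(tok)
        else:
            masked.append(tok)
            labels.append(None)
    return masked, labels

# Example: a sentence written in 1850, tokenized by whitespace for simplicity.
tokens = add_time_token("the coach stopped at the inn", 1850).split()
masked_tokens, labels = time_mask(tokens)
print(masked_tokens)
print(labels)
```

In this setup, asking the trained model to fill a masked time token corresponds to sentence time prediction, and masking regular words under different time tokens is one way the same scheme can support semantic change detection.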


Related Research

11/23/2021
Using Distributional Principles for the Semantic Study of Contextual Language Models
Many studies were recently done for investigating the properties of cont...

02/03/2021
Pitfalls of Static Language Modelling
Our world is open-ended, non-stationary and constantly evolving; thus wh...

02/04/2022
Temporal Attention for Language Models
Pretrained language models based on the transformer architecture have sh...

09/11/2019
Learning Dynamic Author Representations with Temporal Language Models
Language models are at the heart of numerous works, notably in the text ...

11/10/2012
Dating Texts without Explicit Temporal Cues
This paper tackles temporal resolution of documents, such as determining...

04/12/2022
Do Not Fire the Linguist: Grammatical Profiles Help Language Models Detect Semantic Change
Morphological and syntactic changes in word usage (as captured, e.g., by...

09/09/2021
Filling the Gaps in Ancient Akkadian Texts: A Masked Language Modelling Approach
We present models which complete missing text given transliterations of ...

Code Repositories

tempobert

Code & Data for the Paper "Time Masking for Temporal Language Models", WSDM 2022

