Pitfalls of Static Language Modelling

02/03/2021
by Angeliki Lazaridou et al.

Our world is open-ended, non-stationary and constantly evolving; thus what we talk about, and how we talk about it, change over time. This inherently dynamic nature of language stands in stark contrast to the current static language modelling paradigm, which constructs training and evaluation sets from overlapping time periods. Despite recent progress, we demonstrate that state-of-the-art Transformer models perform worse in the realistic setup of predicting future utterances from beyond their training period, a consistent pattern across three datasets from two domains. We find that, while increasing model size alone, a key driver behind recent progress, does not solve the temporal generalization problem, models that continually update their knowledge with new information can indeed slow the degradation over time. Given the compilation of ever-larger language modelling training datasets, combined with the growing list of language-model-based NLP applications that require up-to-date knowledge about the world, we argue that now is the right time to rethink the static language modelling evaluation protocol, and to develop adaptive language models that remain up-to-date with respect to our ever-changing, non-stationary world.
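For concreteness, here is a minimal sketch of the evaluation setup the abstract describes: train on text from before a cutoff date, measure perplexity only on text from after it, and optionally keep updating the model as new documents arrive. It assumes timestamped documents and a causal language model from the Hugging Face transformers library; the cutoff date, the "gpt2" checkpoint and the helper names are illustrative stand-ins, not the authors' actual setup.

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

CUTOFF = "2019-09-30"  # illustrative cutoff date, not the paper's

def time_stratified_split(docs, cutoff=CUTOFF):
    """Split (timestamp, text) pairs into a training set drawn from before
    the cutoff and an evaluation set drawn strictly from after it -- the
    opposite of the usual overlapping random split."""
    train = [text for ts, text in docs if ts <= cutoff]
    future = [text for ts, text in docs if ts > cutoff]
    return train, future

@torch.no_grad()
def perplexity(model, tokenizer, texts, device="cpu"):
    """Corpus-level perplexity of a causal LM over a list of texts."""
    model.eval()
    total_nll, total_tokens = 0.0, 0
    for text in texts:
        enc = tokenizer(text, return_tensors="pt", truncation=True).to(device)
        out = model(**enc, labels=enc["input_ids"])
        n = enc["input_ids"].numel() - 1  # loss averages over n-1 shifted targets
        total_nll += out.loss.item() * n
        total_tokens += n
    return math.exp(total_nll / total_tokens)

def update_on_new_data(model, tokenizer, new_texts, lr=5e-5, device="cpu"):
    """One fine-tuning pass over newly arrived documents: a crude stand-in
    for a model that continually updates its knowledge."""
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for text in new_texts:
        enc = tokenizer(text, return_tensors="pt", truncation=True).to(device)
        model(**enc, labels=enc["input_ids"]).loss.backward()
        opt.step()
        opt.zero_grad()
    return model

# Usage: measure degradation on future-period text, then see whether a
# light update on documents that arrived after the cutoff recovers some
# of it. "gpt2" stands in for whichever pre-trained causal LM is studied.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
# docs = [("2019-01-12", "..."), ("2020-03-04", "..."), ...]  # timestamped corpus
# past, future = time_stratified_split(docs)
# ppl_static = perplexity(model, tokenizer, future)
# model = update_on_new_data(model, tokenizer, new_texts=future)
# ppl_updated = perplexity(model, tokenizer, future)
```

Under the static protocol, training and evaluation texts would instead be drawn from the same overlapping period; the gap between the two perplexities is the temporal degradation the paper measures, and the update step is the kind of continual adaptation it argues can slow that degradation.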
