Pitfalls of Static Language Modelling

02/03/2021
by Angeliki Lazaridou, et al.

Our world is open-ended, non-stationary and constantly evolving; thus what we talk about and how we talk about it changes over time. This inherent dynamic nature of language comes in stark contrast to the current static language modelling paradigm, which constructs training and evaluation sets from overlapping time periods. Despite recent progress, we demonstrate that state-of-the-art Transformer models perform worse in the realistic setup of predicting future utterances from beyond their training period – a consistent pattern across three datasets from two domains. We find that, while increasing model size alone – a key driver behind recent progress – does not provide a solution for the temporal generalization problem, having models that continually update their knowledge with new information can indeed slow down the degradation over time. Hence, given the compilation of ever-larger language modelling training datasets, combined with the growing list of language-model-based NLP applications that require up-to-date knowledge about the world, we argue that now is the right time to rethink our static language modelling evaluation protocol, and develop adaptive language models that can remain up-to-date with respect to our ever-changing and non-stationary world.
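The contrast the abstract draws is between the conventional "static" evaluation protocol, where training and test data are sampled from overlapping time periods, and the realistic "temporal" setup, where a model trained on data up to some cutoff must predict utterances from after that cutoff. A minimal sketch of the two splits, using a hypothetical timestamped corpus (all names and data below are illustrative, not from the paper):

```python
from datetime import date
import random

# Hypothetical corpus of (timestamp, text) pairs for illustration only.
corpus = [
    (date(2018, 1, 5), "doc about 2018 events"),
    (date(2018, 6, 1), "another 2018 doc"),
    (date(2019, 3, 9), "doc about 2019 events"),
    (date(2019, 11, 2), "late 2019 doc"),
    (date(2020, 4, 20), "doc about 2020 events"),
    (date(2020, 9, 15), "another 2020 doc"),
]

def static_split(corpus, test_frac=0.5, seed=0):
    """Conventional split: train and test are drawn at random,
    so their time periods overlap."""
    docs = corpus[:]
    random.Random(seed).shuffle(docs)
    k = int(len(docs) * (1 - test_frac))
    return docs[:k], docs[k:]

def temporal_split(corpus, cutoff):
    """Temporal split: train strictly before the cutoff,
    evaluate only on documents from after it."""
    train = [d for d in corpus if d[0] < cutoff]
    test = [d for d in corpus if d[0] >= cutoff]
    return train, test

train, test = temporal_split(corpus, cutoff=date(2020, 1, 1))
# Every training document predates every test document.
assert all(t < date(2020, 1, 1) for t, _ in train)
assert all(t >= date(2020, 1, 1) for t, _ in test)
```

Under the static split, test documents can come from the same period as training documents, so the model is never forced to generalize to genuinely new information; the temporal split is what exposes the degradation the paper measures.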

