CUE Vectors: Modular Training of Language Models Conditioned on Diverse Contextual Signals

03/16/2022
by Scott Novotney, et al.

We propose a framework to modularize the training of neural language models that use diverse forms of sentence-external context (including metadata) by eliminating the need to jointly train sentence-external and within-sentence encoders. Our approach, contextual universal embeddings (CUE), trains LMs on one set of context types, such as date and author, and adapts to novel metadata types, such as article title or previous sentence. The model consists of a pretrained neural sentence LM, a BERT-based context encoder, and a masked transformer decoder that estimates LM probabilities using sentence-internal and sentence-external information. When context or metadata are unavailable, our model learns to combine contextual and sentence-internal information using noisy oracle unigram embeddings as a proxy. Real contextual information can be introduced later and used to adapt a small number of parameters that map contextual data into the decoder's embedding space. We validate the CUE framework on a NYTimes text corpus with multiple metadata types, for which the LM perplexity can be lowered from 36.6 to 27.4 by conditioning on context. Bootstrapping a contextual LM with only a subset of the context/metadata during training retains 85% of the achievable gain. Training the model initially with proxy context retains 67% of the perplexity gain after adapting to real context. Furthermore, we can swap one type of pretrained sentence LM for another without retraining the context encoders, by only adapting the decoder model. Overall, we obtain a modular framework that allows incremental, scalable training of context-enhanced LMs.
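
The abstract describes the architecture only at a high level. Below is a minimal PyTorch-style sketch of how such a design might be wired together: frozen sentence-internal token states, context embeddings from an external encoder (e.g. BERT), and a small adapter that maps those embeddings into the masked decoder's space. The class names (CueAdapter, CueLM), dimensions, and layer counts are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class CueAdapter(nn.Module):
    """Maps context-encoder embeddings into the decoder's space; these are
    the small number of parameters adapted for a new metadata type."""

    def __init__(self, ctx_dim: int, model_dim: int):
        super().__init__()
        self.proj = nn.Linear(ctx_dim, model_dim)

    def forward(self, ctx_emb: torch.Tensor) -> torch.Tensor:
        return self.proj(ctx_emb)


class CueLM(nn.Module):
    """Masked transformer decoder that fuses sentence-internal states with
    sentence-external (context) embeddings to predict the next token."""

    def __init__(self, vocab_size: int, model_dim: int = 512, ctx_dim: int = 768):
        super().__init__()
        # Stand-in for the hidden states of a pretrained sentence LM (frozen in CUE).
        self.tok_emb = nn.Embedding(vocab_size, model_dim)
        self.adapter = CueAdapter(ctx_dim, model_dim)
        layer = nn.TransformerDecoderLayer(d_model=model_dim, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.out = nn.Linear(model_dim, vocab_size)

    def forward(self, tokens: torch.Tensor, ctx_emb: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq); ctx_emb: (batch, n_ctx, ctx_dim), e.g. BERT
        # embeddings of date/author/title, or noisy oracle unigram proxies.
        x = self.tok_emb(tokens)
        memory = self.adapter(ctx_emb)  # sentence-external information as decoder memory
        seq_len = tokens.size(1)
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        h = self.decoder(tgt=x, memory=memory, tgt_mask=causal)
        return self.out(h)  # next-token logits


if __name__ == "__main__":
    model = CueLM(vocab_size=1000)
    logits = model(torch.randint(0, 1000, (2, 16)), torch.randn(2, 3, 768))
    print(logits.shape)  # torch.Size([2, 16, 1000])
```

Under a layout like this, supporting a new metadata type would only require fitting a new adapter projection while the decoder and pretrained encoders stay fixed, which is the kind of modularity the abstract claims; the actual parameterization in the paper may differ.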
