Contextual BERT: Conditioning the Language Model Using a Global State

10/29/2020
by   Timo I. Denk, et al.

BERT is a popular language model whose main pre-training task is to fill in the blank, i.e., to predict a word that was masked out of a sentence based on the remaining words. In some applications, however, additional context can help the model make the right prediction, e.g., by taking the domain or the time of writing into account. This motivates us to advance the BERT architecture by adding a global state for conditioning on a fixed-size context. We present two novel approaches and apply them to an industry use case, where we complete fashion outfits with missing articles, conditioned on a specific customer. An experimental comparison with other methods from the literature shows that our methods significantly improve personalization.
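To make the idea of a global state concrete, here is a minimal sketch of one plausible conditioning scheme: project a fixed-size context vector (e.g., a customer embedding) into the model's hidden size and add it to every token embedding, so that all transformer layers see the global state. The function name, shapes, and projection `w_context` are illustrative assumptions, not necessarily either of the paper's two approaches.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_with_global_state(token_embeddings, context, w_context):
    """Condition token embeddings on a fixed-size context vector.

    Hypothetical scheme: project the context into the hidden size and
    add it to every position, broadcasting over the sequence dimension.
    """
    global_state = context @ w_context  # shape: (hidden,)
    return token_embeddings + global_state

# Toy shapes: sequence of 4 tokens, hidden size 8, context size 3
tokens = rng.normal(size=(4, 8))      # token embeddings
context = rng.normal(size=(3,))       # e.g., a customer embedding
w_context = rng.normal(size=(3, 8))   # learned projection (hypothetical)

conditioned = embed_with_global_state(tokens, context, w_context)
print(conditioned.shape)  # (4, 8)
```

In a real BERT-style model, the conditioned embeddings would replace the usual input embeddings before the first self-attention layer; the projection would be learned jointly with the rest of the network.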


Related research

- Latin BERT: A Contextual Language Model for Classical Philology (09/21/2020)
  We present Latin BERT, a contextual language model for the Latin languag...

- Improving Contextual Representation with Gloss Regularized Pre-training (05/13/2022)
  Though achieving impressive results on many NLP tasks, the BERT-like mas...

- BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model (02/11/2019)
  We show that BERT (Devlin et al., 2018) is a Markov random field languag...

- Conditional BERT Contextual Augmentation (12/17/2018)
  We propose a novel data augmentation method for labeled sentences called...

- Masked ELMo: An evolution of ELMo towards fully contextual RNN language models (10/08/2020)
  This paper presents Masked ELMo, a new RNN-based model for language mode...

- Inducing Syntactic Trees from BERT Representations (06/27/2019)
  We use the English model of BERT and explore how a deletion of one word ...

- Focusing More on Conflicts with Mis-Predictions Helps Language Pre-Training (12/16/2020)
  In this work, we propose to improve the effectiveness of language pre-tr...
