BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model

02/11/2019
by Alex Wang et al.

We show that BERT (Devlin et al., 2018) is a Markov random field language model. Formulating BERT in this way gives rise to a natural procedure for sampling sentences from BERT. We sample sentences from BERT and find that it can produce high-quality, fluent generations. Compared to the generations of a traditional left-to-right language model, BERT generates sentences that are more diverse but of slightly worse quality.
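The MRF view suggests an iterative masking-and-resampling (Gibbs-style) sampler: start from a fully masked sequence, then repeatedly mask one position and redraw it from BERT's conditional distribution. The following is a minimal sketch of such a sampler, assuming the HuggingFace transformers library and a bert-base-uncased checkpoint; the sequence length, iteration count, and temperature are illustrative choices, not the authors' exact procedure or settings.

    # Sketch of a Gibbs-style sampler over a fixed-length sequence using BERT's
    # masked-LM head. All hyperparameters below are illustrative assumptions.
    import torch
    from transformers import BertTokenizer, BertForMaskedLM

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForMaskedLM.from_pretrained("bert-base-uncased")
    model.eval()

    def sample_sentence(seq_len=10, num_iters=200, temperature=1.0):
        # Start from [CLS] followed by seq_len [MASK] tokens and [SEP].
        ids = [tokenizer.cls_token_id] + [tokenizer.mask_token_id] * seq_len + [tokenizer.sep_token_id]
        ids = torch.tensor([ids])
        for _ in range(num_iters):
            # Pick a position (excluding [CLS]/[SEP]), mask it, and resample it
            # from BERT's conditional distribution over the vocabulary.
            pos = torch.randint(1, seq_len + 1, (1,)).item()
            masked = ids.clone()
            masked[0, pos] = tokenizer.mask_token_id
            with torch.no_grad():
                logits = model(masked).logits[0, pos] / temperature
            ids[0, pos] = torch.multinomial(torch.softmax(logits, dim=-1), 1).item()
        return tokenizer.decode(ids[0, 1:-1])

    print(sample_sentence())

Lowering the temperature trades diversity for fluency, which mirrors the abstract's comparison between BERT's samples and those of a left-to-right language model.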

