BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model

02/11/2019
by Alex Wang, et al.

We show that BERT (Devlin et al., 2018) is a Markov random field language model. Formulating BERT in this way gives rise to a natural procedure for sampling sentences from it. We sample sentences from BERT and find that it can produce high-quality, fluent generations. Compared to the generations of a traditional left-to-right language model, BERT's generations are more diverse but of slightly worse quality.
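The "natural procedure" referenced in the abstract amounts to drawing sentences from BERT's masked-language-model conditionals. The sketch below illustrates one such Gibbs-style sampler using the Hugging Face transformers library; it is a minimal illustration, not the authors' released code, and the model name, sequence length, number of sweeps, and temperature are assumed values chosen for demonstration.

```python
# Sketch of Gibbs-style sampling from BERT's masked-LM conditionals.
# Assumes the Hugging Face `transformers` library; hyperparameters are illustrative.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def sample_sentence(seq_len=10, num_sweeps=5, temperature=1.0):
    # Start from an all-[MASK] sequence wrapped in [CLS] ... [SEP].
    ids = torch.full((1, seq_len), tokenizer.mask_token_id, dtype=torch.long)
    ids = torch.cat([
        torch.tensor([[tokenizer.cls_token_id]]),
        ids,
        torch.tensor([[tokenizer.sep_token_id]]),
    ], dim=1)

    with torch.no_grad():
        for _ in range(num_sweeps):
            # One sweep: re-sample each position from BERT's conditional
            # distribution given all the other tokens in the sequence.
            for pos in range(1, seq_len + 1):  # skip [CLS] and [SEP]
                masked = ids.clone()
                masked[0, pos] = tokenizer.mask_token_id
                logits = model(input_ids=masked).logits[0, pos] / temperature
                probs = torch.softmax(logits, dim=-1)
                ids[0, pos] = torch.multinomial(probs, num_samples=1)

    return tokenizer.decode(ids[0, 1:seq_len + 1])

print(sample_sentence())
```

Starting from an all-mask sequence and repeatedly re-sampling one position at a time is what makes the MRF view convenient: each masked-LM prediction is treated as a conditional of the joint model over the sentence.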
