One of the challenges for current sequence-to-sequence (seq2seq) models ...
Building open-domain chatbots is a challenging area for machine learning...
This paper demonstrates that multilingual denoising pre-training produce...
We present BART, a denoising autoencoder for pretraining sequence-to-seq...
Language model pretraining has led to significant performance gains but ...
We present SpanBERT, a pre-training method that is designed to better re...
Most machine translation systems generate text autoregressively, by sequ...
We present a new approach for pretraining a bi-directional transformer m...