
Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling

by   Shruti Bhosale, et al.

Pre-training models on vast quantities of unlabeled data has emerged as an effective approach to improving accuracy on many NLP tasks. On the other hand, traditional machine translation has a long history of leveraging unlabeled data through noisy channel modeling. The same idea has recently been shown to achieve strong improvements for neural machine translation. Unfortunately, naïve noisy channel modeling with modern sequence to sequence models is up to an order of magnitude slower than alternatives. We address this issue by introducing efficient approximations to make inference with the noisy channel approach as fast as strong ensembles while increasing accuracy. We also show that the noisy channel approach can outperform strong pre-training results by achieving a new state of the art on WMT Romanian-English translation.
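The noisy channel approach described above scores a candidate translation not only with the direct model p(y|x), but also with a channel model p(x|y) and a language model p(y), combining their log-probabilities. As a minimal sketch of this reranking idea (the function name, weights, and toy scores below are illustrative assumptions, not the paper's implementation):

```python
def noisy_channel_rerank(candidates, direct_lp, channel_lp, lm_lp,
                         lam_ch=1.0, lam_lm=1.0):
    """Pick the candidate y maximizing the noisy channel objective
        log p(y|x) + lam_ch * log p(x|y) + lam_lm * log p(y).
    direct_lp, channel_lp, lm_lp map each candidate to a log-probability
    under the direct model, channel model, and language model respectively.
    In practice these would come from trained sequence models; here they
    are plain dictionaries for illustration."""
    def score(y):
        return direct_lp[y] + lam_ch * channel_lp[y] + lam_lm * lm_lp[y]
    return max(candidates, key=score)


# Toy example: the direct model slightly prefers candidate "A", but the
# channel model and language model both favor "B", flipping the decision.
cands = ["A", "B"]
direct = {"A": -1.0, "B": -1.2}
channel = {"A": -3.0, "B": -1.5}
lm = {"A": -2.0, "B": -1.8}
print(noisy_channel_rerank(cands, direct, channel, lm))
# "B" wins: -1.2 - 1.5 - 1.8 = -4.5 vs. "A": -1.0 - 3.0 - 2.0 = -6.0
```

Scoring p(x|y) with a full sequence-to-sequence channel model for every candidate is what makes naïve noisy channel decoding slow; the paper's contribution is approximations that cut this cost to that of a strong ensemble.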




Simple and Effective Noisy Channel Modeling for Neural Machine Translation

Previous work on neural noisy channel modeling relied on latent variable...

JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation

Neural machine translation (NMT) needs large parallel corpora for state-...

Profile Prediction: An Alignment-Based Pre-Training Task for Protein Sequence Models

For protein sequence datasets, unlabeled data has greatly outpaced label...

The Neural Noisy Channel

We formulate sequence to sequence transduction as a noisy channel decodi...

Synthetic Pre-Training Tasks for Neural Machine Translation

Pre-training is an effective technique for ensuring robust performance o...

Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data

Neural machine translation systems have become state-of-the-art approach...

Amortized Noisy Channel Neural Machine Translation

Noisy channel models have been especially effective in neural machine tr...