The Neural Noisy Channel

11/08/2016
by Lei Yu, et al.

We formulate sequence to sequence transduction as a noisy channel decoding problem and use recurrent neural networks to parameterise the source and channel models. Unlike direct models which can suffer from explaining-away effects during training, noisy channel models must produce outputs that explain their inputs, and their component models can be trained with not only paired training samples but also unpaired samples from the marginal output distribution. Using a latent variable to control how much of the conditioning sequence the channel model needs to read in order to generate a subsequent symbol, we obtain a tractable and effective beam search decoder. Experimental results on abstractive sentence summarisation, morphological inflection, and machine translation show that noisy channel models outperform direct models, and that they significantly benefit from increased amounts of unpaired output data that direct models cannot easily use.
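The core idea can be illustrated concretely: rather than modeling p(output | input) directly, a noisy channel decoder scores a candidate output y for input x as log p(x | y) + log p(y), where the channel model p(x | y) must explain the input and the prior p(y) can be trained on unpaired output data. A minimal sketch with toy stand-in models (the scoring functions below are illustrative placeholders, not the paper's recurrent networks):

```python
import math

def noisy_channel_score(x, y, channel_logprob, lm_logprob, lam=1.0):
    """Noisy channel objective: log p(x|y) + lam * log p(y).

    channel_logprob scores how well output y explains input x;
    lm_logprob is a prior over outputs that can be trained on
    unpaired output data.
    """
    return channel_logprob(x, y) + lam * lm_logprob(y)

# Toy channel model: reward candidates that explain (overlap with) the input.
def channel_logprob(x, y):
    overlap = len(set(x.split()) & set(y.split()))
    return math.log(0.1 + overlap)

# Toy language model: prefer shorter outputs.
def lm_logprob(y):
    return -0.5 * len(y.split())

x = "a cat sat on a mat"
candidates = ["the cat sat", "the cat sat on the mat quietly today"]
best = max(candidates,
           key=lambda y: noisy_channel_score(x, y, channel_logprob, lm_logprob))
print(best)  # the shorter candidate wins under this toy prior
```

In the paper this scoring is embedded in a beam search, with a latent variable controlling how much of the input the channel model reads before emitting each output symbol; the toy version above only shows the reranking objective itself.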


Related research

08/15/2019: Simple and Effective Noisy Channel Modeling for Neural Machine Translation
Previous work on neural noisy channel modeling relied on latent variable...

03/18/2021: Pretraining the Noisy Channel Model for Task-Oriented Dialogue
Direct decoding for task-oriented dialogue is known to suffer from the e...

11/13/2020: Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling
Pre-training models on vast quantities of unlabeled data has emerged as ...

09/26/2016: Online Segment to Segment Neural Transduction
We introduce an online neural sequence to sequence model that learns to ...

09/21/2020: Target Conditioning for One-to-Many Generation
Neural Machine Translation (NMT) models often lack diversity in their ge...

05/14/2019: Sparse Sequence-to-Sequence Models
Sequence-to-sequence models are a powerful workhorse of NLP. Most varian...

09/19/2018: Latent Topic Conversational Models
Latent variable models have been a preferred choice in conversational mo...
