
Imputer: Sequence Modelling via Imputation and Dynamic Programming

02/20/2020
by William Chan, et al.

This paper presents the Imputer, a neural sequence model that generates output sequences iteratively via imputation. The Imputer is an iterative generative model, requiring only a constant number of generation steps, independent of the number of input or output tokens. It can be trained to approximately marginalize over all possible alignments between the input and output sequences, and over all possible generation orders. We present a tractable dynamic programming training algorithm, which yields a lower bound on the log marginal likelihood. When applied to end-to-end speech recognition, the Imputer outperforms prior non-autoregressive models and achieves results competitive with autoregressive models. On LibriSpeech test-other, the Imputer achieves 11.1 WER, outperforming CTC at 13.0 WER and seq2seq at 12.5 WER.
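The dynamic-programming loss described here is closely related to the forward algorithm used by CTC, which sums over every monotonic alignment between input frames and output tokens in log space. The sketch below illustrates that style of alignment marginalization only; it is not the paper's exact training objective, and the function name log_marginal_alignments, the NumPy setting, and the blank-token convention are assumptions made for the example.

import numpy as np

def log_marginal_alignments(log_probs, target, blank=0):
    # Illustrative CTC-style forward algorithm (not the Imputer's exact DP):
    # marginalize, in log space, over all monotonic alignments between
    # T input frames and the target token sequence.
    #   log_probs: (T, V) per-frame log-probabilities from the model
    #   target:    list of token ids, with no blanks
    # Assumes T is large enough to admit at least one valid alignment.

    # Interleave blanks: y1 y2 -> blank y1 blank y2 blank
    ext = [blank]
    for tok in target:
        ext += [tok, blank]
    S, T = len(ext), log_probs.shape[0]

    # alpha[s] = log-prob mass of all alignment prefixes ending at ext[s]
    alpha = np.full(S, -np.inf)
    alpha[0] = log_probs[0, ext[0]]
    if S > 1:
        alpha[1] = log_probs[0, ext[1]]

    for t in range(1, T):
        prev = alpha.copy()
        for s in range(S):
            cands = [prev[s]]                       # stay on the same symbol
            if s >= 1:
                cands.append(prev[s - 1])           # advance one symbol
            if s >= 2 and ext[s] != blank and ext[s] != ext[s - 2]:
                cands.append(prev[s - 2])           # skip over a blank
            alpha[s] = np.logaddexp.reduce(cands) + log_probs[t, ext[s]]

    # Valid alignments end on the last token or the trailing blank.
    tail = alpha[S - 2] if S > 1 else -np.inf
    return np.logaddexp(alpha[S - 1], tail)

The Imputer's dynamic program additionally conditions on a partially imputed alignment and marginalizes over generation orders, which this plain CTC-style recursion does not capture.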


Related research

04/16/2020 · Non-Autoregressive Machine Translation with Latent Alignments
This paper investigates two latent alignment models for non-autoregressi...

10/24/2020 · Align-Refine: Non-Autoregressive Speech Recognition via Iterative Realignment
Non-autoregressive models greatly improve decoding speed over typical se...

05/24/2022 · EdiT5: Semi-Autoregressive Text-Editing with T5 Warm-Start
We present EdiT5 - a novel semi-autoregressive text-editing approach des...

10/28/2020 · CASS-NAT: CTC Alignment-based Single Step Non-autoregressive Transformer for Speech Recognition
We propose a CTC alignment-based single step non-autoregressive transfor...

06/02/2020 · Surprisal-Triggered Conditional Computation with Neural Networks
Autoregressive neural network models have been used successfully for seq...

01/20/2022 · Interval-Memoized Backtracking on ZDDs for Fast Enumeration of All Lower Cost Solutions
In this paper, we propose a fast method for exactly enumerating a very l...