Promising Accurate Prefix Boosting for sequence-to-sequence ASR

11/07/2018
by Murali Karthick Baskar, et al.

In this paper, we present promising accurate prefix boosting (PAPB), a discriminative training technique for attention-based sequence-to-sequence (seq2seq) ASR. PAPB is devised to unify the training and testing schemes in an effective manner. The training procedure maximizes the score of each correct partial sequence (prefix) obtained during beam search relative to the other hypotheses in the beam. The training objective also includes minimization of the token (character) error rate. PAPB shows its efficacy by achieving 10.8% and 3.8% WER with and without RNNLM, respectively, on the Wall Street Journal dataset.
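
The abstract only outlines the objective, so as a rough illustration the toy Python sketch below shows one way a prefix-boosting loss of this flavour could be computed for a single beam-search step. The beam representation, the softmax normalisation over beam scores, the expected-edit-distance term, and names such as papb_style_loss and risk_weight are illustrative assumptions, not the formulation used in the paper.

    import math


    def edit_distance(a, b):
        # Levenshtein distance between two token sequences (one-row DP).
        dp = list(range(len(b) + 1))
        for i, ta in enumerate(a, 1):
            prev, dp[0] = dp[0], i
            for j, tb in enumerate(b, 1):
                prev, dp[j] = dp[j], min(dp[j] + 1,          # deletion
                                         dp[j - 1] + 1,      # insertion
                                         prev + (ta != tb))  # substitution
        return dp[-1]


    def papb_style_loss(beam, reference, risk_weight=1.0):
        # Toy prefix-boosting objective for one beam-search step (assumed form).
        # beam      : list of (prefix, log_score) pairs kept by the beam search.
        # reference : ground-truth character sequence.
        # Term 1 boosts the in-beam probability of prefixes that match the start
        # of the reference; term 2 is the expected edit distance of the partial
        # hypotheses, a proxy for the character error rate named in the abstract.
        log_z = math.log(sum(math.exp(s) for _, s in beam))
        probs = [(prefix, math.exp(s - log_z)) for prefix, s in beam]

        accurate_mass = sum(p for prefix, p in probs
                            if list(prefix) == list(reference[:len(prefix)]))
        boost_term = -math.log(max(accurate_mass, 1e-12))

        risk_term = sum(p * edit_distance(prefix, reference[:len(prefix)])
                        for prefix, p in probs)
        return boost_term + risk_weight * risk_term


    if __name__ == "__main__":
        ref = list("cat sat")
        beam = [(list("cat "), -0.9),   # correct prefix
                (list("cap "), -1.1),   # one substitution
                (list("bat "), -1.4)]   # one substitution
        print(round(papb_style_loss(beam, ref), 3))

Running the sketch prints a single scalar loss for a three-hypothesis beam; in a real system this quantity would be computed from the differentiable decoder scores and back-propagated.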

Related research

12/05/2017
Minimum Word Error Rate Training for Attention-based Sequence-to-Sequence Models
Sequence-to-sequence models, such as attention-based models in automatic...

11/12/2018
Sequence-Level Knowledge Distillation for Model Compression of Attention-based Sequence-to-Sequence Speech Recognition
We investigate the feasibility of sequence-level knowledge distillation ...

11/11/2022
Exploring Sequence-to-Sequence Transformer-Transducer Models for Keyword Spotting
In this paper, we present a novel approach to adapt a sequence-to-sequen...

10/02/2018
Optimal Completion Distillation for Sequence Learning
We present Optimal Completion Distillation (OCD), a training procedure f...

09/14/2019
Integrating Source-channel and Attention-based Sequence-to-sequence Models for Speech Recognition
This paper proposes a novel automatic speech recognition (ASR) framework...

05/20/2018
Learning compositionally through attentive guidance
In this paper, we introduce Attentive Guidance (AG), a new mechanism to ...

08/03/2017
Sensor Transformation Attention Networks
Recent work on encoder-decoder models for sequence-to-sequence mapping h...
