Supervised Attention in Sequence-to-Sequence Models for Speech Recognition

04/25/2022
by Gene-Ping Yang, et al.

The attention mechanism in sequence-to-sequence models is designed to model the alignment between acoustic features and output tokens in speech recognition. However, attention weights produced by models trained end to end do not always correspond well with actual alignments, and several studies have further argued that attention weights might not even correspond well with the relevance attribution of frames. Regardless, visual similarity between attention weights and alignments is widely used during training as an indicator of a model's quality. In this paper, we treat the correspondence between attention weights and alignments as a learning problem by imposing a supervised attention loss. Experiments show significantly improved performance, suggesting that learning the alignments well during training critically determines the performance of sequence-to-sequence models.
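The abstract does not spell out the form of the supervised attention loss, but a common way to realize the idea is a cross-entropy (equivalently, KL-divergence) term between the model's attention distribution and a reference alignment, added to the usual token-level loss. The sketch below illustrates that approach; the names (supervised_attention_loss, attn_weights, ref_alignment, lambda_attn) are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of a supervised attention loss, assuming a PyTorch
# seq2seq model that exposes its per-token attention weights over
# encoder frames. All names here are hypothetical.
import torch


def supervised_attention_loss(attn_weights: torch.Tensor,
                              ref_alignment: torch.Tensor,
                              eps: float = 1e-8) -> torch.Tensor:
    """Cross-entropy between predicted attention and a reference alignment.

    attn_weights:  (batch, num_tokens, num_frames); each row is a softmax
                   output over encoder frames, so it sums to 1.
    ref_alignment: (batch, num_tokens, num_frames); a 0/1 (or soft)
                   alignment matrix, e.g. from forced alignment,
                   row-normalized to sum to 1.
    """
    # Elementwise cross-entropy, summed over frames, averaged over
    # tokens and batch. eps guards against log(0).
    return -(ref_alignment * torch.log(attn_weights + eps)).sum(-1).mean()


# During training this would be combined with the standard token loss,
# weighted by a hypothetical hyperparameter lambda_attn:
#   total_loss = ce_loss + lambda_attn * supervised_attention_loss(attn, align)
```

Under this formulation the loss is zero only when the attention distribution matches the reference alignment exactly, so it directly pushes the attention weights toward the actual alignments rather than relying on them to emerge from end-to-end training.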


Related research

An online sequence-to-sequence model for noisy speech recognition (06/16/2017)
Generative models have long been the dominant approach for speech recogn...

On the Difficulty of Segmenting Words with Attention (09/21/2021)
Word segmentation, the problem of finding word boundaries in speech, is ...

Explaining the Attention Mechanism of End-to-End Speech Recognition Using Decision Trees (10/08/2021)
The attention mechanism has largely improved the performance of end-to-e...

Integrating Source-channel and Attention-based Sequence-to-sequence Models for Speech Recognition (09/14/2019)
This paper proposes a novel automatic speech recognition (ASR) framework...

Multi-Dialect Speech Recognition With A Single Sequence-To-Sequence Model (12/05/2017)
Sequence-to-sequence models provide a simple and elegant solution for bu...

Learning Online Alignments with Continuous Rewards Policy Gradient (08/03/2016)
Sequence-to-sequence models with soft attention had significant success ...

Copy this Sentence (05/23/2019)
Attention is an operation that selects some largest element from some se...
