Streaming Transformer-based Acoustic Models Using Self-attention with Augmented Memory

05/16/2020
by Chunyang Wu et al.

Transformer-based acoustic modeling has achieved great success for both hybrid and sequence-to-sequence speech recognition. However, it requires access to the full sequence, and the computational cost grows quadratically with respect to the input sequence length. These factors limit its adoption for streaming applications. In this work, we propose a novel augmented-memory self-attention, which attends on a short segment of the input sequence and a bank of memories. The memory bank stores the embedding information for all the processed segments. On the LibriSpeech benchmark, our proposed method outperforms all existing streamable transformer methods by a large margin and achieves over 15% relative error reduction compared with the widely used LC-BLSTM baseline. Our findings are also confirmed on some large internal datasets.
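The mechanism the abstract describes lends itself to a short illustration: process the utterance segment by segment, let each segment attend over the memory bank concatenated with the segment itself, and summarize the processed segment into a new memory slot. The PyTorch sketch below is a minimal, hypothetical rendering of that idea, not the paper's implementation: it assumes single-head attention, uses mean-pooling to form the summarization query, and omits the left/right context frames and multi-head projections a full model would use. The name AugmentedMemoryAttention and all parameter choices are ours.

```python
import torch
import torch.nn.functional as F

class AugmentedMemoryAttention(torch.nn.Module):
    """Minimal single-head sketch of segment-wise self-attention with an
    augmented memory bank (hypothetical names, not the paper's code)."""

    def __init__(self, d_model):
        super().__init__()
        self.q = torch.nn.Linear(d_model, d_model)
        self.k = torch.nn.Linear(d_model, d_model)
        self.v = torch.nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x, segment_len):
        # x: (T, d_model); consume the utterance one segment at a time
        memory = []    # bank: one embedding per already-processed segment
        outputs = []
        for start in range(0, x.size(0), segment_len):
            seg = x[start:start + segment_len]               # (S, d)
            # Summarization query: mean-pool the segment (one simple choice)
            summary = seg.mean(dim=0, keepdim=True)          # (1, d)
            # Keys/values cover the memory bank plus the current segment
            bank = torch.cat(memory + [seg]) if memory else seg
            q = self.q(torch.cat([seg, summary]))            # (S+1, d)
            k, v = self.k(bank), self.v(bank)
            att = F.softmax((q @ k.t()) * self.scale, dim=-1) @ v
            outputs.append(att[:-1])                         # per-frame outputs
            # The summary's attention output becomes the new memory slot;
            # detach() is a simplification that stops gradients through the bank
            memory.append(att[-1:].detach())
        return torch.cat(outputs)
```

As a usage check, AugmentedMemoryAttention(32)(torch.randn(100, 32), segment_len=16) returns a (100, 32) tensor; each new segment attends over at most 16 frames plus one summary embedding per previous segment, rather than over every past frame, which is what makes segment-by-segment streaming feasible.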


Related research

10/23/2019
A Transformer with Interleaved Self-attention and Convolution for Hybrid Acoustic Models
Transformer with self-attention has achieved great success in the area o...

03/26/2018
Self-Attentional Acoustic Models
Self-attention is a method of encoding sequences of vectors by relating ...

10/22/2019
Transformer-based Acoustic Modeling for Hybrid Speech Recognition
We propose and evaluate transformer-based acoustic models (AMs) for hybr...

06/14/2023
When to Use Efficient Self Attention? Profiling Text, Speech and Image Transformer Variants
We present the first unified study of the efficiency of self-attention-b...

04/17/2020
Highway Transformer: Self-Gating Enhanced Self-Attentive Networks
Self-attention mechanisms have made striking state-of-the-art (SOTA) pro...

11/17/2020
s-Transformer: Segment-Transformer for Robust Neural Speech Synthesis
Neural end-to-end text-to-speech (TTS), which adopts either a recurrent...

10/13/2022
Why self-attention is Natural for Sequence-to-Sequence Problems? A Perspective from Symmetries
In this paper, we show that structures similar to self-attention are nat...
