Attention is All You Need in Speech Separation

10/25/2020
by Cem Subakan, et al.

Recurrent Neural Networks (RNNs) have long been the dominant architecture in sequence-to-sequence learning. RNNs, however, are inherently sequential models that do not allow parallelization of their computations. Transformers are emerging as a natural alternative to standard RNNs, replacing recurrent computations with a multi-head attention mechanism. In this paper, we propose the SepFormer, a novel RNN-free Transformer-based neural network for speech separation. The SepFormer learns short- and long-term dependencies with a multi-scale approach that employs Transformers. The proposed model matches or surpasses the state-of-the-art (SOTA) performance on the standard WSJ0-2mix and WSJ0-3mix datasets. It achieves an SI-SNRi of 20.2 dB on WSJ0-2mix, matching the SOTA, and an SI-SNRi of 17.6 dB on WSJ0-3mix, a new SOTA result. The SepFormer inherits the parallelization advantages of Transformers and achieves competitive performance even when downsampling the encoded representation by a factor of 8. It is thus significantly faster and less memory-demanding than the latest RNN-based systems.
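For intuition, here is a minimal PyTorch sketch of the multi-scale idea described in the abstract: the encoded representation is split into chunks, an intra-chunk Transformer attends within each chunk to model short-term dependencies, and an inter-chunk Transformer attends across chunks to model long-term dependencies. All names, layer counts, and dimensions below are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class DualScaleTransformerBlock(nn.Module):
    """Illustrative sketch of multi-scale (intra/inter-chunk) attention.

    Hypothetical configuration for demonstration only; the actual SepFormer
    hyperparameters differ.
    """

    def __init__(self, d_model=256, n_heads=8, n_layers=2):
        super().__init__()

        def make_encoder():
            layer = nn.TransformerEncoderLayer(
                d_model, n_heads, dim_feedforward=1024, batch_first=True)
            return nn.TransformerEncoder(layer, num_layers=n_layers)

        self.intra = make_encoder()  # attention within each chunk (short-term)
        self.inter = make_encoder()  # attention across chunks (long-term)

    def forward(self, x):
        # x: (batch, n_chunks, chunk_len, d_model)
        b, s, k, d = x.shape
        # Intra-chunk: fold chunks into the batch, attend over chunk_len.
        x = self.intra(x.reshape(b * s, k, d)).reshape(b, s, k, d)
        # Inter-chunk: fold positions into the batch, attend over n_chunks.
        x = x.transpose(1, 2)                          # (b, k, s, d)
        x = self.inter(x.reshape(b * k, s, d)).reshape(b, k, s, d)
        return x.transpose(1, 2)                       # back to (b, s, k, d)

# Usage: 4 chunks of 250 frames each, 256-dimensional encoded features.
block = DualScaleTransformerBlock()
y = block(torch.randn(2, 4, 250, 256))
print(y.shape)  # torch.Size([2, 4, 250, 256])
```

Because both attention passes operate on folded batches rather than stepping through time, the whole block parallelizes across frames, which is the source of the speed and memory advantages over RNN-based separators mentioned above.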
