Resource-Efficient Separation Transformer

06/19/2022
by Cem Subakan, et al.

Transformers have recently achieved state-of-the-art performance in speech separation. These models, however, are computationally demanding and require many learnable parameters. This paper explores Transformer-based speech separation with a reduced computational cost. Our main contribution is the development of the Resource-Efficient Separation Transformer (RE-SepFormer), a self-attention-based architecture that reduces the computational burden in two ways. First, it uses non-overlapping blocks in the latent space. Second, it operates on compact latent summaries calculated from each chunk. The RE-SepFormer achieves competitive performance on the popular WSJ0-2Mix and WHAM! datasets in both causal and non-causal settings. Remarkably, it scales significantly better than previous Transformer- and RNN-based architectures in terms of memory and inference time, making it more suitable for processing long mixtures.
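To make the two cost-saving ideas concrete, below is a minimal PyTorch sketch of one separation block: self-attention runs within non-overlapping chunks (halving the local-attention work of overlapped dual-path models), and global context flows through one summary vector per chunk, so the inter-chunk attention sees n_chunks tokens rather than n_frames. The class and parameter names (RESepFormerBlock, chunk_size), the mean pooling, and the residual broadcast of summaries back to frames are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn


class RESepFormerBlock(nn.Module):
    """Sketch of one block: intra-attention over non-overlapping chunks,
    then inter-attention over one compact summary per chunk."""

    def __init__(self, dim: int, chunk_size: int, n_heads: int = 4):
        super().__init__()
        self.chunk_size = chunk_size
        # Local modelling: attention within each non-overlapping chunk.
        self.intra = nn.TransformerEncoderLayer(
            d_model=dim, nhead=n_heads, batch_first=True
        )
        # Global modelling at low cost: attention over chunk summaries,
        # so the sequence length here is n_chunks, not n_frames.
        self.inter = nn.TransformerEncoderLayer(
            d_model=dim, nhead=n_heads, batch_first=True
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, dim); pad so time divides evenly into chunks.
        b, t, d = x.shape
        pad = (-t) % self.chunk_size
        x = nn.functional.pad(x, (0, 0, 0, pad))
        n_chunks = x.shape[1] // self.chunk_size

        # 1) Non-overlapping chunking: unlike 50%-overlap dual-path models,
        #    each frame belongs to exactly one chunk.
        chunks = x.reshape(b * n_chunks, self.chunk_size, d)
        chunks = self.intra(chunks)

        # 2) Compact summaries: mean-pool each chunk to a single vector
        #    (pooling choice is an assumption) and attend across chunks.
        chunks = chunks.reshape(b, n_chunks, self.chunk_size, d)
        summaries = self.inter(chunks.mean(dim=2))  # (batch, n_chunks, dim)

        # Broadcast each refined summary back to its chunk's frames.
        out = chunks + summaries.unsqueeze(2)
        return out.reshape(b, n_chunks * self.chunk_size, d)[:, :t]


if __name__ == "__main__":
    block = RESepFormerBlock(dim=64, chunk_size=100)
    latent = torch.randn(2, 950, 64)  # (batch, frames, features)
    print(block(latent).shape)        # torch.Size([2, 950, 64])
```

Because the inter-chunk attention cost grows with the number of chunks rather than the number of frames, memory and inference time scale much more gently with mixture length, which is the scaling behaviour the abstract highlights.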

Related research

On Using Transformers for Speech-Separation (02/06/2022)
Transformers have enabled major improvements in deep learning. They ofte...

TransMask: A Compact and Fast Speech Separation Model Based on Transformer (02/19/2021)
Speech separation is an important problem in speech processing, which ta...

Continuous Speech Separation with Conformer (08/13/2020)
Continuous speech separation plays a vital role in complicated speech re...

Tiny-Sepformer: A Tiny Time-Domain Transformer Network for Speech Separation (06/28/2022)
Time-domain Transformer neural networks have proven their superiority in...

Attention is All You Need in Speech Separation (10/25/2020)
Recurrent Neural Networks (RNNs) have long been the dominant architectur...

Ultra Fast Speech Separation Model with Teacher Student Learning (04/27/2022)
Transformer has been successfully applied to speech separation recently ...

Latent Bottlenecked Attentive Neural Processes (11/15/2022)
Neural Processes (NPs) are popular methods in meta-learning that can est...
