Tiny-Sepformer: A Tiny Time-Domain Transformer Network for Speech Separation

06/28/2022
by   Jian Luo, et al.

Time-domain Transformer neural networks have proven their superiority in speech separation tasks. However, these models usually have a large number of parameters and therefore often run into GPU memory limits. In this paper, we propose Tiny-Sepformer, a tiny Transformer network for speech separation. We present two techniques to reduce the model parameters and memory consumption: (1) the Convolution-Attention (CA) block, which splits the vanilla Transformer layer into two paths, multi-head attention and 1D depthwise separable convolution, and (2) parameter sharing, which shares layer parameters within the CA block. In our experiments, Tiny-Sepformer greatly reduces the model size while achieving separation performance comparable to the vanilla Sepformer on the WSJ0-2/3Mix datasets.
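The two techniques in the abstract can be illustrated with a short sketch. This is a hypothetical PyTorch illustration, not the authors' implementation: the exact split ratio, normalization, and layer layout of Tiny-Sepformer are assumptions here. The idea shown is (1) splitting the feature dimension between a multi-head attention path and a 1D depthwise separable convolution path, which halves the attention width, and (2) reusing one CA block across all layers so the stack carries the parameter count of a single layer.

```python
import torch
import torch.nn as nn


class CABlock(nn.Module):
    """Sketch of a Convolution-Attention (CA) block (details assumed):
    the input features are split in two; one half goes through
    multi-head attention, the other through a 1D depthwise separable
    convolution, and the halves are concatenated with a residual."""

    def __init__(self, d_model: int, n_heads: int = 4, kernel_size: int = 3):
        super().__init__()
        assert d_model % 2 == 0
        half = d_model // 2
        self.attn = nn.MultiheadAttention(half, n_heads, batch_first=True)
        # depthwise separable conv = depthwise conv + pointwise (1x1) conv
        self.dw = nn.Conv1d(half, half, kernel_size,
                            padding=kernel_size // 2, groups=half)
        self.pw = nn.Conv1d(half, half, 1)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, d_model)
        a, c = x.chunk(2, dim=-1)           # split into the two paths
        a, _ = self.attn(a, a, a)           # attention path
        c = self.pw(self.dw(c.transpose(1, 2))).transpose(1, 2)  # conv path
        return self.norm(torch.cat([a, c], dim=-1) + x)  # merge + residual


class SharedStack(nn.Module):
    """Parameter sharing: a single CA block applied n_layers times,
    so depth grows without adding parameters."""

    def __init__(self, d_model: int, n_layers: int = 4):
        super().__init__()
        self.block = CABlock(d_model)
        self.n_layers = n_layers

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for _ in range(self.n_layers):
            x = self.block(x)
        return x
```

With this scheme, a 4-layer shared stack has exactly the parameter count of one CA block, and the attention operates at half the model width, which is where the memory savings come from.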


Related research

08/13/2020: Continuous Speech Separation with Conformer
Continuous speech separation plays a vital role in complicated speech re...

02/19/2021: TransMask: A Compact and Fast Speech Separation Model Based on Transformer
Speech separation is an important problem in speech processing, which ta...

07/28/2020: Dual-Path Transformer Network: Direct Context-Aware Modeling for End-to-End Monaural Speech Separation
The dominant speech separation models are based on complex recurrent or ...

10/31/2019: Parameter Sharing Decoder Pair for Auto Composing
Auto Composing is an active and appealing research area in the past few ...

06/19/2022: Resource-Efficient Separation Transformer
Transformers have recently achieved state-of-the-art performance in spee...

04/11/2023: Sim-T: Simplify the Transformer Network by Multiplexing Technique for Speech Recognition
In recent years, a great deal of attention has been paid to the Transfor...

08/30/2023: Dual-path Transformer Based Neural Beamformer for Target Speech Extraction
Neural beamformers, which integrate both pre-separation and beamforming ...
