Sparse Architectures for Text-Independent Speaker Verification Using Deep Neural Networks

05/19/2018
by Sara Sedighi, et al.

Network pruning is important because it eliminates unimportant weights and features that become active due to network over-parametrization. Enforcing sparsity helps prevent overfitting and speeds up computation. Given the large number of parameters in deep architectures, network compression is critical because of the enormous computational power otherwise required. In this work, we impose structured sparsity for speaker verification, the task of validating a query speaker against a speaker gallery. We show that sparsity enforcement alone can improve verification results by counteracting the initial overfitting in the network.
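Structured sparsity, as opposed to unstructured weight pruning, zeros out entire groups of weights (for example, all incoming weights of a neuron) so the corresponding unit can be removed outright. A common way to achieve this is a group-lasso penalty, whose proximal operator performs block soft-thresholding on each group. The sketch below is illustrative only and is not the paper's method or data: it applies proximal gradient steps to a toy quadratic objective with a group-lasso term and shows weak weight columns being driven exactly to zero. All values (`T`, `lam`) are made-up assumptions.

```python
import numpy as np

def prox_group_lasso(W, tau):
    """Block soft-thresholding: shrink each column's L2 norm by tau.
    Columns whose norm falls below tau are set entirely to zero,
    which is what makes the sparsity *structured*."""
    norms = np.linalg.norm(W, axis=0, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return W * scale

# Hypothetical target weights: columns 0 and 2 are weak ("unimportant"),
# columns 1 and 3 carry real signal.
T = np.array([[ 0.05,  2.0, -0.02,  1.5],
              [ 0.01, -1.0,  0.03,  0.5],
              [-0.04,  0.5,  0.02, -2.0]])

# Proximal gradient descent on 0.5*||W - T||_F^2 + lam * sum_g ||W_g||_2
W = np.zeros_like(T)
lam, lr = 0.3, 1.0
for _ in range(50):
    W = prox_group_lasso(W - lr * (W - T), lr * lam)

col_norms = np.linalg.norm(W, axis=0)
pruned = int((col_norms == 0).sum())   # whole columns eliminated
```

After convergence the two low-magnitude columns are exactly zero, so the corresponding units could be dropped from the architecture, while the strong columns survive (slightly shrunk by the penalty).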


