Investigation of Using VAE for i-Vector Speaker Verification

05/25/2017
by Timur Pekhovsky, et al.

A new system for i-vector speaker recognition based on the variational autoencoder (VAE) is investigated. The VAE is a promising approach to building accurate deep nonlinear generative models of complex data. Experiments show that the VAE provides speaker embeddings and can be trained effectively in an unsupervised manner. A log-likelihood ratio (LLR) estimate for the VAE is developed, and experiments on NIST SRE 2010 data demonstrate its correctness. We also show that the performance of the VAE-based system in i-vector space is close to that of diagonal PLDA. Several interesting results emerge from experiments with the β-VAE: in particular, we find that for β ≪ 1 the β-VAE can be trained to capture the features of complex input data distributions effectively, which is hard to achieve with the standard VAE (β = 1).
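To make the β-VAE objective mentioned above concrete, here is a minimal NumPy sketch of the β-weighted negative ELBO, where β scales the KL term and β = 1 recovers the standard VAE. The function name and the toy inputs are illustrative, not taken from the paper; the paper's actual model and LLR scoring are not reproduced here.

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, log_var, beta=1.0):
    """Negative ELBO with a beta-weighted KL term.

    beta = 1 gives the standard VAE objective; beta << 1
    down-weights the KL regularizer, the regime the abstract
    reports as easier to train on complex input distributions.
    (Illustrative sketch; not the paper's implementation.)
    """
    # Gaussian reconstruction error (up to additive constants)
    recon = np.sum((x - x_recon) ** 2)
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) )
    kl = 0.5 * np.sum(mu ** 2 + np.exp(log_var) - log_var - 1.0)
    return recon + beta * kl

# Toy example: a perfect reconstruction with a posterior that
# matches the standard-normal prior gives a loss of exactly 0.
x = np.zeros(4)
x_recon = np.zeros(4)
mu = np.zeros(2)
log_var = np.zeros(2)
loss = beta_vae_loss(x, x_recon, mu, log_var, beta=0.1)
```

Shrinking `beta` leaves the reconstruction term untouched while relaxing the pull of the posterior toward the prior, which is the trade-off the β ≪ 1 experiments probe.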

