Enhancing into the codec: Noise Robust Speech Coding with Vector-Quantized Autoencoders

02/12/2021
by Jonah Casebeer, et al.

Audio codecs based on discretized neural autoencoders have recently been developed and shown to provide significantly higher compression at comparable speech quality. However, these models are tightly coupled to clean speech content and produce unintended outputs in noisy conditions. Building on VQ-VAE autoencoders with WaveRNN decoders, we develop compressor-enhancer encoders and accompanying decoders, and show that they operate well in noisy conditions. We also observe that a compressor-enhancer model performs better on clean speech inputs than a compressor model trained only on clean speech.
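The abstract describes a VQ-VAE codec: the encoder's continuous latent frames are snapped to their nearest entries in a learned codebook, and only the integer code indices need to be transmitted, which is where the compression comes from. The paper does not publish code here; the following is a minimal NumPy sketch of that quantization bottleneck, with illustrative function names, shapes, and codebook size that are assumptions, not taken from the paper:

```python
import numpy as np

def vq_bottleneck(z, codebook):
    """Quantize encoder latents z (T, D) to the nearest codebook entries.

    Returns (quantized vectors, integer code indices). The indices are
    what a codec would transmit: roughly T * log2(K) bits for K codes.
    """
    # Squared Euclidean distance from each latent frame to each of K codes.
    dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = dists.argmin(axis=1)  # (T,) discrete codes to transmit
    return codebook[idx], idx

# Toy example: 4 frames of 2-D latents, codebook of K = 8 entries.
rng = np.random.default_rng(0)
z = rng.normal(size=(4, 2))
codebook = rng.normal(size=(8, 2))
zq, codes = vq_bottleneck(z, codebook)
```

At training time a VQ-VAE typically passes gradients through this non-differentiable lookup with a straight-through estimator and adds codebook/commitment losses; in the paper's compressor-enhancer setup, the encoder is additionally trained so that noisy input maps to codes that decode to clean speech.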


Related research

- 11/10/2019 · Robust Unsupervised Audio-visual Speech Enhancement Using a Mixture of Variational Autoencoders
  Recently, an audio-visual speech generative model based on variational a...

- 06/16/2019 · Parametric Resynthesis with neural vocoders
  Noise suppression systems generally produce output speech with compromise...

- 02/17/2021 · Variational Autoencoder for Speech Enhancement with a Noise-Aware Encoder
  Recently, a generative variational autoencoder (VAE) has been proposed f...

- 07/07/2022 · NESC: Robust Neural End-2-End Speech Coding with GANs
  Neural networks have proven to be a formidable tool to tackle the proble...

- 02/23/2021 · Handling Background Noise in Neural Speech Generation
  Recent advances in neural-network based generative modeling of speech ha...

- 05/16/2020 · Improved Prosody from Learned F0 Codebook Representations for VQ-VAE Speech Waveform Reconstruction
  Vector Quantized Variational AutoEncoders (VQ-VAE) are a powerful repres...

- 12/04/2022 · Generative Models for Improved Naturalness, Intelligibility, and Voicing of Whispered Speech
  This work adapts two recent architectures of generative models and evalu...
