
Improving noise robustness of automatic speech recognition via parallel data and teacher-student learning

01/05/2019
by Ladislav Mošner, et al.
Brno University of Technology
Amazon

For real-world speech recognition applications, noise robustness remains a challenge. In this work, we adopt the teacher-student (T/S) learning technique with a parallel clean and noisy corpus to improve automatic speech recognition (ASR) performance under multimedia noise. On top of that, we apply a logits selection method that preserves only the k highest values, which prevents a wrong emphasis of knowledge from the teacher and reduces the bandwidth needed for transferring data. We incorporate up to 8,000 hours of untranscribed data for training and report results on sequence-trained models in addition to cross-entropy-trained ones. The best sequence-trained student model yields a relative word error rate (WER) reduction of approximately 10.1% on the clean test set, with further reductions on the simulated noisy and real test sets, compared to a sequence-trained teacher.
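The top-k logits selection described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a PyTorch setup with frame-level senone posteriors, and the function name, tensor shapes, and default k are assumptions introduced here for clarity.

```python
# Minimal sketch of T/S distillation with top-k teacher logits selection.
# Assumes PyTorch; shapes and defaults are illustrative, not from the paper.
import torch
import torch.nn.functional as F

def topk_distillation_loss(student_logits, teacher_logits, k=20, temperature=1.0):
    """Distillation loss restricted to the teacher's k highest posteriors.

    student_logits, teacher_logits: tensors of shape (batch, frames, num_senones).
    The teacher is run on the clean utterance, the student on the parallel
    noisy utterance. Keeping only the k largest teacher values per frame
    avoids emphasizing unreliable low-probability targets and shrinks the
    amount of teacher data that has to be transferred.
    """
    # Teacher posteriors on the clean signal.
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)

    # Keep the k highest teacher values per frame and renormalize them
    # so the retained mass forms a valid distribution.
    topk_probs, topk_idx = teacher_probs.topk(k, dim=-1)
    topk_probs = topk_probs / topk_probs.sum(dim=-1, keepdim=True)

    # Student log-posteriors on the noisy signal, gathered at the same indices.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    student_topk = student_log_probs.gather(-1, topk_idx)

    # Cross entropy between the renormalized teacher targets and the student.
    return -(topk_probs * student_topk).sum(dim=-1).mean()
```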


05/21/2023

On the Efficacy and Noise-Robustness of Jointly Learned Speech Emotion and Automatic Speech Recognition

New-age conversational agent systems perform both speech emotion recogni...
05/10/2021

Voice activity detection in the wild: A data-driven approach using teacher-student training

Voice activity detection is an essential pre-processing component for sp...
08/17/2022

Analyzing Robustness of End-to-End Neural Models for Automatic Speech Recognition

We investigate robustness properties of pre-trained neural models for au...
09/06/2020

Non causal deep learning based dereverberation

In this paper we demonstrate the effectiveness of non-causal context for...
04/02/2019

Unsupervised training of neural mask-based beamforming

We present an unsupervised training approach for a neural network-based ...
11/09/2022

Improving Noisy Student Training on Non-target Domain Data for Automatic Speech Recognition

Noisy Student Training (NST) has recently demonstrated extremely strong ...