Private Language Model Adaptation for Speech Recognition

09/28/2021
by   Zhe Liu, et al.

Speech model adaptation is crucial for handling the discrepancy between server-side proxy training data and the actual data received on users' local devices. Using federated learning (FL), we introduce an efficient approach for continuously adapting neural network language models (NNLMs) on private devices, with applications to automatic speech recognition (ASR). To address potential speech transcription errors in the on-device training corpus, we perform empirical studies comparing various strategies for leveraging token-level confidence scores to improve NNLM quality in the FL setting. Experiments show that, compared with no model adaptation, the proposed method achieves relative improvements of 2.6% or more on two speech evaluation datasets. We also provide an analysis evaluating the privacy guarantees of the presented procedure.
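
As a rough, hypothetical sketch (not the authors' implementation), the core idea of leveraging token-level confidence scores can be expressed as a confidence-weighted language-model loss computed on each device, with the resulting client updates combined by standard federated averaging. All names below (confidence_weighted_nll, fedavg, the tensor shapes) are illustrative assumptions, and down-weighting low-confidence tokens is only one of several possible weighting strategies.

```python
# Illustrative sketch only; names and shapes are assumptions, not the paper's code.
import torch
import torch.nn as nn


def confidence_weighted_nll(logits, targets, confidences):
    """Cross-entropy over ASR-transcribed tokens, down-weighting tokens
    whose ASR confidence is low.

    logits:      (batch, seq_len, vocab) NNLM predictions
    targets:     (batch, seq_len) token ids from on-device ASR transcripts
    confidences: (batch, seq_len) token-level confidence scores in [0, 1]
    """
    per_token = nn.functional.cross_entropy(
        logits.transpose(1, 2), targets, reduction="none"
    )  # shape: (batch, seq_len)
    weighted = per_token * confidences
    return weighted.sum() / confidences.sum().clamp(min=1e-8)


def fedavg(global_state, client_states, client_sizes):
    """Plain federated averaging: combine client NNLM parameters,
    weighting each client by the amount of local data it trained on."""
    total = float(sum(client_sizes))
    return {
        key: sum(state[key] * (n / total)
                 for state, n in zip(client_states, client_sizes))
        for key in global_state
    }
```

In such a setup, each device would fine-tune its local NNLM copy on recent ASR transcripts using the weighted loss, and only the parameter updates, not the transcripts themselves, would leave the device for aggregation.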
