CoLLD: Contrastive Layer-to-layer Distillation for Compressing Multilingual Pre-trained Speech Encoders

09/14/2023
by Heng-Jui Chang, et al.

Large-scale self-supervised pre-trained speech encoders outperform conventional approaches in speech recognition and translation tasks. Because these large models are costly to develop, it is infeasible to build new encoders for every task and deploy them to on-device applications. Prior studies propose model compression methods to address this issue, but those works focus on smaller models and less realistic tasks. Thus, we propose Contrastive Layer-to-layer Distillation (CoLLD), a novel knowledge distillation method that compresses pre-trained speech encoders by leveraging masked prediction and contrastive learning to train student models to copy the behavior of a large teacher model. CoLLD outperforms prior methods and closes the gap between small and large models on multilingual speech-to-text translation and recognition benchmarks.
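The abstract describes the objective only at a high level: the student is trained with masked prediction and a contrastive loss to match the teacher's layer outputs. Below is a minimal sketch of what such a layer-to-layer contrastive (InfoNCE-style) loss can look like for a single student/teacher layer pair. The function name `colld_loss`, the `temperature` value, and the single-pair signature are illustrative assumptions, not the authors' implementation; in the paper the loss would presumably be aggregated over several layer pairs.

```python
import torch
import torch.nn.functional as F

def colld_loss(student_feats: torch.Tensor,
               teacher_feats: torch.Tensor,
               mask: torch.Tensor,
               temperature: float = 0.1) -> torch.Tensor:
    """Contrastive layer-to-layer loss for one student/teacher layer pair.

    student_feats, teacher_feats: (batch, time, dim) hidden states.
    mask: (batch, time) bool tensor, True where the student input was masked.
    """
    # Keep only the masked frames: the student must predict the teacher's
    # representation at positions it could not observe (masked prediction).
    s = F.normalize(student_feats[mask], dim=-1)   # (n_masked, dim)
    t = F.normalize(teacher_feats[mask], dim=-1)   # (n_masked, dim)

    # Cosine-similarity logits of each student frame against every teacher
    # frame; the aligned frame is the positive, all other frames act as
    # in-batch negatives (InfoNCE).
    logits = s @ t.T / temperature                 # (n_masked, n_masked)
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)

# Toy usage: batch of 2 utterances, 50 frames, 768-dim features.
student = torch.randn(2, 50, 768, requires_grad=True)
teacher = torch.randn(2, 50, 768)
mask = torch.rand(2, 50) < 0.3    # ~30% of frames masked
loss = colld_loss(student, teacher, mask)
loss.backward()
```

Treating every other teacher frame in the batch as a negative is what makes this a contrastive objective rather than a plain L1/L2 regression onto the teacher's activations.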

