Once-for-All Sequence Compression for Self-Supervised Speech Models

11/04/2022
by   Hsuan-Jui Chen, et al.

The sequence length along the time axis is often the dominant factor in the computational cost of self-supervised speech models. Prior works have reduced the sequence length to lower this cost. However, different downstream tasks have different tolerances for sequence compression, so a model with a fixed compression rate may not fit all tasks. In this work, we introduce a once-for-all (OFA) sequence compression framework for self-supervised speech models that supports a continuous range of compression rates. The framework is evaluated on various tasks, showing only marginal degradation compared to fixed-rate variants, with a smooth performance-efficiency trade-off. We further explore adaptive compression rate learning, demonstrating the ability to select task-specific preferred frame periods without a grid search.
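The core idea of variable-rate compression along the time axis can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's actual module: here compression is modeled as average-pooling every `rate` consecutive frames, and the `compress_sequence` function name and its padding scheme are assumptions for demonstration. A once-for-all model would see many different rates during training so that a single network supports any of them at inference time.

```python
import numpy as np

def compress_sequence(frames: np.ndarray, rate: int) -> np.ndarray:
    """Compress a (time, dim) feature sequence along the time axis
    by average-pooling every `rate` consecutive frames.

    Hypothetical sketch: the paper's actual compression module is
    not reproduced here.
    """
    t, d = frames.shape
    pad = (-t) % rate  # pad so the time axis is divisible by `rate`
    if pad:
        frames = np.pad(frames, ((0, pad), (0, 0)), mode="edge")
    # Group frames into windows of length `rate` and average each window.
    return frames.reshape(-1, rate, d).mean(axis=1)

# A single model can then serve any rate in a supported range:
features = np.random.randn(100, 8)          # 100 frames, 8-dim features
for rate in (1, 2, 4):                       # e.g. 20 ms, 40 ms, 80 ms frame periods
    compressed = compress_sequence(features, rate)
    print(rate, compressed.shape[0])         # sequence length shrinks with rate
```

During OFA-style training, the rate would typically be sampled at random per batch so the upstream model generalizes across the whole range; a downstream task can then pick its preferred frame period without retraining.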


