Exploring Low-Cost Transformer Model Compression for Large-Scale Commercial Reply Suggestions

11/27/2021
by   Vaishnavi Shrivastava, et al.

Fine-tuning pre-trained language models improves the quality of commercial reply suggestion systems, but at the cost of unsustainable training times. Popular training-time reduction approaches are resource intensive, so we explore low-cost model compression techniques like Layer Dropping and Layer Freezing. We demonstrate the efficacy of these techniques in large-data scenarios, reducing the training time of a commercial email reply suggestion system by 42% without affecting user engagement. We further study the robustness of these techniques to pre-trained model and dataset size ablation, and share several insights and recommendations for commercial applications.
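The two compression techniques named in the abstract can be illustrated with a toy PyTorch encoder. This is a minimal sketch, not the paper's implementation: the layer count, the every-other-layer keep schedule, and the number of frozen bottom layers are all hypothetical choices for illustration.

```python
import torch.nn as nn

def build_compressed_encoder(num_layers=12, keep_layers=(0, 2, 4, 6, 8, 10),
                             freeze_below=2, d_model=64, nhead=4):
    """Sketch of Layer Dropping + Layer Freezing on a toy Transformer stack.

    Layer Dropping: fine-tune only a subset of the pre-trained layers.
    Layer Freezing: disable gradients for the lowest kept layers so they
    are skipped during back-propagation, cutting training cost.
    """
    # Stand-in for a pre-trained stack of transformer encoder layers.
    full_stack = nn.ModuleList(
        nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        for _ in range(num_layers)
    )

    # Layer Dropping: keep every other layer (hypothetical schedule).
    kept = nn.ModuleList(full_stack[i] for i in keep_layers)

    # Layer Freezing: freeze the bottom `freeze_below` of the kept layers.
    for layer in list(kept)[:freeze_below]:
        for p in layer.parameters():
            p.requires_grad = False
    return kept

encoder = build_compressed_encoder()
trainable = [any(p.requires_grad for p in layer.parameters()) for layer in encoder]
```

After construction, only the upper kept layers receive gradient updates, so both the forward stack is shallower (dropping) and the backward pass touches fewer parameters (freezing).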

