Exploiting Redundancy in Pre-trained Language Models for Efficient Transfer Learning

04/08/2020
by   Fahim Dalvi, et al.

Large pre-trained contextual word representations have transformed the field of natural language processing, obtaining impressive results on a wide range of tasks. However, as models increase in size, computational limitations make them impractical for researchers and practitioners alike. We hypothesize that contextual representations have both intrinsic and task-specific redundancies. We propose a novel feature selection method that takes advantage of these redundancies to reduce the size of the pre-trained features. In a comprehensive evaluation on two pre-trained models, BERT and XLNet, using a diverse suite of sequence labeling and sequence classification tasks, our method reduces the feature set down to 1–7% of the original size while retaining most of the performance.
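The abstract does not spell out the selection procedure, but one plausible way to exploit intrinsic redundancy is a greedy correlation filter over the extracted feature matrix: keep a feature only if it is not highly correlated with any feature already kept. The sketch below is illustrative only (the function name, threshold, and greedy strategy are assumptions, not the paper's exact method):

```python
import numpy as np

def select_nonredundant_features(X, threshold=0.9):
    """Greedy redundancy filter (illustrative, not the paper's exact method).

    X: (n_samples, n_features) matrix of contextual representations.
    Keeps a feature only if its absolute Pearson correlation with every
    already-kept feature stays below `threshold`.
    Returns the list of kept column indices.
    """
    # Standardize columns so a column dot product / n equals Pearson correlation.
    Xs = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)
    n = X.shape[0]
    kept = []
    for j in range(X.shape[1]):
        col = Xs[:, j]
        if all(abs(col @ Xs[:, k]) / n < threshold for k in kept):
            kept.append(j)
    return kept
```

For example, if half the columns of X duplicate the other half, the filter keeps only one copy of each, cutting the feature set in half before any task-specific training.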


Related research:

- Prune Once for All: Sparse Pre-Trained Language Models (11/10/2021)
  Transformer-based language models are applied to a wide range of applica...

- Feature-informed Embedding Space Regularization For Audio Classification (06/10/2022)
  Feature representations derived from models pre-trained on large-scale d...

- The Effect of Model Size on Worst-Group Generalization (12/08/2021)
  Overparameterization is shown to result in poor test accuracy on rare su...

- A Dataset and Strong Baselines for Classification of Czech News Texts (07/20/2023)
  Pre-trained models for Czech Natural Language Processing are often evalu...

- Plug-Tagger: A Pluggable Sequence Labeling Framework Using Language Models (10/14/2021)
  Plug-and-play functionality allows deep learning models to adapt well to...

- Score Function Features for Discriminative Learning (12/19/2014)
  Feature learning forms the cornerstone for tackling challenging learning...

- Typhoon: Towards an Effective Task-Specific Masking Strategy for Pre-trained Language Models (03/27/2023)
  Through exploiting a high level of parallelism enabled by graphics proce...
