Improving Language Plasticity via Pretraining with Active Forgetting

07/03/2023
by Yihong Chen, et al.

Pretrained language models (PLMs) are today the primary model for natural language processing. Despite their impressive downstream performance, it can be difficult to apply PLMs to new languages, a barrier to making their capabilities universally accessible. While prior work has shown it is possible to address this issue by learning a new embedding layer for the new language, doing so is both data and compute inefficient. We propose to use an active forgetting mechanism during pretraining, as a simple way of creating PLMs that can quickly adapt to new languages. Concretely, by resetting the embedding layer every K updates during pretraining, we encourage the PLM to improve its ability to learn new embeddings within a limited number of updates, similar to a meta-learning effect. Experiments with RoBERTa show that models pretrained with our forgetting mechanism not only demonstrate faster convergence during language adaptation but also outperform standard ones in a low-data regime, particularly for languages that are distant from English.
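The core mechanism described above is simple: periodically re-initialize the token embedding layer during pretraining while leaving the transformer body untouched. The sketch below illustrates this idea in PyTorch under assumptions of my own: the toy TinyMaskedLM model, the reset interval K = 1000, the random-batch training loop, and the normal(0, 0.02) re-initialization are all illustrative choices, not the authors' exact RoBERTa setup.

```python
# Minimal sketch of active forgetting: re-initialize only the embedding
# layer every K optimizer updates, keeping the transformer body's weights.
import torch
import torch.nn as nn

K = 1000  # assumed reset interval (updates between embedding resets)
VOCAB_SIZE, D_MODEL = 1000, 64


class TinyMaskedLM(nn.Module):
    """Toy encoder used only to illustrate where the reset step fits."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.body = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, token_ids):
        return self.lm_head(self.body(self.embed(token_ids)))


model = TinyMaskedLM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(1, 3001):
    # Dummy batch of token ids; a real run would use masked-LM batches.
    tokens = torch.randint(0, VOCAB_SIZE, (8, 32))
    logits = model(tokens)
    loss = loss_fn(logits.view(-1, VOCAB_SIZE), tokens.view(-1))

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Active forgetting: periodically re-initialize the embeddings only,
    # so the body must remain useful under freshly learned embeddings.
    if step % K == 0:
        nn.init.normal_(model.embed.weight, mean=0.0, std=0.02)
```

At adaptation time, the same reset is applied once more for the new language's vocabulary: the body is kept (and typically frozen) while a fresh embedding layer is trained, which is where the faster convergence reported in the abstract comes from.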

Related research

04/27/2020
Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting
Deep pretrained language models have achieved great success in the way o...

11/10/2019
CamemBERT: a Tasty French Language Model
Pretrained language models are now ubiquitous in Natural Language Proces...

06/06/2021
Meta-learning for downstream aware and agnostic pretraining
Neural network pretraining is gaining attention due to its outstanding p...

05/20/2023
Lifelong Language Pretraining with Distribution-Specialized Experts
Pretraining on a large-scale corpus has become a standard method to buil...

12/20/2022
Mini-Model Adaptation: Efficiently Extending Pretrained Models to New Languages via Aligned Shallow Training
Prior work has shown that it is possible to expand pretrained Masked Lan...

10/12/2021
OpenHands: Making Sign Language Recognition Accessible with Pose-based Pretrained Models across Languages
AI technologies for Natural Languages have made tremendous progress rece...

07/18/2023
UniTabE: Pretraining a Unified Tabular Encoder for Heterogeneous Tabular Data
Recent advancements in Natural Language Processing (NLP) have witnessed ...
