BLOOM+1: Adding Language Support to BLOOM for Zero-Shot Prompting

12/19/2022
by Zheng Xin Yong, et al.

The BLOOM model is a large open-source multilingual language model capable of zero-shot learning, but its pretraining was limited to 46 languages. To improve its zero-shot performance on unseen languages, it is desirable to adapt BLOOM, yet previous work has only explored adapting small language models. In this work, we apply existing language adaptation strategies to BLOOM and benchmark its zero-shot prompting performance on eight new languages. We find language adaptation to be effective at improving zero-shot performance in new languages. Surprisingly, adapter-based finetuning is more effective than continued pretraining for large models. In addition, we discover that prompting performance is not significantly affected by language specifics such as the writing system; it is primarily determined by the size of the language adaptation data. We also add new languages to BLOOMZ, a multitask finetuned version of BLOOM capable of following task instructions zero-shot. We find that including a new language in the multitask finetuning mixture is the most effective method of teaching BLOOMZ a new language. We conclude that, with sufficient training data, language adaptation can generalize well to diverse languages. Our code is available at <https://github.com/bigscience-workshop/multilingual-modeling/>.
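To make the adapter-based strategy concrete, here is a minimal sketch of adapting a BLOOM checkpoint to a new language using LoRA via the Hugging Face `peft` library. LoRA stands in for the broader family of adapter methods the paper benchmarks; the checkpoint size, corpus file, and hyperparameters below are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch: adapter-based language adaptation of BLOOM with LoRA.
# Assumes the Hugging Face `transformers`, `datasets`, and `peft` libraries.
# Checkpoint, corpus path, and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "bigscience/bloom-560m"  # small BLOOM variant, for illustration

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Freeze the base model and inject low-rank adapter weights into the
# attention projections; only the adapter parameters are trained.
lora_config = LoraConfig(
    task_type="CAUSAL_LM",
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # BLOOM's fused attention projection
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Monolingual text in the new language (hypothetical local file).
raw = load_dataset("text", data_files={"train": "new_language_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_data = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bloom-adapted",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        learning_rate=1e-4,
        logging_steps=50,
    ),
    train_dataset=train_data,
    # Causal LM objective: the collator pads batches and copies inputs to labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

# After adaptation, evaluate with zero-shot prompts in the new language.
prompt = "A task prompt written in the newly added language goes here."  # placeholder
inputs = tokenizer(prompt, return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=50)[0]))
```

Because only the injected low-rank matrices are updated while the base weights stay frozen, this kind of adapter finetuning is far cheaper than continued pretraining of all parameters, which is what makes it attractive at BLOOM's scale.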

Related research

11/03/2022 · Crosslingual Generalization through Multitask Finetuning
Multitask prompted finetuning (MTF) has been shown to help large languag...

05/12/2022 · Multi Task Learning For Zero Shot Performance Prediction of Multilingual Models
Massively Multilingual Transformer based Language Models have been obser...

08/06/2021 · Towards Zero-shot Language Modeling
Can we construct a neural model that is inductively biased towards learn...

11/14/2022 · AdaptKeyBERT: An Attention-Based approach towards Few-Shot Zero-Shot Domain Adaptation of KeyBERT
Keyword extraction has been an important topic for modern natural langua...

03/21/2022 · Match the Script, Adapt if Multilingual: Analyzing the Effect of Multilingual Pretraining on Cross-lingual Transferability
Pretrained multilingual models enable zero-shot learning even for unseen...

02/26/2020 · Towards Zero-shot Learning for Automatic Phonemic Transcription
Automatic phonemic transcription tools are useful for low-resource langu...

06/24/2022 · DetIE: Multilingual Open Information Extraction Inspired by Object Detection
State of the art neural methods for open information extraction (OpenIE)...
