Life after BERT: What do Other Muppets Understand about Language?

05/21/2022
by   Vladislav Lialin, et al.

Existing analyses of pre-trained transformers usually focus on only one or two model families at a time, overlooking the variability of architectures and pre-training objectives. In our work, we utilize the oLMpics benchmark and psycholinguistic probing datasets for a diverse set of 29 models, including T5, BART, and ALBERT. Additionally, we adapt the oLMpics zero-shot setup for autoregressive models and evaluate GPT networks of different sizes. Our findings show that none of these models can resolve compositional questions in a zero-shot fashion, suggesting that this skill is not learnable using existing pre-training objectives. Furthermore, we find that global model decisions, such as architecture, directionality, size of the dataset, and pre-training objective, are not predictive of a model's linguistic capabilities.


