Adapting to the Long Tail: A Meta-Analysis of Transfer Learning Research for Language Understanding Tasks

11/02/2021
by Aakanksha Naik, et al.

Natural language understanding (NLU) has made massive progress driven by large benchmarks, paired with research on transfer learning to broaden its impact. Benchmarks are dominated by a small set of frequent phenomena, leaving a long tail of infrequent phenomena underrepresented. In this work, we reflect on the question: have transfer learning methods sufficiently addressed the performance of benchmark-trained models on the long tail? Since benchmarks do not list included/excluded phenomena, we conceptualize the long tail using macro-level dimensions such as underrepresented genres and topics. We assess trends in transfer learning research through a qualitative meta-analysis of 100 representative papers on transfer learning for NLU. Our analysis asks three questions: (i) Which long-tail dimensions do transfer learning studies target? (ii) Which properties help adaptation methods improve performance on the long tail? (iii) Which methodological gaps have the greatest negative impact on long-tail performance? Our answers to these questions highlight major avenues for future research in transfer learning for the long tail. Lastly, we present a case study comparing the performance of various adaptation methods on clinical narratives to show how systematically conducted meta-experiments can provide insights that enable us to make progress along these future avenues.
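The abstract does not enumerate the adaptation methods compared in the clinical case study, but a common baseline in this line of work is domain-adaptive pretraining: continuing a benchmark-trained encoder's masked-language-modeling training on unlabeled in-domain text before task fine-tuning. Below is a minimal sketch of that technique using HuggingFace Transformers; the model name, the clinical_notes.txt file, and all hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
# A minimal sketch of domain-adaptive pretraining on clinical narratives.
# Assumptions: a generic benchmark-trained encoder ("bert-base-uncased")
# and a hypothetical file "clinical_notes.txt" with one narrative per line.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"  # placeholder; any benchmark-trained encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Unlabeled in-domain text (hypothetical path).
dataset = load_dataset("text", data_files={"train": "clinical_notes.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens for the masked-LM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="dapt-clinical",  # illustrative output path
        num_train_epochs=1,
        per_device_train_batch_size=8,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

After this continued-pretraining step, the adapted encoder would be fine-tuned on labeled target-task data and its performance compared against the unadapted benchmark-trained model, which is the kind of comparison the case study describes.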

Related research

Language Model is All You Need: Natural Language Understanding as Question Answering (11/05/2020)
Different flavors of transfer learning have shown tremendous impact in a...

X-METRA-ADA: Cross-lingual Meta-Transfer Learning Adaptation to Natural Language Understanding and Question Answering (04/20/2021)
Multilingual models, such as M-BERT and XLM-R, have gained increasing po...

SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems (05/02/2019)
In the last year, new models and methods for pretraining and transfer le...

Concept Transfer Learning for Adaptive Language Understanding (06/03/2017)
Semantic transfer is an important problem of the language understanding ...

Less is More: Summary of Long Instructions is Better for Program Synthesis (03/16/2022)
Despite the success of large pre-trained language models (LMs) such as C...

Dropping Networks for Transfer Learning (04/23/2018)
In natural language understanding, many challenges require learning rela...

"Are you sure?": Preliminary Insights from Scaling Product Comparisons to Multiple Shops (07/07/2021)
Large eCommerce players introduced comparison tables as a new type of re...
