Adapt𝒪r: Objective-Centric Adaptation Framework for Language Models

03/08/2022
by Michal Štefánik, et al.

Progress in natural language processing research is catalyzed by the capabilities that widely adopted software frameworks provide. This paper introduces the Adaptor library, which transposes the traditional model-centric approach, composed of pre-training and fine-tuning steps, into an objective-centric approach that composes the training process from applications of selected objectives. We survey research directions that can benefit from enhanced objective-centric experimentation, including multitask training, custom objective development, dynamic training curricula, and domain adaptation. Adaptor aims to ease the reproducibility of these research directions in practice. Finally, we demonstrate the practical applicability of Adaptor in selected unsupervised domain adaptation scenarios.
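
To make the objective-centric idea concrete, here is a minimal sketch of the composition pattern the abstract describes: each objective bundles its own data and loss, and a schedule interleaves objectives over one shared model. The class names, the round-robin schedule, and the toy linear model are illustrative assumptions for this sketch only; they are not the Adaptor library's actual API.

```python
# Illustrative sketch of objective-centric training (not the Adaptor API):
# each Objective carries its own data stream and loss, and a schedule
# decides which objective supplies the loss at each training step.
import torch
from torch import nn
from itertools import cycle


class Objective:
    """Bundles a data stream with a loss computed on the shared model."""

    def __init__(self, name, inputs, targets, loss_fn):
        self.name = name
        self.batches = cycle(zip(inputs.split(8), targets.split(8)))  # toy batching
        self.loss_fn = loss_fn

    def compute_loss(self, model):
        x, y = next(self.batches)
        return self.loss_fn(model(x), y)


def round_robin_schedule(objectives, total_steps):
    """Yield one objective per step, alternating between the given objectives."""
    pool = cycle(objectives)
    for _ in range(total_steps):
        yield next(pool)


def train(model, objectives, total_steps=100, lr=1e-2):
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    for step, objective in enumerate(round_robin_schedule(objectives, total_steps)):
        optimizer.zero_grad()
        loss = objective.compute_loss(model)  # the active objective defines the loss
        loss.backward()
        optimizer.step()
        if step % 20 == 0:
            print(f"step {step:3d}  objective={objective.name:12s}  loss={loss.item():.4f}")


if __name__ == "__main__":
    torch.manual_seed(0)
    shared_model = nn.Linear(4, 1)  # stands in for a shared language model
    x_a, x_b = torch.randn(64, 4), torch.randn(64, 4)
    # Two objectives over the same model, each with its own data and loss:
    objectives = [
        Objective("objective_a", x_a, x_a.sum(dim=1, keepdim=True), nn.MSELoss()),
        Objective("objective_b", x_b, x_b.mean(dim=1, keepdim=True), nn.L1Loss()),
    ]
    train(shared_model, objectives)
```

Swapping the schedule (e.g., sequential instead of round-robin) or the set of objectives changes the training curriculum without touching the model code, which is the kind of experimentation the abstract attributes to the objective-centric design.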

Related research

09/01/2021
DILBERT: Customized Pre-Training for Domain Adaptation with Category Shift, with an Application to Aspect Extraction
The rise of pre-trained language models has yielded substantial progress...

11/01/2021
Unsupervised Domain Adaptation with Adapter
Unsupervised domain adaptation (UDA) with pre-trained language models (P...

01/03/2021
Adversarial Unsupervised Domain Adaptation for Harmonic-Percussive Source Separation
This paper addresses the problem of domain adaptation for the task of mu...

02/10/2023
Key Design Choices for Double-Transfer in Source-Free Unsupervised Domain Adaptation
Fine-tuning and Domain Adaptation emerged as effective strategies for ef...

06/16/2022
Methods for Estimating and Improving Robustness of Language Models
Despite their outstanding performance, large language models (LLMs) suff...

08/04/2022
Vision-Centric BEV Perception: A Survey
Vision-centric BEV perception has recently received increased attention ...

09/14/2020
The Hardware Lottery
Hardware, systems and algorithms research communities have historically ...
