
PADA: A Prompt-based Autoregressive Approach for Adaptation to Unseen Domains

by   Eyal Ben-David, et al.

Natural Language Processing algorithms have made incredible progress recently, but they still struggle when applied to out-of-distribution examples. In this paper, we address a very challenging and previously underexplored version of this domain adaptation problem. In our setup, an algorithm is trained on several source domains and then applied to examples from an unseen domain that is unknown at training time. In particular, no examples, labeled or unlabeled, and no other knowledge about the target domain are available to the algorithm at training time. We present PADA: a Prompt-based Autoregressive Domain Adaptation algorithm, based on the T5 model. Given a test example, PADA first generates a unique prompt and then, conditioned on this prompt, labels the example with respect to the NLP task. The prompt is a sequence of unrestricted length, consisting of pre-defined Domain Related Features (DRFs) that characterize each of the source domains. Intuitively, the prompt is a unique signature that maps the test example into the semantic space spanned by the source domains. In experiments on two tasks, Rumour Detection and Multi-Genre Natural Language Inference (MNLI), spanning a total of 10 multi-source adaptation scenarios, PADA strongly outperforms state-of-the-art approaches and additional strong baselines.
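The two-stage inference described above can be sketched with toy stand-ins. This is a minimal illustration only: in the actual paper both stages are performed by a fine-tuned T5 model, whereas here a keyword-overlap heuristic stands in for prompt generation and a simple rule stands in for the task classifier; the DRF sets, example text, and labels below are all hypothetical.

```python
# Toy sketch of PADA-style two-stage inference (illustrative only; the
# real system uses a fine-tuned T5 for both stages).

def generate_drf_prompt(example, drf_sets):
    # Stage 1: build a DRF-based prompt for the test example.
    # Here we approximate T5's autoregressive prompt generation by
    # selecting DRFs from the source domains that appear in the example.
    tokens = set(example.lower().split())
    matched = [drf for drfs in drf_sets.values() for drf in sorted(drfs)
               if drf in tokens]
    return " ".join(matched) if matched else "unknown"

def classify(example, prompt):
    # Stage 2: predict the task label conditioned on the generated prompt.
    # A trivial keyword rule stands in for the T5 task head.
    text = f"{prompt}: {example}"
    return "rumour" if "unconfirmed" in text else "non-rumour"

# Hypothetical DRF sets for two source domains from a rumour-detection setup.
drf_sets = {
    "ferguson": {"police", "protest"},
    "ottawa": {"parliament", "shooting"},
}

example = "unconfirmed reports of a shooting near parliament"
prompt = generate_drf_prompt(example, drf_sets)   # e.g. "parliament shooting"
label = classify(example, prompt)                 # "rumour"
```

The key design point this illustrates is that the prompt is computed per test example at inference time, so an example from an unseen domain is mapped onto whichever mixture of source-domain DRFs best characterizes it.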


Example-based Hypernetworks for Out-of-Distribution Generalization

While Natural Language Processing (NLP) algorithms keep reaching unprece...

DoCoGen: Domain Counterfactual Generation for Low Resource Domain Adaptation

Natural language processing (NLP) algorithms have become very successful...

IDANI: Inference-time Domain Adaptation via Neuron-level Interventions

Large pre-trained models are usually fine-tuned on downstream task data,...

Domain Adaptation from Scratch

Natural language processing (NLP) algorithms are rapidly improving but o...

Using Language to Extend to Unseen Domains

It is expensive to collect training data for every possible domain that ...

Tree-Structured Semantic Encoder with Knowledge Sharing for Domain Adaptation in Natural Language Generation

Domain adaptation in natural language generation (NLG) remains challengi...

Model Compression for Domain Adaptation through Causal Effect Estimation

Recent improvements in the predictive quality of natural language proces...

Code Repositories


Official code for the paper "PADA: A Prompt-based Autoregressive Approach for Adaptation to Unseen Domains".
