
PERL: Pivot-based Domain Adaptation for Pre-trained Deep Contextualized Embedding Models

06/16/2020
by Eyal Ben-David, et al.

Pivot-based neural representation models have led to significant progress in domain adaptation for NLP. However, previous work that follows this approach uses only labeled data from the source domain and unlabeled data from the source and target domains, neglecting the massive unlabeled corpora that are not necessarily drawn from these domains. To alleviate this, we propose PERL: a representation learning model that extends contextualized word embedding models such as BERT with pivot-based fine-tuning. PERL outperforms strong baselines across 22 sentiment classification domain adaptation setups, improves in-domain model performance, yields effective reduced-size models, and increases model stability.
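Since the abstract only sketches the method, the following is a minimal, hedged illustration of what pivot-based fine-tuning of BERT can look like: mask tokens in unlabeled source and target text, then predict which pivot (if any) was masked. The toy corpora, the frequency-only pivot selection (the paper also uses mutual information with source-domain labels), and all hyper-parameters below are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch of pivot-based masked fine-tuning over BERT, in the spirit of PERL.
# Everything domain-specific here (corpora, pivot criterion, learning rate)
# is an assumption for illustration.
import torch
import torch.nn as nn
from collections import Counter
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")

# Toy unlabeled corpora; in practice these are large review collections.
source_texts = ["the plot was great", "terrible acting and a dull plot"]
target_texts = ["great battery life", "the screen is terrible"]

# 1) Choose pivots: tokens frequent in BOTH domains (simplified criterion).
def token_counts(texts):
    counts = Counter()
    for text in texts:
        counts.update(tokenizer.tokenize(text))
    return counts

src_counts, tgt_counts = token_counts(source_texts), token_counts(target_texts)
pivots = sorted(tok for tok in src_counts if tok in tgt_counts)
pivot2idx = {tok: i for i, tok in enumerate(pivots)}
NON_PIVOT = len(pivots)  # extra class for masked tokens that are not pivots

# 2) Classification head over the pivot set (+1 "not a pivot" class).
head = nn.Linear(encoder.config.hidden_size, len(pivots) + 1)
optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(head.parameters()), lr=2e-5
)
loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

# 3) One fine-tuning step: mask ~15% of non-special tokens, predict pivots.
def step(texts):
    enc = tokenizer(texts, padding=True, return_tensors="pt",
                    return_special_tokens_mask=True)
    ids = enc["input_ids"].clone()
    labels = torch.full_like(ids, -100)  # -100 = position not masked
    maskable = (enc["attention_mask"] == 1) & (enc["special_tokens_mask"] == 0)
    chosen = (torch.rand(ids.shape) < 0.15) & maskable
    for i, j in chosen.nonzero():
        token = tokenizer.convert_ids_to_tokens(ids[i, j].item())
        labels[i, j] = pivot2idx.get(token, NON_PIVOT)
        ids[i, j] = tokenizer.mask_token_id
    hidden = encoder(input_ids=ids,
                     attention_mask=enc["attention_mask"]).last_hidden_state
    logits = head(hidden)
    loss = loss_fn(logits.view(-1, logits.size(-1)), labels.view(-1))
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    return loss.item()

print("pivot-MLM loss:", step(source_texts + target_texts))
```

After this fine-tuning stage, the encoder (not the pivot head) would typically be reused as the feature extractor for a downstream sentiment classifier trained on source-domain labels.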

Related Research
04/30/2020

Vocabulary Adaptation for Distant Domain Adaptation in Neural Machine Translation

Neural machine translation (NMT) models do not work well in domains diff...
09/26/2014

Unsupervised Domain Adaptation by Backpropagation

Top-performing deep architectures are trained on massive amounts of labe...
10/06/2022

Improving the Sample Efficiency of Prompt Tuning with Domain Adaptation

Prompt tuning, or the conditioning of a frozen pretrained language model...
10/05/2016

Neural Structural Correspondence Learning for Domain Adaptation

Domain adaptation, adapting models from domains rich in labeled training...
04/26/2022

Modular Domain Adaptation

Off-the-shelf models are widely used by computational social science res...
06/20/2022

Domain-Adaptive Text Classification with Structured Knowledge from Unlabeled Data

Domain adaptive text classification is a challenging problem for the lar...