Neural Structural Correspondence Learning for Domain Adaptation

10/05/2016
by Yftah Ziser, et al.

Domain adaptation, adapting models from domains rich in labeled training data to domains poor in such data, is a fundamental NLP challenge. We introduce a neural network model that marries ideas from two prominent strands of research on domain adaptation through representation learning: structural correspondence learning (SCL, (Blitzer et al., 2006)) and autoencoder neural networks. In particular, our model is a three-layer neural network that learns to encode the non-pivot features of an input example into a low-dimensional representation, so that the existence of pivot features (features that are prominent in both domains and convey useful information for the NLP task) in the example can be decoded from that representation. The low-dimensional representation is then employed in a learning algorithm for the task. Moreover, we show how to inject pre-trained word embeddings into our model in order to improve generalization across examples with similar pivot features. On the task of cross-domain product sentiment classification (Blitzer et al., 2007), consisting of 12 domain pairs, our model outperforms both the SCL and the marginalized stacked denoising autoencoder (MSDA, (Chen et al., 2012)) methods by 3.77
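The core idea above can be sketched in a few lines: train a small network that maps non-pivot features to a hidden layer from which pivot-feature presence is predicted, then reuse that hidden layer as the adapted representation. This is a minimal illustrative sketch only, not the paper's implementation; the dimensions, synthetic data, and training loop are all hypothetical assumptions.

```python
import numpy as np

# Illustrative toy setup (dimensions and data are NOT from the paper).
rng = np.random.default_rng(0)
n_nonpivot, n_pivot, hidden = 50, 10, 5
X = (rng.random((200, n_nonpivot)) < 0.1).astype(float)  # non-pivot indicators
# Synthetic coupling: each pivot's presence depends on a few non-pivot features.
W_true = (rng.random((n_nonpivot, n_pivot)) < 0.05).astype(float)
Y = (X @ W_true > 0).astype(float)                       # pivot indicators

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Three-layer network: non-pivot input -> low-dim encoding -> pivot prediction.
W1 = rng.normal(scale=0.1, size=(n_nonpivot, hidden))
W2 = rng.normal(scale=0.1, size=(hidden, n_pivot))
lr = 0.5
for _ in range(300):
    H = sigmoid(X @ W1)              # low-dimensional representation
    P = sigmoid(H @ W2)              # decoded pivot probabilities
    d_out = (P - Y) / len(X)         # cross-entropy gradient at the output
    d_hid = (d_out @ W2.T) * H * (1 - H)
    W2 -= lr * (H.T @ d_out)
    W1 -= lr * (X.T @ d_hid)

# The learned encoding is then fed (e.g., concatenated with the original
# features) into a standard classifier for the downstream task.
encoding = sigmoid(X @ W1)
print(encoding.shape)  # (200, 5)
```

Because the encoding is trained only to predict pivot features, which are by definition shared across domains, it tends to carry domain-invariant information, which is what makes it useful for the target domain.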

