Multi-task Domain Adaptation for Sequence Tagging

08/09/2016
by Nanyun Peng, et al.

Many domain adaptation approaches rely on learning cross-domain shared representations to transfer knowledge learned in one domain to other domains. Traditional domain adaptation, however, considers adaptation for only one task at a time. In this paper, we explore multi-task representation learning in the domain adaptation setting. We propose a neural network framework that supports domain adaptation for multiple tasks simultaneously and learns shared representations that generalize better across domains. We apply the proposed framework to domain adaptation for sequence tagging, considering two tasks: Chinese word segmentation and named entity recognition. Experiments show that multi-task domain adaptation outperforms disjoint domain adaptation on each task and achieves state-of-the-art results for both tasks in the social media domain.
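The abstract describes learning representations shared across tasks and domains. The sketch below illustrates that idea in PyTorch: a character-level BiLSTM encoder shared by both sequence tagging tasks, with a separate linear tagging head per task, trained on alternating mini-batches so that gradients from both tasks update the shared parameters. All module names, tag inventories, and hyperparameters here are illustrative assumptions, not the paper's exact architecture (which may, for instance, use CRF output layers and domain-specific components).

```python
# Minimal sketch of multi-task representation sharing for sequence tagging.
# Assumptions (not from the paper): a shared character BiLSTM encoder,
# one softmax tagging head per task, and hard parameter sharing.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Character embeddings + BiLSTM shared across tasks and domains."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            batch_first=True, bidirectional=True)

    def forward(self, char_ids):                  # (batch, seq_len)
        out, _ = self.lstm(self.embed(char_ids))  # (batch, seq_len, hidden)
        return out

class MultiTaskTagger(nn.Module):
    """One shared encoder, one linear tagging head per task."""
    def __init__(self, vocab_size, task_num_tags):
        super().__init__()
        self.encoder = SharedEncoder(vocab_size)
        self.heads = nn.ModuleDict({
            task: nn.Linear(128, n_tags)
            for task, n_tags in task_num_tags.items()
        })

    def forward(self, char_ids, task):
        return self.heads[task](self.encoder(char_ids))  # tag logits

# Hypothetical tag inventories: BIES (4 tags) for Chinese word
# segmentation, a 9-tag BIO scheme for named entity recognition.
model = MultiTaskTagger(vocab_size=5000,
                        task_num_tags={"cws": 4, "ner": 9})
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Alternate mini-batches between tasks (random tensors stand in for
# real data): both tasks' losses backpropagate into the shared encoder,
# which is what lets the learned representation transfer.
for task, chars, tags in [
    ("cws", torch.randint(0, 5000, (8, 20)), torch.randint(0, 4, (8, 20))),
    ("ner", torch.randint(0, 5000, (8, 20)), torch.randint(0, 9, (8, 20))),
]:
    logits = model(chars, task)
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), tags.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

One head per (task, domain) pair, or partially shared layers, would fit the same skeleton; the essential point is that the encoder receives gradients from every task and domain it serves.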


Related research

10/15/2018 · Neural Adaptation Layers for Cross-domain Named Entity Recognition
Recent research efforts have shown that neural architectures can be effe...

06/08/2021 · Predicting the Success of Domain Adaptation in Text Similarity
Transfer learning methods, and in particular domain adaptation, help exp...

05/30/2017 · Joint auto-encoders: a flexible multi-task learning framework
The incorporation of prior knowledge into learning is essential in achie...

08/11/2019 · UM-Adapt: Unsupervised Multi-Task Adaptation Using Adversarial Cross-Task Distillation
Aiming towards human-level generalization, there is a need to explore ad...

10/15/2021 · Crisis Domain Adaptation Using Sequence-to-sequence Transformers
User-generated content (UGC) on social media can act as a key source of ...

11/24/2017 · Cross-Domain Self-supervised Multi-task Feature Learning using Synthetic Imagery
In human learning, it is common to use multiple sources of information j...

11/24/2021 · Temporal Effects on Pre-trained Models for Language Processing Tasks
Keeping the performance of language technologies optimal as time passes ...
