Low Resource Multi-Task Sequence Tagging – Revisiting Dynamic Conditional Random Fields

05/01/2020
by Jonas Pfeiffer, et al.

We compare models for low-resource multi-task sequence tagging that leverage dependencies between the label sequences of different tasks. Our analysis targets datasets in which each example carries labels for multiple tasks. Current approaches either train a separate model per task or use standard multi-task learning to learn shared feature representations. Both, however, ignore correlations between label sequences, which can provide important information when training data is scarce. To analyze which scenarios profit from modeling dependencies between labels across tasks, we revisit dynamic conditional random fields (CRFs) and combine them with deep neural networks. We compare single-task, multi-task, and dynamic-CRF setups on three diverse datasets, at both sentence and document level, in English and German low-resource scenarios. We show that including silver labels from pretrained part-of-speech taggers as auxiliary tasks can improve performance on downstream tasks, and we find that, especially in low-resource scenarios, explicitly modeling inter-dependencies between task predictions outperforms both single-task and standard multi-task models.
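To make the core idea concrete, here is a minimal pure-Python sketch of how a dynamic CRF can couple two tasks: the two tag sets are combined into a joint label space (one state per tag pair), and a Viterbi decode over that space lets the transition scores express dependencies between the tasks' labels. All tag names and scores below are invented for illustration; this is a toy decoder, not the authors' implementation.

```python
from itertools import product

# Hypothetical tag sets for two tasks (POS tagging and NP chunking).
pos_tags = ["NOUN", "VERB"]
chunk_tags = ["B-NP", "O"]

# Joint label space: one state per (POS, chunk) pair, as in a dynamic/factorial CRF.
joint = list(product(pos_tags, chunk_tags))

def viterbi(emissions, transitions):
    """Decode the best joint label sequence.
    emissions: list (one dict per token) mapping joint state -> score.
    transitions: dict mapping (prev_state, state) -> score.
    """
    # scores[s] = best score of any path ending in state s; back holds pointers.
    scores = {s: emissions[0][s] for s in joint}
    back = []
    for em in emissions[1:]:
        new_scores, ptr = {}, {}
        for s in joint:
            prev = max(joint, key=lambda p: scores[p] + transitions[(p, s)])
            new_scores[s] = scores[prev] + transitions[(prev, s)] + em[s]
            ptr[s] = prev
        scores = new_scores
        back.append(ptr)
    # Trace back from the best final state.
    best = max(joint, key=lambda s: scores[s])
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# Toy two-token example: emissions favor a NOUN inside a noun phrase followed
# by a VERB outside one; the transition score rewards NOUN -> VERB jointly.
emissions = [
    {s: (2.0 if s == ("NOUN", "B-NP") else 0.0) for s in joint},
    {s: (2.0 if s == ("VERB", "O") else 0.0) for s in joint},
]
transitions = {(p, s): (1.0 if p[0] == "NOUN" and s[0] == "VERB" else 0.0)
               for p in joint for s in joint}
print(viterbi(emissions, transitions))
# -> [('NOUN', 'B-NP'), ('VERB', 'O')]
```

Because the transition table is indexed by pairs of joint states, it can score combinations such as "NOUN inside an NP followed by VERB outside one" that two independent single-task taggers could never coordinate on; the cost is a label space that grows with the product of the tag-set sizes.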

Related research:

- Multi-task Learning for Low-resource Second Language Acquisition Modeling (08/25/2019): Second language acquisition (SLA) modeling is to predict whether second ...
- Factored Latent-Dynamic Conditional Random Fields for Single and Multi-label Sequence Modeling (11/09/2019): Conditional Random Fields (CRF) are frequently applied for labeling and ...
- Efficient Image Gallery Representations at Scale Through Multi-Task Learning (05/18/2020): Image galleries provide a rich source of diverse information about a pro...
- Meta Auxiliary Learning for Low-resource Spoken Language Understanding (06/26/2022): Spoken language understanding (SLU) treats automatic speech recognition ...
- Learning to Multi-Task Learn for Better Neural Machine Translation (01/10/2020): Scarcity of parallel sentence pairs is a major challenge for training hi...
- Improving part-of-speech tagging via multi-task learning and character-level word representations (07/02/2018): In this paper, we explore the ways to improve POS-tagging using various ...
- Constraining Linear-chain CRFs to Regular Languages (06/14/2021): In structured prediction, a major challenge for models is to represent t...
