Multi-task learning to improve natural language understanding

12/17/2018
by Stefan Constantin, et al.

Recent advancements in sequence-to-sequence neural network architectures have led to improved natural language understanding. When building a neural network-based natural language understanding component, one main challenge is collecting enough training data. Generating a synthetic dataset is an inexpensive and quick way to collect data, but because such data often has less variety than real natural language, neural networks trained on it frequently fail to generalize to unseen utterances at test time. In this work, we address this challenge with multi-task learning: we train on out-of-domain real data alongside in-domain synthetic data to improve natural language understanding. We evaluate this approach in the airline travel information domain with two synthetic datasets; as out-of-domain real data, we test two datasets based on the subtitles of movies and series. Using an attention-based encoder-decoder model, we were able to improve the F1-score over strong baselines from 80.76 to 84.98.
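The training scheme described above pairs an in-domain synthetic dataset with an out-of-domain real dataset and updates a single shared sequence-to-sequence model on both. Below is a minimal PyTorch sketch of that idea, not the authors' code: the dataset placeholders, model class, and hyperparameters are illustrative assumptions, and a plain GRU encoder-decoder stands in for the attention-based model used in the paper.

```python
# Minimal multi-task learning sketch (hard parameter sharing): alternate
# batches from an in-domain synthetic NLU dataset and an out-of-domain real
# dataset while updating one shared encoder-decoder. All names and sizes
# below are illustrative assumptions, not the paper's actual setup.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

VOCAB, HIDDEN, PAD = 1000, 256, 0

class Seq2Seq(nn.Module):
    """Tiny GRU encoder-decoder; a stand-in for the attention-based model."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, HIDDEN, padding_idx=PAD)
        self.enc = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.dec = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, src, tgt):
        _, h = self.enc(self.emb(src))           # encode source utterance
        dec_out, _ = self.dec(self.emb(tgt), h)  # decode from encoder state
        return self.out(dec_out)

def make_loader(n, src_len, tgt_len):
    # Random token ids as placeholder data; replace with real datasets
    # (e.g. synthetic ATIS-style queries and subtitle sentences).
    src = torch.randint(1, VOCAB, (n, src_len))
    tgt = torch.randint(1, VOCAB, (n, tgt_len))
    return DataLoader(TensorDataset(src, tgt), batch_size=32, shuffle=True)

in_domain = make_loader(512, 20, 10)   # in-domain synthetic data (placeholder)
out_domain = make_loader(512, 20, 20)  # out-of-domain real data (placeholder)

model = Seq2Seq()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

for epoch in range(3):
    # Each step draws one batch per task; both losses update the same
    # shared parameters, which is the core of this multi-task setup.
    for (src_a, tgt_a), (src_b, tgt_b) in zip(in_domain, out_domain):
        for src, tgt in ((src_a, tgt_a), (src_b, tgt_b)):
            logits = model(src, tgt[:, :-1])
            loss = loss_fn(logits.reshape(-1, VOCAB), tgt[:, 1:].reshape(-1))
            opt.zero_grad()
            loss.backward()
            opt.step()
```

The design choice illustrated here is hard parameter sharing: the out-of-domain real data acts as a regularizer that exposes the shared encoder to more varied natural language than the synthetic in-domain data alone provides.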


