Bootstrapping NLU Models with Multi-task Learning

11/15/2019
by Shubham Kapoor, et al.

Bootstrapping natural language understanding (NLU) systems with minimal training data is a fundamental challenge in extending digital assistants such as Alexa and Siri to a new language. A common approach adopted in digital assistants when responding to a user query is to process the input in a pipeline, first predicting the domain and then inferring the intent and slots. However, this cascaded approach causes error propagation and prevents information sharing among the tasks. Further, using words as the atomic units of meaning, as many studies do, can lead to coverage problems for morphologically rich languages such as German and French when data is limited. We address these issues by introducing a character-level unified neural architecture for joint modeling of domain, intent, and slot classification. We compose word embeddings from characters and jointly optimize all classification tasks via multi-task learning. Our results show that the proposed architecture is an optimal choice for bootstrapping NLU systems in low-resource settings, thus saving time, cost, and human effort.
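The architecture described above can be sketched roughly as follows: word embeddings are composed from character embeddings, a shared encoder runs over the composed words, and three task heads (domain, intent, slot) are trained jointly. This is a minimal illustrative PyTorch sketch; all layer choices, sizes, and names are assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn

class CharJointNLU(nn.Module):
    """Illustrative sketch: character-composed word embeddings feed a
    shared encoder with joint domain, intent, and slot heads."""
    def __init__(self, n_chars, n_domains, n_intents, n_slots,
                 char_dim=16, word_dim=32, hidden=64):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        # Compose each word embedding from its characters with a small BiLSTM.
        self.char_rnn = nn.LSTM(char_dim, word_dim // 2,
                                bidirectional=True, batch_first=True)
        # Shared sentence encoder over the composed word embeddings.
        self.word_rnn = nn.LSTM(word_dim, hidden // 2,
                                bidirectional=True, batch_first=True)
        self.domain_head = nn.Linear(hidden, n_domains)  # per-utterance
        self.intent_head = nn.Linear(hidden, n_intents)  # per-utterance
        self.slot_head = nn.Linear(hidden, n_slots)      # per-token tags

    def forward(self, chars):
        # chars: (batch, n_words, n_chars_per_word) of character ids
        b, w, c = chars.shape
        e = self.char_emb(chars.view(b * w, c))
        _, (h, _) = self.char_rnn(e)                   # final fwd/bwd states
        words = h.transpose(0, 1).reshape(b, w, -1)    # (b, w, word_dim)
        enc, _ = self.word_rnn(words)                  # (b, w, hidden)
        sent = enc.mean(dim=1)                         # utterance vector
        return (self.domain_head(sent),
                self.intent_head(sent),
                self.slot_head(enc))
```

Under multi-task learning, training would minimize the sum of the three cross-entropy losses so that the shared character and word encoders receive gradients from all tasks at once.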


Related research:

- OneNet: Joint Domain, Intent, Slot Prediction for Spoken Language Understanding (01/16/2018)
- From Masked Language Modeling to Translation: Non-English Auxiliary Tasks Improve Zero-shot Spoken Language Understanding (05/15/2021)
- Seq2Seq and Multi-Task Learning for joint intent and content extraction for domain specific interpreters (08/01/2018)
- Domain Adaptation of Recurrent Neural Networks for Natural Language Understanding (04/01/2016)
- A character representation enhanced on-device Intent Classification (01/12/2021)
- Hierarchical Modeling for Out-of-Scope Domain and Intent Classification (04/30/2021)
- Improving historical spelling normalization with bi-directional LSTMs and multi-task learning (10/25/2016)
