Language Model is All You Need: Natural Language Understanding as Question Answering

by Mahdi Namazifar, et al.

Different flavors of transfer learning have shown tremendous impact in advancing research and applications of machine learning. In this work we study a specific family of transfer learning in which the target domain is mapped to the source domain. Specifically, we map Natural Language Understanding (NLU) problems to Question Answering (QA) problems, and we show that in low-data regimes this approach offers significant improvements over other approaches to NLU. Moreover, we show that these gains can be increased through sequential transfer learning across NLU problems from different domains. We show that our approach can reduce the amount of data required for the same performance by up to a factor of 10.
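To make the idea of mapping NLU to QA concrete, here is a minimal illustrative sketch (not the paper's actual code): a slot-filling example is recast as an extractive, SQuAD-style QA example, where each slot type is paired with a natural-language question and the slot value becomes the answer span. The slot names and question templates below are assumptions for illustration only.

```python
# Illustrative sketch: recasting NLU slot filling as extractive QA.
# Slot names and question templates are hypothetical examples,
# not taken from the paper itself.

SLOT_QUESTIONS = {
    "destination": "Where does the user want to go?",
    "departure_time": "When does the user want to leave?",
}

def nlu_to_qa(utterance: str, slot: str, slot_value: str) -> dict:
    """Map one (utterance, slot) annotation to a SQuAD-style QA example.

    The utterance becomes the QA context, the slot type selects a
    question template, and the annotated slot value becomes the
    extractive answer span (with its character offset in the context).
    """
    start = utterance.find(slot_value)
    return {
        "context": utterance,
        "question": SLOT_QUESTIONS[slot],
        "answer": {"text": slot_value, "answer_start": start},
    }

example = nlu_to_qa("Book me a flight to Boston", "destination", "Boston")
```

A pretrained extractive QA model can then be fine-tuned on such converted examples, which is one plausible way to exploit QA data as the source domain when in-domain NLU data is scarce.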


DoubleTransfer at MEDIQA 2019: Multi-Source Transfer Learning for Natural Language Understanding in the Medical Domain

This paper describes our competing system to enter the MEDIQA-2019 compe...

Modelling Domain Relationships for Transfer Learning on Retrieval-based Question Answering Systems in E-commerce

In this paper, we study transfer learning for the PI and NLI problems, a...

Adapting to the Long Tail: A Meta-Analysis of Transfer Learning Research for Language Understanding Tasks

Natural language understanding (NLU) has made massive progress driven by...

A Survey on Machine Learning Techniques for Auto Labeling of Video, Audio, and Text Data

Machine learning has been utilized to perform tasks in many different do...

Concept Transfer Learning for Adaptive Language Understanding

Semantic transfer is an important problem of the language understanding ...

Dropping Networks for Transfer Learning

In natural language understanding, many challenges require learning rela...

A Language for Function Signature Representations

Recent work by (Richardson and Kuhn, 2017a,b; Richardson et al., 2018) l...