A Generative Model for Joint Natural Language Understanding and Generation

06/12/2020
by Bo-Hsiang Tseng, et al.

Natural language understanding (NLU) and natural language generation (NLG) are two fundamental and related tasks in building task-oriented dialogue systems, with opposite objectives: NLU tackles the transformation from natural language to formal representations, whereas NLG does the reverse. A key to success in either task is parallel training data, which is expensive to obtain at large scale. In this work, we propose a generative model which couples NLU and NLG through a shared latent variable. This approach allows us to explore both the space of natural language and that of formal representations, and facilitates information sharing through the latent space to the eventual benefit of both NLU and NLG. Our model achieves state-of-the-art performance on two dialogue datasets with both flat and tree-structured formal representations. We also show that the model can be trained in a semi-supervised fashion by utilising unlabelled data to boost its performance.
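The core idea of the abstract can be sketched in a few lines: a single latent variable z is sampled from a prior, and two decoders map it into the two coupled spaces, one producing natural language (the NLG direction) and one producing the formal representation (the NLU direction). The toy dimensions, random linear decoders, and softmax outputs below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8
VOCAB_NL = 20   # toy natural-language vocabulary size (assumed)
VOCAB_MR = 10   # toy formal-representation vocabulary size (assumed)

# Hypothetical decoder weights mapping the shared latent to each space.
W_nl = rng.normal(size=(LATENT_DIM, VOCAB_NL))
W_mr = rng.normal(size=(LATENT_DIM, VOCAB_MR))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def generate(z):
    """Decode one shared latent vector into token distributions
    over both the natural-language and formal-representation spaces."""
    p_nl = softmax(z @ W_nl)   # NLG side: latent -> natural language
    p_mr = softmax(z @ W_mr)   # NLU side: latent -> formal representation
    return p_nl, p_mr

z = rng.normal(size=LATENT_DIM)   # sample from the latent prior N(0, I)
p_nl, p_mr = generate(z)
```

Because both decoders condition on the same z, information learned from supervision on one side can shape the latent space used by the other, which is what makes semi-supervised training with unlabelled data from either space possible in the full model.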


09/28/2020
DialoGLUE: A Natural Language Understanding Benchmark for Task-Oriented Dialogue
A long-standing goal of task-oriented dialogue research is the ability t...

11/23/2018
Natural language understanding for task oriented dialog in the biomedical domain in a low resources context
In the biomedical domain, the lack of sharable datasets often limit the ...

04/30/2020
Towards Unsupervised Language Understanding and Generation by Joint Dual Learning
In modular dialogue systems, natural language understanding (NLU) and na...

09/29/2019
Semi-Supervised Neural Text Generation by Joint Learning of Natural Language Generation and Natural Language Understanding Models
In Natural Language Generation (NLG), End-to-End (E2E) systems trained t...

06/05/2020
Accelerating Natural Language Understanding in Task-Oriented Dialog
Task-oriented dialog models typically leverage complex neural architectu...

06/03/2019
Jointly Learning Semantic Parser and Natural Language Generator via Dual Information Maximization
Semantic parsing aims to transform natural language (NL) utterances into...

07/30/2021
Perceiver IO: A General Architecture for Structured Inputs & Outputs
The recently-proposed Perceiver model obtains good results on several do...