Dual Inference for Improving Language Understanding and Generation

10/08/2020
by Yung-Sung Chuang et al.

Natural language understanding (NLU) and natural language generation (NLG) hold a strong dual relationship: NLU predicts semantic labels from natural language utterances, while NLG does the opposite. Prior work mainly exploited this duality during model training to obtain better-performing models. However, given the fast-growing scale of models in NLP, retraining entire NLU and NLG models is sometimes infeasible. To address this issue, this paper proposes leveraging the duality at the inference stage, with no retraining required. Experiments on three benchmark datasets demonstrate the effectiveness of the proposed method for both NLU and NLG, showing strong potential for practical use.
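The abstract does not spell out the mechanism, but inference-time duality methods of this kind typically rerank the n-best candidates of one model (say, NLU) with the reverse-direction likelihood of its dual model (NLG), and vice versa. Below is a minimal sketch of that reranking idea; `dual_inference_rerank`, `primal_score`, `dual_score`, and `alpha` are all hypothetical names, and the interpolation form is an assumption rather than the paper's exact formulation.

```python
def dual_inference_rerank(candidates, primal_score, dual_score, alpha=0.5):
    """Rerank primal-task candidates with the dual task's model.

    All names here are illustrative, not from the paper:
      candidates   -- n-best outputs of the primal model (e.g. semantic
                      frames for NLU, utterances for NLG)
      primal_score -- fn(y) -> log p(y | x) under the primal model
      dual_score   -- fn(y) -> log p(x | y) under a pretrained dual model
      alpha        -- interpolation weight between the two directions
    """
    def combined(y):
        # Score each candidate in both directions of the duality;
        # this only needs extra forward passes, no retraining.
        return alpha * primal_score(y) + (1.0 - alpha) * dual_score(y)

    return max(candidates, key=combined)


# Toy usage with stand-in log-probabilities (values are made up):
cands = ["inform(price=cheap)", "request(area)"]
best = dual_inference_rerank(
    cands,
    primal_score=lambda y: {"inform(price=cheap)": -1.2, "request(area)": -2.5}[y],
    dual_score=lambda y: {"inform(price=cheap)": -0.8, "request(area)": -0.4}[y],
)
print(best)  # -> "inform(price=cheap)"
```

The design point this illustrates is that the two pretrained models stay frozen; only the candidate selection rule changes at inference time.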

Related research:

05/15/2019
Dual Supervised Learning for Natural Language Understanding and Generation
Natural language understanding (NLU) and natural language generation (NL...

04/30/2020
Towards Unsupervised Language Understanding and Generation by Joint Dual Learning
In modular dialogue systems, natural language understanding (NLU) and na...

04/07/2020
KorNLI and KorSTS: New Benchmark Datasets for Korean Natural Language Understanding
Natural language inference (NLI) and semantic textual similarity (STS) a...

06/03/2019
Jointly Learning Semantic Parser and Natural Language Generator via Dual Information Maximization
Semantic parsing aims to transform natural language (NL) utterances into...

10/11/2021
Calibrate your listeners! Robust communication-based training for pragmatic speakers
To be good conversational partners, natural language processing (NLP) sy...