Natural Language Generation by Hierarchical Decoding with Linguistic Patterns

08/08/2018
by Shang-Yu Su, et al.

Natural language generation (NLG) is a critical component of spoken dialogue systems. Classic NLG is divided into two phases: (1) sentence planning, which decides the overall sentence structure, and (2) surface realization, which determines specific word forms and flattens the sentence structure into a string. Many simple NLG models are based on recurrent neural networks (RNNs) and the sequence-to-sequence (seq2seq) framework, which consists of an encoder-decoder structure; these models generate sentences from scratch by jointly optimizing sentence planning and surface realization under a simple cross-entropy training criterion. However, this simple encoder-decoder architecture often struggles to generate long, complex sentences, because the decoder must learn all grammatical and lexical knowledge at once. This paper introduces a hierarchical decoding NLG model based on linguistic patterns at different levels, and shows that the proposed method outperforms the traditional approach with a smaller model size. Furthermore, the hierarchical decoding design is flexible and easily extensible to various NLG systems.
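The core idea of decoding at different linguistic levels can be illustrated with a toy sketch: content words are generated first, then modifiers, then function words, so that each stage conditions on the output of the previous one. Note that this is a minimal illustration of the general idea only; the paper's model uses trained RNN decoders at each level, and the function names, meaning-representation format, and level ordering below are assumptions for demonstration.

```python
# Toy sketch of hierarchical decoding by linguistic-pattern levels.
# Each "decoder" here is a hypothetical rule-based stand-in for a
# trained RNN decoder; the MR (meaning representation) format is assumed.

def decode_content(mr):
    # Level 1: core content words (subject, verb, object).
    return [mr["subject"], mr["verb"], mr["object"]]

def add_modifiers(tokens, mr):
    # Level 2: insert adjectives/adverbs, conditioned on level-1 output.
    out = []
    for tok in tokens:
        if tok == mr["object"] and "modifier" in mr:
            out.append(mr["modifier"])
        out.append(tok)
    return out

def add_function_words(tokens):
    # Level 3: add function words and flatten into a surface string.
    return ["the"] + tokens

mr = {"subject": "restaurant", "verb": "serves",
      "object": "food", "modifier": "cheap"}

stage1 = decode_content(mr)         # ['restaurant', 'serves', 'food']
stage2 = add_modifiers(stage1, mr)  # ['restaurant', 'serves', 'cheap', 'food']
sentence = " ".join(add_function_words(stage2))
print(sentence)                     # the restaurant serves cheap food
```

Because each level only has to learn one kind of linguistic knowledge, the per-level decoders can be smaller than a single decoder that must master grammar and diction jointly, which is the intuition behind the paper's reduced model size.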

