Char2char Generation with Reranking for the E2E NLG Challenge

11/04/2018
by   Shubham Agarwal, et al.

This paper describes our submission to the E2E NLG Challenge. Recently, neural seq2seq approaches have become mainstream in NLG; to handle rare words, they often resort to word-level pre- and post-processing steps (delexicalization and relexicalization, respectively). By contrast, we train a simple character-level seq2seq model, which requires no pre/post-processing (no delexicalization, tokenization, or even lowercasing), with surprisingly good results. For further improvement, we explore two reranking approaches for scoring candidates. We also introduce a synthetic dataset creation procedure, which opens up a new way of creating artificial datasets for Natural Language Generation.
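The "no pre/post-processing" claim can be made concrete with a minimal sketch (an illustration, not the authors' code): a character-level model consumes the raw meaning representation and reference text directly, so the only preparation needed is a lossless mapping between characters and integer IDs, with no delexicalization, tokenization, or lowercasing. The sample MR and reference strings below are hypothetical E2E-style examples.

```python
# Minimal sketch of character-level input/output encoding, as used by a
# char2char seq2seq model: raw strings are mapped to character IDs and back
# with no delexicalization, tokenization, or lowercasing.

def build_char_vocab(texts):
    """Map every distinct character (case-sensitive) to an integer ID."""
    chars = sorted({ch for text in texts for ch in text})
    return {ch: i for i, ch in enumerate(chars)}

def encode(text, vocab):
    """Turn a raw string into a sequence of character IDs."""
    return [vocab[ch] for ch in text]

def decode(ids, vocab):
    """Invert encode(): recover the exact original string."""
    inv = {i: ch for ch, i in vocab.items()}
    return "".join(inv[i] for i in ids)

# Hypothetical E2E-style meaning representation and reference, kept verbatim.
mr = "name[The Eagle], eatType[coffee shop]"
ref = "The Eagle is a coffee shop."

vocab = build_char_vocab([mr, ref])
assert decode(encode(mr, vocab), vocab) == mr    # lossless round trip
assert vocab["T"] != vocab["t"]                  # case is preserved, not folded
```

Because the mapping is lossless in both directions, rare words such as restaurant names need no special placeholder handling: they are just character sequences like any other.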
