A Comparison of Neural Models for Word Ordering

08/05/2017
by Eva Hasler, et al.

We compare several language models on the word-ordering task and propose a new bag-to-sequence neural model based on attention-based sequence-to-sequence models. On a large German WMT data set, the model significantly outperforms existing models. We also describe a novel search strategy for LM-based word ordering and report results on the English Penn Treebank. Our best model setup outperforms prior work in both speed and quality.
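The paper's details are not reproduced on this page, but the core idea named in the abstract, decoding an unordered bag of words with an attention-based decoder, can be sketched briefly. The PyTorch code below is a hypothetical illustration, not the authors' implementation: the input bag is embedded without positional information, and a GRU decoder attends over it at each step to predict the next output word. The module name `BagToSeqDecoder`, the dimensions, and the additive attention form are all assumptions made for the sketch.

```python
import torch
import torch.nn as nn

class BagToSeqDecoder(nn.Module):
    """Hypothetical bag-to-sequence decoder sketch: attends over an
    unordered bag of word embeddings (no positions on the input side)
    and predicts the next word of the ordered output."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # decoder cell consumes the previous word plus attention context
        self.rnn = nn.GRUCell(2 * emb_dim, hid_dim)
        # additive attention score over bag items, conditioned on state
        self.attn = nn.Linear(hid_dim + emb_dim, 1)
        self.out = nn.Linear(hid_dim, vocab_size)

    def step(self, bag_ids, prev_id, hidden):
        bag = self.embed(bag_ids)            # (batch, n, emb)
        prev = self.embed(prev_id)           # (batch, emb)
        h = hidden.unsqueeze(1).expand(-1, bag.size(1), -1)
        scores = self.attn(torch.cat([h, bag], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)          # (batch, n)
        context = (weights.unsqueeze(-1) * bag).sum(dim=1)
        hidden = self.rnn(torch.cat([prev, context], dim=-1), hidden)
        return self.out(hidden), hidden      # vocab logits, new state

# toy usage: one decoding step over a 5-word bag
dec = BagToSeqDecoder(vocab_size=1000)
bag = torch.randint(0, 1000, (1, 5))         # unordered input words
prev = torch.zeros(1, dtype=torch.long)      # assume id 0 is <bos>
hidden = torch.zeros(1, 256)
logits, hidden = dec.step(bag, prev, hidden)
```

In the actual task, decoding would additionally be constrained so that only words still remaining in the bag can be emitted; the sketch shows a single unconstrained step for clarity.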


Related research

04/15/2022 · On the Role of Pre-trained Language Models in Word Ordering: A Case Study with BART
Word ordering is a constrained language generation task taking unordered...

06/09/2016 · Sequence-to-Sequence Learning as Beam-Search Optimization
Sequence-to-Sequence (seq2seq) modeling has rapidly become an important ...

06/01/2020 · Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English
We conduct in this work an evaluation study comparing offline and online...

08/28/2018 · The University of Cambridge's Machine Translation Systems for WMT18
The University of Cambridge submission to the WMT18 news translation tas...

11/14/2017 · Classical Structured Prediction Losses for Sequence to Sequence Learning
There has been much recent work on training neural attention models at t...

09/12/2023 · Learning to Predict Concept Ordering for Common Sense Generation
Prior work has shown that the ordering in which concepts are shown to a ...

09/10/2021 · Studying word order through iterative shuffling
As neural language models approach human performance on NLP benchmark ta...
