Word Ordering Without Syntax

04/28/2016
by Allen Schmaltz, et al.

Recent work on word ordering has argued that syntactic structure is important, or even required, for effectively recovering the order of a sentence. We find that, in fact, an n-gram language model with a simple heuristic gives strong results on this task. Furthermore, we show that a long short-term memory (LSTM) language model is even more effective at recovering order, with our basic model outperforming a state-of-the-art syntactic model by 11.5 BLEU points. Additional data and larger beams yield further gains, at the expense of training and search time.
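The approach described above can be pictured as search: treat word ordering as finding the highest-scoring arrangement of a bag of words, scoring each partial sequence left to right with a language model and keeping only the best prefixes at each step (beam search). The sketch below is illustrative only and not the authors' implementation: `BIGRAM` and `step_score` are hypothetical stand-ins for log-probabilities from a trained n-gram or LSTM language model, and the paper's additional heuristic is not reproduced here.

```python
# Word ordering as beam search over a bag of words, scored left to right
# by a language model. Illustrative sketch: `BIGRAM` and `step_score`
# are hypothetical stand-ins for a trained n-gram or LSTM LM.
import heapq
from collections import Counter

# Toy bigram log-probabilities; unseen pairs receive a flat penalty.
BIGRAM = {
    ("<s>", "the"): -0.5,
    ("the", "cat"): -1.0,
    ("cat", "sat"): -1.2,
    ("sat", "down"): -1.5,
}

def step_score(prefix, word):
    """Log-probability of `word` given the ordered prefix (stand-in LM)."""
    prev = prefix[-1] if prefix else "<s>"
    return BIGRAM.get((prev, word), -5.0)

def order_words(bag, beam_size=8):
    """Return the highest-scoring ordering of `bag` found by beam search."""
    # Each hypothesis: (score, ordered prefix, multiset of unused words).
    beams = [(0.0, (), Counter(bag))]
    for _ in range(len(bag)):
        candidates = []
        for score, prefix, remaining in beams:
            for word in remaining:  # expand with every unused word type
                rest = remaining.copy()
                rest[word] -= 1
                if rest[word] == 0:
                    del rest[word]
                candidates.append((score + step_score(prefix, word),
                                   prefix + (word,), rest))
        # Keep only the top-scoring partial orderings.
        beams = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
    return max(beams, key=lambda b: b[0])[1]

if __name__ == "__main__":
    print(order_words(["sat", "the", "down", "cat"]))
    # -> ('the', 'cat', 'sat', 'down')
```

With a real model, `step_score` would query conditional log-probabilities from the trained LM; the beam size then trades output quality against the search time mentioned in the abstract.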
