Finding the Answers with Definition Models

09/01/2018
by Jack Parry, et al.

Inspired by a previous attempt to answer crossword questions using neural networks (Hill, Cho, Korhonen, & Bengio, 2015), this dissertation implements extensions that improve the performance of the existing definition model on the task of answering crossword questions. A discussion and evaluation of the original implementation identifies several ways in which the recurrent neural model could be extended, and insights from the related fields of neural language modeling and neural machine translation provide the justification and means for these extensions. Two extensions are applied to the LSTM encoder: first, averaging the LSTM states across the sequence, and second, using a bidirectional LSTM. Both improve model performance on the definitions and crossword test sets. To improve performance on crossword questions specifically, the training data is augmented with crossword questions and answers; this improves results on definitions as well as on crossword questions. The final experiments employ sub-word unit segmentation, first on the source side, followed by preliminary experiments to facilitate character-level output. Initially, an exact reproduction of the baseline results proves unsuccessful. Despite this, the extensions improve performance, allowing the definition model to surpass the recurrent neural network variants of the previous work (Hill et al., 2015).
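As an informal illustration of the two encoder extensions described above, the sketch below shows a bidirectional LSTM whose hidden states are averaged across the input sequence to produce a fixed-size definition embedding. This is a minimal PyTorch sketch: the module name, dimensions, and usage are hypothetical, not the dissertation's actual implementation.

import torch
import torch.nn as nn

class DefinitionEncoder(nn.Module):
    # Hypothetical encoder combining both extensions: a bidirectional
    # LSTM whose per-step states are mean-pooled across the sequence,
    # rather than using only the final hidden state.
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor of word indices
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        states, _ = self.lstm(embedded)        # (batch, seq_len, 2 * hidden_dim)
        # Extension 1: average the states over time. Extension 2: the
        # LSTM above is bidirectional, so each state sees both directions.
        return states.mean(dim=1)              # (batch, 2 * hidden_dim)

# Example: encode a dummy 12-token clue; in a definition model the
# resulting vector would be compared against answer-word embeddings.
encoder = DefinitionEncoder(vocab_size=50000)
clue = torch.randint(0, 50000, (1, 12))
print(encoder(clue).shape)   # torch.Size([1, 1024])

One common motivation for mean-pooling over using the final state alone is that gradients reach every time step directly, so early words in a long definition are not forgotten.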

