Recurrent Neural Networks in Linguistic Theory: Revisiting Pinker and Prince (1988) and the Past Tense Debate

07/12/2018
by Christo Kirov, et al.

Can advances in NLP help advance cognitive modeling? We examine the role of artificial neural networks, the current state of the art in many common NLP tasks, by returning to a classic case study. In 1986, Rumelhart and McClelland famously introduced a neural architecture that learned to transduce English verb stems to their past tense forms. Shortly thereafter, Pinker and Prince (1988) presented a comprehensive rebuttal of many of Rumelhart and McClelland's claims. Much of the force of their attack centered on the empirical inadequacy of the Rumelhart and McClelland (1986) model. Today, however, that model is severely outmoded. We show that the Encoder-Decoder network architectures used in modern NLP systems obviate most of Pinker and Prince's criticisms without requiring any simplification of the past tense mapping problem. We suggest that the empirical performance of modern networks warrants a re-examination of their utility in linguistic and cognitive modeling.
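As a concrete illustration of the kind of model the abstract refers to, here is a minimal character-level encoder-decoder sketch in PyTorch that learns stem-to-past-tense mappings. This is not the authors' implementation; the toy verb list, network sizes, and training schedule are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy training pairs: regular and irregular English past tenses (illustrative).
pairs = [("walk", "walked"), ("jump", "jumped"), ("play", "played"),
         ("sing", "sang"), ("ring", "rang"), ("go", "went")]

PAD, SOS, EOS = 0, 1, 2
chars = sorted({c for s, t in pairs for c in s + t})
stoi = {c: i + 3 for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
V = len(stoi) + 3

def ids(word):
    return [stoi[c] for c in word]

class Seq2Seq(nn.Module):
    """Character-level GRU encoder-decoder."""
    def __init__(self, vocab, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, hidden)
        self.enc = nn.GRU(hidden, hidden, batch_first=True)
        self.dec = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, src, tgt_in):
        _, h = self.enc(self.emb(src))        # summarize the stem
        d, _ = self.dec(self.emb(tgt_in), h)  # teacher-forced decoding
        return self.out(d)

model = Seq2Seq(V)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(300):
    for stem, past in pairs:
        src = torch.tensor([ids(stem)])
        tgt = torch.tensor([[SOS] + ids(past) + [EOS]])
        logits = model(src, tgt[:, :-1])      # predict each next character
        loss = loss_fn(logits[0], tgt[0, 1:])
        opt.zero_grad(); loss.backward(); opt.step()

@torch.no_grad()
def inflect(stem, max_len=12):
    """Greedy decoding: encode the stem, emit characters until EOS."""
    _, h = model.enc(model.emb(torch.tensor([ids(stem)])))
    tok, out = torch.tensor([[SOS]]), []
    for _ in range(max_len):
        d, h = model.dec(model.emb(tok), h)
        tok = model.out(d).argmax(-1)
        if tok.item() < 3:                    # PAD/SOS/EOS: stop decoding
            break
        out.append(itos[tok.item()])
    return "".join(out)

print(inflect("walk"))  # expected "walked" after training on the toy set
```

Note that a single mechanism here handles both the regular -ed pattern and irregulars like sing/sang, which is precisely the property the past tense debate turns on.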



Related research:

06/04/2019 · Are we there yet? Encoder-decoder neural networks as cognitive models of English past tense inflection
The cognitive mechanisms needed to account for the English past tense ha...

05/18/2020 · Inflecting when there's no majority: Limitations of encoder-decoder neural networks as cognitive models for German plurals
Can artificial neural networks learn to represent inflectional morpholog...

09/18/2019 · Memory-Augmented Neural Networks for Machine Translation
Memory-augmented neural networks (MANNs) have been shown to outperform o...

10/18/2018 · Analyzing and Interpreting Convolutional Neural Networks in NLP
Convolutional neural networks have been successfully applied to various ...

03/29/2022 · Visualizing the Relationship Between Encoded Linguistic Information and Task Performance
Probing is popular to analyze whether linguistic information can be capt...

05/13/2020 · Machine Reading Comprehension: The Role of Contextualized Language Models and Beyond
Machine reading comprehension (MRC) aims to teach machines to read and c...

07/04/2015 · Modeling the Mind: A brief review
The brain is a powerful tool used to achieve amazing feats. There have b...