Recurrent babbling: evaluating the acquisition of grammar from limited input data

10/09/2020
by Ludovica Pannitto, et al.

Recurrent Neural Networks (RNNs) have been shown to capture various aspects of syntax from raw linguistic input. In most previous experiments, however, learning happens over unrealistic corpora, which do not reflect the type and amount of data a child would be exposed to. This paper remedies this state of affairs by training a Long Short-Term Memory network (LSTM) over a realistically sized subset of child-directed input. The behaviour of the network is analysed over time using a novel methodology which consists in quantifying the level of grammatical abstraction in the model's generated output (its "babbling"), compared to the language it has been exposed to. We show that the LSTM indeed abstracts new structures as learning proceeds.
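
To make the methodology concrete, here is a minimal sketch of the two ingredients it relies on: an LSTM language model that can be sampled from to produce "babbling", and a measure comparing the generated output against the training input. This is an illustrative assumption, not the authors' code: the choice of PyTorch, all class and function names, and the hyperparameters are placeholders, and the bigram-novelty function is only a crude stand-in for the paper's grammatical-abstraction metric.

    # Hypothetical sketch (not the paper's code): an LSTM language model that
    # "babbles" by sampling from its own distribution, plus a crude novelty
    # proxy comparing its output to the training input.
    import torch
    import torch.nn as nn

    class BabblingLSTM(nn.Module):
        def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens, state=None):
            # tokens: (batch, seq_len) tensor of token ids
            output, state = self.lstm(self.embed(tokens), state)
            return self.out(output), state

        @torch.no_grad()
        def babble(self, start_token, length=20, temperature=1.0):
            # Sample a token sequence ("babbling") from the model.
            tokens, state = [start_token], None
            inp = torch.tensor([[start_token]])
            for _ in range(length):
                logits, state = self(inp, state)
                probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
                nxt = int(torch.multinomial(probs, 1).item())
                tokens.append(nxt)
                inp = torch.tensor([[nxt]])
            return tokens

    def bigram_novelty(generated, training):
        # Fraction of generated bigrams never seen in the training input:
        # a rough proxy for abstraction beyond memorisation (NOT the
        # paper's measure, which targets grammatical structure).
        seen = set(zip(training, training[1:]))
        pairs = list(zip(generated, generated[1:]))
        return sum(p not in seen for p in pairs) / max(len(pairs), 1)

Training such a model as a next-word predictor over child-directed utterances and calling babble() at successive checkpoints would yield the kind of over-time trajectory the paper analyses, with the caveat that its actual metric quantifies grammatical abstraction rather than surface n-gram overlap.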

Related research

09/12/2019
Understanding LSTM – a tutorial into Long Short-Term Memory Recurrent Neural Networks
Long Short-Term Memory Recurrent Neural Networks (LSTM-RNN) are one of t...

04/05/2019
Short note on the behavior of recurrent neural network for noisy dynamical system
The behavior of recurrent neural network for the data-driven simulation ...

11/19/2018
Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition
Recurrent Neural Networks (RNNs) and their variants, such as Long-Short ...

08/09/2017
Tikhonov Regularization for Long Short-Term Memory Networks
It is a well-known fact that adding noise to the input data often improv...

09/01/2020
Analysis of memory in LSTM-RNNs for source separation
Long short-term memory recurrent neural networks (LSTM-RNNs) are conside...

11/02/2018
On Evaluating the Generalization of LSTM Models in Formal Languages
Recurrent Neural Networks (RNNs) are theoretically Turing-complete and e...

06/09/2019
LSTM Networks Can Perform Dynamic Counting
In this paper, we systematically assess the ability of standard recurren...
