Using Regular Languages to Explore the Representational Capacity of Recurrent Neural Architectures

08/15/2018
by   Abhijit Mahalunkar, et al.

The presence of Long Distance Dependencies (LDDs) in sequential data poses significant challenges for computational models. Various recurrent neural architectures have been designed to mitigate this issue. In order to test these state-of-the-art architectures, there is a growing need for rich benchmarking datasets. However, one drawback of existing datasets is the lack of experimental control with regard to the presence and/or degree of LDDs. This lack of control limits the analysis of model performance in relation to the specific challenge posed by LDDs. One way to address this is to use synthetic data having the properties of subregular languages. The degree of LDDs within the generated data can be controlled through the k parameter, the length of the generated strings, and the choice of forbidden strings. In this paper, we explore the capacity of different RNN extensions to model LDDs by evaluating these models on a sequence of SPk synthesized datasets, where each subsequent dataset exhibits LDDs of greater length. Even though SPk are simple languages, the presence of LDDs has a significant impact on the performance of recurrent neural architectures, making these languages prime candidates for benchmarking tasks.
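To make the SPk construction concrete: a string belongs to a Strictly Piecewise language of order k (SPk) if and only if it contains none of a chosen set of forbidden length-k subsequences, where a subsequence need not be contiguous. The following is a minimal Python sketch of such a membership check; the function names are illustrative and not taken from the paper:

```python
def contains_subsequence(string, pattern):
    """Return True if `pattern` occurs in `string` as a (possibly
    non-contiguous) subsequence."""
    it = iter(string)
    # `ch in it` advances the iterator until it finds ch (or exhausts it),
    # so successive checks respect left-to-right order.
    return all(ch in it for ch in pattern)

def in_spk_language(string, forbidden):
    """A string is in the SPk language iff it avoids every forbidden
    length-k subsequence."""
    return not any(contains_subsequence(string, f) for f in forbidden)

# SP2 over {a, b, x} with the forbidden subsequence "ab":
# "ba" is grammatical (no 'a' precedes a 'b'),
# "axb" is not ('a' ... 'b' occurs as a subsequence).
print(in_spk_language("ba", ["ab"]))   # True
print(in_spk_language("axb", ["ab"]))  # False
```

Because the forbidden pair may be separated by arbitrarily much intervening material (as in "axb"), increasing the string length directly lengthens the dependency a model must track, which is how the datasets control the degree of LDDs.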


