Stronger Separation of Analog Neuron Hierarchy by Deterministic Context-Free Languages

02/02/2021
by Jiří Šíma, et al.

We analyze the computational power of discrete-time recurrent neural networks (NNs) with the saturated-linear activation function within the Chomsky hierarchy. This model restricted to integer weights coincides with binary-state NNs with the Heaviside activation function, which are equivalent to finite automata (Chomsky level 3) recognizing regular languages (REG), while rational weights make this model Turing-complete even with only three analog-state units (Chomsky level 0). For the intermediate model αANN, a binary-state NN extended with α ≥ 0 extra analog-state neurons with rational weights, we have established the analog neuron hierarchy 0ANNs ⊂ 1ANNs ⊂ 2ANNs ⊆ 3ANNs. The separation 1ANNs ⫋ 2ANNs has been witnessed by the non-regular deterministic context-free language (DCFL) L_# = {0^n 1^n | n ≥ 1}, which cannot be recognized by any 1ANN even with real weights, while every DCFL (Chomsky level 2) is accepted by a 2ANN with rational weights. In this paper, we strengthen this separation by showing that no non-regular DCFL can be recognized by a 1ANN even with real weights, which means (DCFLs ∖ REG) ⊂ (2ANNs ∖ 1ANNs) and implies 1ANNs ∩ DCFLs = 0ANNs. For this purpose, we show that L_# is the simplest non-regular DCFL in the sense that L_# reduces to every language in this class, which is an interesting achievement in computability theory in its own right.
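For intuition only (this is not the paper's construction), the following Python sketch illustrates the mechanism that makes L_# natural for analog-state neurons: a single rational-valued state in [0, 1], updated by rational weights and passed through the saturated-linear activation σ(x) = max(0, min(1, x)), can act as a unary counter. The function names saturated_linear and recognize_L_hash are hypothetical labels introduced for this sketch.

```python
from fractions import Fraction

def saturated_linear(x):
    """Saturated-linear activation: sigma(x) = max(0, min(1, x))."""
    return max(Fraction(0), min(Fraction(1), x))

def recognize_L_hash(word):
    """Toy recognizer for L_# = {0^n 1^n | n >= 1}.

    Illustrative sketch, not the paper's 2ANN construction: a single
    rational 'analog' state s is halved on each '0' (weight 1/2) and
    doubled on each '1' (weight 2, clipped by the saturation), so after
    0^n 1^m the state returns to exactly 1 iff m = n.
    """
    s = Fraction(1)       # analog state, starts at 1; after 0^n it is 2^(-n)
    zeros_done = False    # finite-state control enforcing the 0^* 1^* shape
    seen_zero = False

    for symbol in word:
        if symbol == '0':
            if zeros_done:
                return False              # a '0' after a '1': wrong shape
            seen_zero = True
            s = saturated_linear(s * Fraction(1, 2))
        elif symbol == '1':
            zeros_done = True
            if s == 1:
                return False              # more 1s than 0s: counter already full
            s = saturated_linear(s * 2)
        else:
            return False                  # alphabet is {0, 1}
    return seen_zero and zeros_done and s == 1

assert recognize_L_hash("0011")
assert not recognize_L_hash("001")       # too few 1s: s ends below 1
assert not recognize_L_hash("00111")     # too many 1s: rejected before saturating
assert not recognize_L_hash("10")        # wrong shape
```

Note the explicit check `if s == 1` before doubling: without it, the saturation would silently absorb surplus 1s and accept 0^n 1^m for m > n. The paper's separation result says precisely that this kind of exact counting is beyond a single analog neuron added to a binary-state network.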


