Low-Dimensional Manifolds Support Multiplexed Integrations in Recurrent Neural Networks

11/20/2020
by Arnaud Fanthomme, et al.

We study the learning dynamics and the representations emerging in Recurrent Neural Networks (RNNs) trained to integrate one or multiple temporal signals. Combining analytical and numerical investigations, we characterize the conditions under which an RNN with n neurons learns to integrate D (≪ n) scalar signals of arbitrary duration. We show, both for linear and ReLU neurons, that its internal state lives close to a D-dimensional manifold, whose shape is related to the activation function. Each neuron therefore carries, to various degrees, information about the value of all integrals. We discuss the deep analogy between our results and the concept of mixed selectivity forged by computational neuroscientists to interpret cortical recordings.
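To make the integration task concrete, the short NumPy sketch below hand-builds a linear RNN that integrates a single scalar signal (the D = 1 case). It is an illustrative construction under stated assumptions, not the authors' trained networks: the rank-one weight choice and the conditions w @ W = w and w @ u = 1 are simply one way to satisfy perfect integration with linear units.

import numpy as np

# Minimal sketch of a linear RNN integrating one scalar signal (D = 1).
# Hand-built illustration, not the paper's trained networks: the weights
# are chosen so that w @ W = w and w @ u = 1, which make the readout
# y_t = w @ h_t obey y_{t+1} = y_t + x_t, i.e. perfect integration.
rng = np.random.default_rng(0)
n = 50                         # number of neurons
w = rng.normal(size=n)         # readout vector
W = np.outer(w, w) / (w @ w)   # recurrent weights: rank-one projector onto w
u = w / (w @ w)                # input weights, normalized so that w @ u = 1

def run(x):
    """Iterate h_{t+1} = W @ h_t + u * x_t and return the readouts y_t."""
    h = np.zeros(n)
    ys = []
    for x_t in x:
        h = W @ h + u * x_t
        ys.append(w @ h)
    return np.array(ys)

x = rng.normal(size=200)                  # input signal of arbitrary duration
assert np.allclose(run(x), np.cumsum(x))  # readout = running integral of x

# The internal state never leaves the line spanned by w: a one-dimensional
# manifold, i.e. the linear D = 1 case of the manifolds described above.

With ReLU units the attracting set is no longer a straight line, which is the activation-dependent shape the abstract refers to; multiplexing D > 1 signals amounts to embedding D such integration directions within the n-dimensional state space.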
