Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data

05/05/2020
by   Alessandro Salatiello, et al.

Recurrent Neural Networks (RNNs) are popular models of brain function. The typical training strategy is to adjust their input-output behavior so that it matches that of the biological circuit of interest. Although this strategy ensures that the biological and artificial networks perform the same computational task, it does not guarantee that their internal activity dynamics match. The trained RNNs might therefore end up performing the task through a different internal computational mechanism, which would make them suboptimal models of the biological circuit. In this work, we introduce a novel training strategy that makes it possible to learn not only the input-output behavior of an RNN but also its internal network dynamics, based on sparse neural recordings. We test the proposed method by training an RNN to simultaneously reproduce the internal dynamics and output signals of a physiologically inspired neural model. Specifically, this model generates the multiphasic muscle-like activity patterns typically observed during the execution of reaching movements, based on the oscillatory activation patterns concurrently observed in the motor cortex. Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons sampled from the biological network. Furthermore, we show that training RNNs with this method significantly improves their generalization performance. Overall, our results suggest that the proposed method is suitable for building powerful functional RNN models, which automatically capture important computational properties of the biological circuit of interest from sparse neural recordings.
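The abstract does not spell out the training objective, but the core idea — penalizing both output error and activity error on a sparsely recorded subset of neurons — can be sketched as follows. This is an illustrative toy, not the paper's actual algorithm: the simple rate-RNN form, the loss weighting `lam`, and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, T = 50, 2, 100   # network size, number of outputs, timesteps
K = 5                  # number of recorded ("sparsely sampled") neurons

def run_rnn(W, W_out, x0, T):
    """Simulate a simple rate RNN: x_{t+1} = tanh(W x_t), y_t = W_out x_t."""
    xs = [x0]
    for _ in range(T - 1):
        xs.append(np.tanh(W @ xs[-1]))
    X = np.stack(xs)      # (T, N) internal activity
    Y = X @ W_out.T       # (T, M) output signals
    return X, Y

# Stand-in for the "biological" circuit, and a sparse recording from it
W_teacher = rng.normal(0, 1 / np.sqrt(N), (N, N))
W_out = rng.normal(0, 1 / np.sqrt(N), (M, N))
x0 = rng.normal(0, 1, N)
X_teacher, Y_teacher = run_rnn(W_teacher, W_out, x0, T)
recorded = rng.choice(N, K, replace=False)   # indices of recorded neurons

def combined_loss(W_student, lam=1.0):
    """Output error plus activity error on the recorded subset only."""
    X, Y = run_rnn(W_student, W_out, x0, T)
    out_err = np.mean((Y - Y_teacher) ** 2)
    act_err = np.mean((X[:, recorded] - X_teacher[:, recorded]) ** 2)
    return out_err + lam * act_err

# A student with the teacher's weights reproduces both terms exactly
print("loss at teacher weights:", combined_loss(W_teacher))  # → 0.0
```

The key point the sketch illustrates is that the activity term uses only the `K` recorded neurons, so the objective is well defined even when most of the biological network is unobserved; in practice this loss would be minimized by gradient descent over the student's recurrent weights.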


