Low Tensor Rank Learning of Neural Dynamics

08/22/2023
by Arthur Pellegrino et al.

Learning relies on coordinated synaptic changes in recurrently connected populations of neurons. Understanding the collective evolution of synaptic connectivity over learning is therefore a key challenge in neuroscience and machine learning. Recent work has shown that the weight matrices of task-trained RNNs are typically low rank, but how this low-rank structure unfolds over learning is unknown. To address this, we investigate the rank of the 3-tensor formed by stacking the weight matrices throughout learning. By fitting RNNs of varying rank to large-scale neural recordings during a motor learning task, we find that the inferred weights are low-tensor-rank and therefore evolve within a fixed low-dimensional subspace throughout the entire course of learning. We next validate the observation of low-tensor-rank learning on an RNN trained to solve the same task, both by performing a low-tensor-rank decomposition directly on the ground-truth weights and by showing that the method we applied to the data faithfully recovers this low-rank structure. Finally, we present a set of mathematical results bounding the matrix and tensor ranks of gradient descent learning dynamics, which show that low-tensor-rank weights emerge naturally in RNNs trained to solve low-dimensional tasks. Taken together, our findings provide novel constraints on the evolution of population connectivity over learning in both biological and artificial neural networks, and enable reverse engineering of learning-induced changes in recurrent network dynamics from large-scale neural recordings.
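To make the central analysis concrete, the sketch below fits a rank-R canonical polyadic (CP) decomposition to a 3-tensor of recurrent weight matrices stacked across training epochs, using a plain-NumPy alternating-least-squares loop. This is only a minimal illustration of the kind of low-tensor-rank decomposition the abstract describes, not the authors' fitting pipeline; the array `W_epochs`, the chosen rank, and the iteration count are all placeholder assumptions.

```python
# Minimal CP-ALS sketch: decompose an (epochs x N x N) stack of recurrent
# weight matrices into rank-R components. Illustrative only; not the
# authors' actual method or data.
import numpy as np

def khatri_rao(X, Y):
    """Column-wise Kronecker product: (I*J, R) from (I, R) and (J, R)."""
    R = X.shape[1]
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, R)

def cp_als(T, rank, n_iter=200, seed=0):
    """Fit T[e, i, j] ~= sum_r A[e, r] * B[i, r] * C[j, r] by updating
    each factor in turn with a linear least-squares solve."""
    rng = np.random.default_rng(seed)
    E, N, M = T.shape
    A = rng.standard_normal((E, rank))
    B = rng.standard_normal((N, rank))
    C = rng.standard_normal((M, rank))
    # Mode-n unfoldings (C order): rows index mode n, columns the rest.
    T0 = T.reshape(E, -1)                      # (E, N*M)
    T1 = np.moveaxis(T, 1, 0).reshape(N, -1)   # (N, E*M)
    T2 = np.moveaxis(T, 2, 0).reshape(M, -1)   # (M, E*N)
    for _ in range(n_iter):
        A = T0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Usage: stack weight matrices saved at each training epoch, then compare
# reconstruction error across candidate ranks to estimate the tensor rank.
W_epochs = np.random.randn(50, 20, 20)  # placeholder data, not real weights
A, B, C = cp_als(W_epochs, rank=3)
W_hat = np.einsum('er,ir,jr->eij', A, B, C)
rel_err = np.linalg.norm(W_epochs - W_hat) / np.linalg.norm(W_epochs)
print(f"relative reconstruction error: {rel_err:.3f}")
```

Under this factorization, the epoch factor A traces how each component waxes or wanes over learning, while the neuron factors B and C span the fixed low-dimensional subspace within which the weights evolve.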

Related research

02/01/2023
Experimental observation on a low-rank tensor model for eigenvalue problems
Here we utilize a low-rank tensor model (LTM) as a function approximator...

05/22/2018
Low-Rank Tensor Decomposition via Multiple Reshaping and Reordering Operations
Tensor decomposition has been widely applied to find low-rank representa...

08/19/2020
LOCUS: A Novel Decomposition Method for Brain Network Connectivity Matrices using Low-rank Structure with Uniform Sparsity
Network-oriented research has been increasingly popular in many scientif...

10/22/2020
Beyond Lazy Training for Over-parameterized Tensor Decomposition
Over-parametrization is an important technique in training neural networ...

10/19/2020
Connecting Weighted Automata, Tensor Networks and Recurrent Neural Networks through Spectral Learning
In this paper, we present connections between three models used in diffe...

11/06/2017
Extracting low-dimensional dynamics from multiple large-scale neural population recordings by learning to predict correlations
A powerful approach for understanding neural population dynamics is to e...

10/27/2022
On the biological plausibility of orthogonal initialisation for solving gradient instability in deep neural networks
Initialising the synaptic weights of artificial neural networks (ANNs) w...
