Beyond Geometry: Comparing the Temporal Structure of Computation in Neural Circuits with Dynamical Similarity Analysis

06/16/2023
by   Mitchell Ostrow, et al.

How can we tell whether two neural networks are using the same internal processes for a particular computation? This question is pertinent for multiple subfields of both neuroscience and machine learning, including neuroAI, mechanistic interpretability, and brain-machine interfaces. Standard approaches for comparing neural networks focus on the spatial geometry of latent states. Yet in recurrent networks, computations are implemented at the level of neural dynamics, which do not have a simple one-to-one mapping with geometry. To bridge this gap, we introduce a novel similarity metric that compares two systems at the level of their dynamics. Our method has two components. First, using recent advances in data-driven dynamical systems theory, we learn a high-dimensional linear system that accurately captures core features of the original nonlinear dynamics. Next, we compare these linear approximations via a novel extension of Procrustes Analysis that accounts for how vector fields change under orthogonal transformation. Via four case studies, we demonstrate that our method effectively identifies and distinguishes dynamic structure in recurrent neural networks (RNNs), whereas geometric methods fall short. We additionally show that our method can distinguish learning rules in an unsupervised manner. Our method therefore opens the door to novel data-driven analyses of the temporal structure of neural computation, and to more rigorous testing of RNNs as models of the brain.
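The two components above can be sketched in miniature. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the function names `fit_linear_dynamics` and `dynamical_dissimilarity` are hypothetical, the linear fit is a plain DMD-style least-squares regression (omitting the delay embedding and rank truncation a real pipeline would use), and the Procrustes-over-conjugation step is solved by generic gradient-based optimization over the special orthogonal group rather than the paper's exact procedure.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize


def fit_linear_dynamics(X):
    """Least-squares fit of A in x[t+1] ~= A @ x[t] (a DMD-style regression).

    X has shape (T, d): T time steps of a d-dimensional state. A real
    pipeline would first delay-embed the data so that a linear model can
    capture nonlinear dynamics; this sketch skips that step for brevity."""
    X0, X1 = X[:-1].T, X[1:].T
    return X1 @ np.linalg.pinv(X0)


def dynamical_dissimilarity(A, B, n_restarts=5, seed=0):
    """Minimize ||A - Q B Q^T||_F over special orthogonal Q.

    Q is parameterized as expm(S - S^T), which is orthogonal for any S,
    and the cost is minimized by generic gradient-based optimization
    with a few random restarts (the landscape is nonconvex)."""
    d = A.shape[0]
    rng = np.random.default_rng(seed)

    def cost(s):
        S = s.reshape(d, d)
        Q = expm(S - S.T)  # orthogonal by construction
        return np.linalg.norm(A - Q @ B @ Q.T, "fro")

    best = np.inf
    for _ in range(n_restarts):
        res = minimize(cost, rng.standard_normal(d * d), method="L-BFGS-B")
        best = min(best, res.fun)
    return best


# Toy example: the same linear dynamics expressed in two rotated bases
# should be judged equivalent, while genuinely different dynamics should not.
rng = np.random.default_rng(1)
A = 0.5 * rng.standard_normal((3, 3))
S0 = rng.standard_normal((3, 3))
Q0 = expm(S0 - S0.T)       # a random special orthogonal matrix
B = Q0 @ A @ Q0.T          # same vector field, rotated coordinates

print(dynamical_dissimilarity(A, B))      # near zero: same dynamics
print(dynamical_dissimilarity(A, 2 * A))  # bounded away from zero
```

Note the key design choice this illustrates: the metric compares the fitted dynamics matrices themselves, up to an orthogonal change of basis, rather than comparing the geometry of the latent states those dynamics produce.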

