Analyzing Populations of Neural Networks via Dynamical Model Embedding

02/27/2023
by Jordan Cotler, et al.

A core challenge in the interpretation of deep neural networks is identifying commonalities between the underlying algorithms implemented by distinct networks trained for the same task. Motivated by this problem, we introduce DYNAMO, an algorithm that constructs low-dimensional manifolds where each point corresponds to a neural network model, and two points are nearby if the corresponding neural networks enact similar high-level computational processes. DYNAMO takes as input a collection of pre-trained neural networks and outputs a meta-model that emulates the dynamics of the hidden states as well as the outputs of any model in the collection. The specific model to be emulated is determined by a model embedding vector that the meta-model takes as input; these model embedding vectors constitute a manifold corresponding to the given population of models. We apply DYNAMO to both RNNs and CNNs, and find that the resulting model embedding spaces enable novel applications: clustering of neural networks on the basis of their high-level computational processes in a manner that is less sensitive to reparameterization; model averaging of several neural networks trained on the same task to arrive at a new, operable neural network with similar task performance; and semi-supervised learning via optimization on the model embedding space. Using a fixed-point analysis of meta-models trained on populations of RNNs, we gain new insights into how similarities of the topology of RNN dynamics correspond to similarities of their high-level computational processes.
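To make the meta-model idea concrete, here is a minimal illustrative sketch (not the authors' implementation): a single set of shared weights defines hidden-state dynamics that are conditioned on a model embedding vector, so that each embedding selects a different emulated RNN, and a point between two embeddings is itself an operable model, loosely mirroring the model-averaging application described above. All dimensions, weights, and embedding values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: hidden state, input, and model-embedding dimensions.
d_h, d_x, d_z = 8, 4, 2

# Shared meta-model weights. The embedding z is concatenated with the
# hidden state and input, so one weight matrix parameterizes a whole
# family of RNN dynamics indexed by z.
W = rng.normal(0.0, 0.3, size=(d_h, d_h + d_x + d_z))
b = np.zeros(d_h)

def meta_step(h, x, z):
    """One hidden-state update of the meta-model, for the model embedded at z."""
    return np.tanh(W @ np.concatenate([h, x, z]) + b)

def run(z, xs):
    """Roll the meta-model forward on inputs xs; z selects which model to emulate."""
    h = np.zeros(d_h)
    for x in xs:
        h = meta_step(h, x, z)
    return h

# Hypothetical embeddings for two trained models in the population.
z_a = np.array([1.0, 0.0])
z_b = np.array([0.0, 1.0])

xs = [rng.normal(size=d_x) for _ in range(5)]

h_a = run(z_a, xs)
h_b = run(z_b, xs)

# "Model averaging": the midpoint in embedding space is run by the
# same meta-model and yields another operable set of dynamics.
h_avg = run(0.5 * (z_a + z_b), xs)
```

In the paper, the meta-model and the embedding vectors are learned jointly from a population of pre-trained networks; this sketch only shows the structural idea of embedding-conditioned dynamics with fixed random weights.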


