Identifying Learning Rules From Neural Network Observables

10/22/2020
by Aran Nayebi, et al.

The brain modifies its synaptic strengths during learning in order to better adapt to its environment. However, the underlying plasticity rules that govern learning are unknown. Many proposals have been suggested, including Hebbian mechanisms, explicit error backpropagation, and a variety of alternatives. It is an open question what specific experimental measurements would need to be made to determine whether any given learning rule is operative in a real biological system. In this work, we take a "virtual experimental" approach to this problem. Simulating idealized neuroscience experiments with artificial neural networks, we generate a large-scale dataset of learning trajectories of aggregate statistics measured across a variety of neural network architectures, loss functions, learning rule hyperparameters, and parameter initializations. We then take a discriminative approach, training linear and simple non-linear classifiers to identify learning rules from features based on these observables. We show that different classes of learning rules can be separated solely on the basis of aggregate statistics of the weights, activations, or instantaneous layer-wise activity changes, and that these results generalize to settings with limited access to the trajectory and to held-out architectures and learning curricula. We identify the statistics of each observable that are most relevant for rule identification, finding that statistics from network activities across training are more robust to unit undersampling and measurement noise than those obtained from the synaptic strengths. Our results suggest that activation patterns, available from electrophysiological recordings of post-synaptic activities on the order of several hundred units, frequently measured at wider intervals over the course of learning, may provide a good basis on which to identify learning rules.
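As a rough, purely illustrative sketch of this discriminative approach (not the authors' actual pipeline), the Python snippet below reduces an activation trajectory to a handful of per-checkpoint aggregate statistics and trains a logistic-regression classifier to separate two learning rules. The trajectory generator and all names here are assumptions made for illustration; the real study uses trajectories recorded from trained networks.

```python
# Hypothetical sketch: classify which learning rule generated a training
# trajectory from aggregate statistics of an observable (here, activations).
# The synthetic data generator below is illustrative only, NOT from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def aggregate_stats(observable):
    # observable: (checkpoints, units) array, e.g. post-synaptic activations
    # sampled at intervals over the course of training.
    # Reduce across units at each checkpoint (unit identities are discarded),
    # then flatten the per-checkpoint summaries into one feature vector.
    stats = [observable.mean(axis=1),
             observable.var(axis=1),
             np.abs(observable).mean(axis=1),        # mean absolute value
             np.quantile(observable, 0.75, axis=1)]  # upper quartile
    return np.concatenate(stats)

def fake_trajectory(rule, checkpoints=20, units=300):
    # Two made-up "learning rules" whose activation statistics drift at
    # slightly different rates over training (purely illustrative).
    drift = 0.05 if rule == 0 else 0.12
    t = np.arange(checkpoints)[:, None]
    return rng.normal(loc=drift * t, scale=1.0, size=(checkpoints, units))

X = np.stack([aggregate_stats(fake_trajectory(rule))
              for rule in (0, 1) for _ in range(200)])
y = np.repeat([0, 1], 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

The key design point mirrored here is that the classifier only ever sees summary statistics pooled across units at each measurement time, consistent with the paper's finding that aggregate observables, rather than unit-level weights, can suffice to separate rule classes.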


Related research

11/14/2014
A framework for studying synaptic plasticity with neural spike train data
Learning and memory in the brain are implemented by complex, time-varyin...

03/22/2019
Learning with Delayed Synaptic Plasticity
The plasticity property of biological neural networks allows them to per...

06/22/2015
A Theory of Local Learning, the Learning Channel, and the Optimality of Backpropagation
In a physical neural system, where storage and processing are intimately...

03/17/2021
Augmenting Supervised Learning by Meta-learning Unsupervised Local Rules
The brain performs unsupervised learning and (perhaps) simultaneous supe...

03/17/2023
A Two-Step Rule for Backpropagation
We present a simplified computational rule for the back-propagation form...

04/02/2016
Stability of Analytic Neural Networks with Event-triggered Synaptic Feedbacks
In this paper, we investigate stability of a class of analytic neural ne...
