Synaptic Weight Distributions Depend on the Geometry of Plasticity

by Roman Pogodin, et al.

Most learning algorithms in machine learning rely on gradient descent to adjust model parameters, and a growing literature in computational neuroscience leverages these ideas to study synaptic plasticity in the brain. However, the vast majority of this work ignores a critical underlying assumption: the choice of distance for synaptic changes (i.e. the geometry of synaptic plasticity). Gradient descent assumes that the distance is Euclidean, but many other distances are possible, and there is no reason that biology necessarily uses Euclidean geometry. Here, using the theoretical tools provided by mirror descent, we show that, regardless of the loss being minimized, the distribution of synaptic weights will depend on the geometry of synaptic plasticity. We use these results to show that the experimentally observed log-normal weight distributions found in several brain areas are not consistent with standard gradient descent (i.e. a Euclidean geometry), but rather with non-Euclidean distances. Finally, we show that it should be possible to experimentally test for different synaptic geometries by comparing synaptic weight distributions before and after learning. Overall, this work shows that the current paradigm in theoretical work on synaptic plasticity, which assumes a Euclidean synaptic geometry, may be misguided, and that it should be possible to experimentally determine the true geometry of synaptic plasticity in the brain.
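The core contrast in the abstract can be illustrated with a toy sketch (not from the paper itself): Euclidean gradient descent applies additive weight updates, while mirror descent with a non-Euclidean potential, here the standard exponentiated-gradient instance, applies multiplicative updates. Multiplicative updates accumulate in the log domain, so the resulting weights stay positive and their distribution is skewed in a log-normal-like way, regardless of the loss. All variable names and parameter values below are illustrative choices, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_weights = 200, 1000

# Toy regression task with log-normally distributed target weights,
# loosely mimicking experimentally observed synaptic weight statistics.
X = rng.normal(size=(n_samples, n_weights))
w_true = rng.lognormal(mean=-3.0, sigma=1.0, size=n_weights)
y = X @ w_true

def loss_grad(w):
    """Gradient of the mean-squared-error loss."""
    return X.T @ (X @ w - y) / n_samples

eta, steps = 1e-3, 500
w_gd = np.full(n_weights, np.exp(-3.0))  # Euclidean: additive updates
w_eg = np.full(n_weights, np.exp(-3.0))  # non-Euclidean: multiplicative updates

for _ in range(steps):
    g_gd, g_eg = loss_grad(w_gd), loss_grad(w_eg)
    w_gd = w_gd - eta * g_gd            # gradient descent
    w_eg = w_eg * np.exp(-eta * g_eg)   # exponentiated gradient (mirror descent)

# Multiplicative updates keep every weight strictly positive;
# additive updates carry no such constraint.
assert np.all(w_eg > 0)
```

The same loss is minimized in both cases; only the update geometry differs, which is exactly the point of the paper: the weight distribution after learning carries a signature of the plasticity geometry, not just of the objective.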


