
Using local plasticity rules to train recurrent neural networks

05/28/2019
by Owen Marschall et al.

To learn useful dynamics on long time scales, neurons must use plasticity rules that account for long-term, circuit-wide effects of synaptic changes. In other words, neural circuits must solve a credit assignment problem to appropriately assign responsibility for global network behavior to individual circuit components. Furthermore, biological constraints demand that plasticity rules be spatially and temporally local; that is, synaptic changes can depend only on variables accessible to the pre- and postsynaptic neurons. While artificial intelligence offers a computational solution for credit assignment, namely backpropagation through time (BPTT), this solution is wildly biologically implausible: it requires both nonlocal computations and unlimited memory capacity, as any synaptic change is a complicated function of the entire history of network activity. Similar nonlocality issues plague other approaches such as FORCE (Sussillo and Abbott, 2009). Overall, we are still missing a model for learning in recurrent circuits that both works computationally and uses only local updates. Leveraging recent advances in machine learning on approximating gradients for BPTT, we derive biologically plausible plasticity rules that enable recurrent networks to accurately learn long-term dependencies in sequential data. The solution takes the form of neurons with segregated voltage compartments, with several synaptic sub-populations that have different functional properties. The network operates in distinct phases during which each synaptic sub-population is updated by its own local plasticity rule. Our results provide new insights into the potential roles of segregated dendritic compartments, branch-specific inhibition, and global circuit phases in learning.
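The abstract does not give the update equations, but the locality requirement it describes can be made concrete with a small sketch. The snippet below is an illustration only, not the authors' method: it implements a generic three-factor, eligibility-trace-style approximation to the BPTT gradient for a leaky rate RNN, with all network sizes, constants, and the random input/target stream assumed for the example. The point is that each recurrent weight change factors into a per-synapse trace built from pre- and postsynaptic variables alone, multiplied by a per-neuron learning signal, so no circuit-wide history of activity needs to be stored.

    # Illustrative sketch only (assumed, not the paper's rule): a three-factor,
    # temporally local update for a leaky rate RNN, in the spirit of
    # eligibility-trace approximations to the BPTT gradient.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_rec, n_out = 3, 20, 2
    W_in = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_rec, n_in))
    W_rec = rng.normal(0.0, 1.0 / np.sqrt(n_rec), (n_rec, n_rec))
    W_out = rng.normal(0.0, 1.0 / np.sqrt(n_rec), (n_out, n_rec))

    alpha, lr = 0.3, 1e-3               # leak factor and learning rate (assumed)
    h = np.zeros(n_rec)                 # recurrent activity
    trace = np.zeros((n_rec, n_rec))    # one eligibility trace per recurrent synapse

    for t in range(200):                # toy input/target stream
        x = rng.normal(size=n_in)
        y_target = rng.normal(size=n_out)

        u = W_rec @ h + W_in @ x        # postsynaptic "voltage"
        h_new = (1 - alpha) * h + alpha * np.tanh(u)

        # Local trace: uses only presynaptic activity h and the postsynaptic
        # quantity tanh'(u); no history of the rest of the network is required.
        trace = (1 - alpha) * trace + alpha * np.outer(1 - np.tanh(u) ** 2, h)

        err = W_out @ h_new - y_target   # readout error
        learning_signal = W_out.T @ err  # per-neuron signal (could be random feedback)

        W_rec -= lr * learning_signal[:, None] * trace   # three-factor update
        W_out -= lr * np.outer(err, h_new)
        h = h_new

What this sketch does not capture is precisely what the paper adds: segregated voltage compartments, functionally distinct synaptic sub-populations, and separate circuit phases in which each sub-population applies its own local rule.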



References

  • [1] D. Sussillo and L. F. Abbott, “Generating coherent patterns of activity from chaotic neural networks,” Neuron, vol. 63, no. 4, pp. 544–557, 2009.
  • [2] S. Pitis, “Recurrent neural networks in TensorFlow I.” https://r2rt.com/recurrent-neural-networks-in-tensorflow-i.html, 2016.
  • [3] M. Jaderberg, W. M. Czarnecki, S. Osindero, O. Vinyals, A. Graves, D. Silver, and K. Kavukcuoglu, “Decoupled neural interfaces using synthetic gradients,” in Proceedings of the 34th International Conference on Machine Learning, vol. 70, pp. 1627–1635, JMLR.org, 2017.
  • [4] J. T. Dudman, D. Tsay, and S. A. Siegelbaum, “A role for synaptic inputs at distal dendrites: instructive signals for hippocampal long-term plasticity,” Neuron, vol. 56, no. 5, pp. 866–879, 2007.
  • [5] P. Somogyi, L. Katona, T. Klausberger, B. Lasztóczi, and T. J. Viney, “Temporal redistribution of inhibition over neuronal subcellular domains underlies state-dependent rhythmic change of excitability in the hippocampus,” Philosophical Transactions of the Royal Society B: Biological Sciences, vol. 369, no. 1635, p. 20120518, 2014.
  • [6] R. Pascanu, T. Mikolov, and Y. Bengio, “On the difficulty of training recurrent neural networks,” in International conference on machine learning, pp. 1310–1318, 2013.
  • [7] J. Guerguiev, T. P. Lillicrap, and B. A. Richards, “Towards deep learning with segregated dendrites,” eLife, vol. 6, p. e22901, 2017.