Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons

10/27/2021
by Paul Haider, et al.

The response time of physical computational elements is finite, and neurons are no exception. In hierarchical models of cortical networks, each layer thus introduces a response lag. This inherent property of physical dynamical systems results in delayed processing of stimuli and causes a timing mismatch between network output and instructive signals, thus afflicting not only inference, but also learning. We introduce Latent Equilibrium, a new framework for inference and learning in networks of slow components which avoids these issues by harnessing the ability of biological neurons to phase-advance their output with respect to their membrane potential. This principle enables quasi-instantaneous inference independent of network depth and avoids the need for phased plasticity or computationally expensive network relaxation phases. We jointly derive disentangled neuron and synapse dynamics from a prospective energy function that depends on a network's generalized position and momentum. The resulting model can be interpreted as a biologically plausible approximation of error backpropagation in deep cortical networks with continuous-time, leaky neuronal dynamics and continuously active, local plasticity. We demonstrate successful learning of standard benchmark datasets, achieving competitive performance using both fully-connected and convolutional architectures, and show how our principle can be applied to detailed models of cortical microcircuitry. Furthermore, we study the robustness of our model to spatio-temporal substrate imperfections to demonstrate its feasibility for physical realization, be it in vivo or in silico.
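The core mechanism can be illustrated with a toy simulation. Below is a minimal sketch (my own construction, not the authors' code, and using assumed variable names such as `u_breve`): a single leaky neuron with membrane dynamics du/dt = (-u + x)/τ lags behind a fast-varying input x(t), but its "prospective" potential u + τ·du/dt phase-advances the response and, for this simple dynamics, tracks the input instantaneously.

```python
import numpy as np

# Sketch of prospective coding (assumed notation, not the paper's code):
# a leaky neuron integrates du/dt = (-u + x) / tau, so its membrane
# potential u lags the input x. The prospective potential
#   u_breve = u + tau * du/dt
# cancels this lag; for these dynamics u_breve(t) equals x(t) exactly.

tau = 10.0                            # membrane time constant (ms)
dt = 0.1                              # Euler integration step (ms)
t = np.arange(2000) * dt
x = np.sin(2 * np.pi * t / 50.0)      # fast-varying input signal

u = 0.0
u_trace, breve_trace = [], []
for xi in x:
    dudt = (-u + xi) / tau
    u_breve = u + tau * dudt          # prospective (phase-advanced) output
    u += dt * dudt                    # forward-Euler membrane update
    u_trace.append(u)
    breve_trace.append(u_breve)

# The lagging membrane potential deviates strongly from the input,
# while the prospective output follows it without delay.
lag_err = np.max(np.abs(np.array(u_trace) - x))
prosp_err = np.max(np.abs(np.array(breve_trace) - x))
print(lag_err, prosp_err)
```

In a deep network this lag compounds layer by layer, which is why reading out the prospective potential rather than the membrane potential makes inference quasi-instantaneous regardless of depth.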


Related research

[07/30/2019] Temporal coding in spiking neural networks with alpha synaptic function
The timing of individual neuronal spikes is essential for biological bra...

[06/15/2020] Equilibrium Propagation for Complete Directed Neural Networks
Artificial neural networks, one of the most successful approaches to sup...

[10/23/2020] A biologically plausible neural network for Slow Feature Analysis
Learning latent features from time series data is an important problem i...

[09/25/2016] Learning by Stimulation Avoidance: A Principle to Control Spiking Neural Networks Dynamics
Learning based on networks of real neurons, and by extension biologicall...

[05/28/2019] Inference with Hybrid Bio-hardware Neural Networks
To understand the learning process in brains, biologically plausible alg...

[10/12/2022] Dynamic neuronal networks efficiently achieve classification in robotic interactions with real-world objects
Biological cortical networks are potentially fully recurrent networks wi...

[11/21/2022] Learning on tree architectures outperforms a convolutional feedforward network
Advanced deep learning architectures consist of tens of fully connected ...
