Dendritic cortical microcircuits approximate the backpropagation algorithm

10/26/2018
by Joao Sacramento, et al.

Deep learning has seen remarkable developments over recent years, many of them inspired by neuroscience. However, the main learning mechanism behind these advances - error backpropagation - appears to be at odds with neurobiology. Here, we introduce a multilayer neuronal network model with simplified dendritic compartments in which error-driven synaptic plasticity adapts the network towards a global desired output. In contrast to previous work, our model does not require separate learning phases; synaptic learning is driven continuously in time by local dendritic prediction errors. Such errors originate at apical dendrites and arise from a mismatch between predictive input from lateral interneurons and actual top-down feedback activity. Through the use of simple dendritic compartments and distinct cell types, our model can represent both error and normal activity within a single pyramidal neuron. We demonstrate the learning capabilities of the model in regression and classification tasks, and show analytically that it approximates the error backpropagation algorithm. Moreover, our framework is consistent with recent observations of learning between brain areas and with the architecture of cortical microcircuits. Overall, we introduce a novel view of learning on dendritic cortical circuits and of how the brain may solve the long-standing synaptic credit assignment problem.
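The core idea - a local apical error signal that coincides with the backpropagated gradient - can be sketched in a toy student-teacher regression. This is an illustrative assumption-laden sketch, not the paper's neuron dynamics: network sizes, learning rate, and the `train` function are made up here, and the top-down feedback uses the exact weight transpose, so the cancellation that the paper's lateral interneurons only approximate is taken as exact. Each hidden unit's "apical error" is then the fed-back output mismatch, and the local outer-product update equals backprop's gradient step.

```python
import numpy as np

def train(steps=3000, eta=0.05, seed=0):
    """Online student-teacher regression where hidden-layer plasticity is
    driven by a local 'apical' error signal (here, exact backprop feedback)."""
    rng = np.random.default_rng(seed)
    n_in, n_hid, n_out = 3, 5, 2

    # Student: bottom-up weights of hidden pyramidal and output neurons.
    W1 = rng.normal(0.0, 0.5, (n_hid, n_in))
    W2 = rng.normal(0.0, 0.5, (n_out, n_hid))

    # Teacher network defines the target input-output mapping.
    T1 = rng.normal(0.0, 0.5, (n_hid, n_in))
    T2 = rng.normal(0.0, 0.5, (n_out, n_hid))

    errs = []
    for _ in range(steps):
        x = rng.normal(0.0, 1.0, n_in)
        target = T2 @ np.tanh(T1 @ x)

        h = np.tanh(W1 @ x)   # somatic activity of hidden pyramidal neurons
        y = W2 @ h            # output-layer activity

        e_out = target - y    # output mismatch (nudged vs. actual activity)
        errs.append(float(e_out @ e_out))

        # Apical compartment: top-down feedback of the output error. With
        # ideal interneuron cancellation this reduces to the backprop term.
        apical = (W2.T @ e_out) * (1.0 - h**2)

        # Local, continuous-time-style plasticity: error x presynaptic rate.
        W2 += eta * np.outer(e_out, h)
        W1 += eta * np.outer(apical, x)
    return errs

if __name__ == "__main__":
    errs = train()
    print(f"mean squared error, first 300 steps: {np.mean(errs[:300]):.4f}")
    print(f"mean squared error, last 300 steps:  {np.mean(errs[-300:]):.4f}")
```

Because the feedback pathway here is the transpose of the forward weights, the hidden-layer update is exactly the backprop gradient; the paper's contribution is showing that a plausible microcircuit (fixed feedback weights plus self-predicting interneurons) approximates this signal without weight transport.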


Related research

12/30/2017 · Dendritic error backpropagation in deep cortical microcircuits
Animal behaviour depends on learning to associate sensory stimuli with t...

11/15/2019 · Ghost Units Yield Biologically Plausible Backprop in Deep Neural Networks
In the past few years, deep learning has transformed artificial intellig...

06/23/2022 · Single-phase deep learning in cortico-cortical networks
The error-backpropagation (backprop) algorithm remains the most common s...

03/26/2023 · Lazy learning: a biologically-inspired plasticity rule for fast and energy efficient synaptic plasticity
When training neural networks for classification tasks with backpropagat...

05/28/2019 · Using local plasticity rules to train recurrent neural networks
To learn useful dynamics on long time scales, neurons must use plasticit...

06/15/2021 · Credit Assignment in Neural Networks through Deep Feedback Control
The success of deep learning sparked interest in whether the brain learn...

06/05/2020 · Brain-inspired global-local hybrid learning towards human-like intelligence
The combination of neuroscience-oriented and computer-science-oriented a...
