Learning on tree architectures outperforms a convolutional feedforward network

11/21/2022
by Yuval Meir, et al.

Advanced deep learning architectures consist of tens of fully connected and convolutional hidden layers, already extended to hundreds, and are far from any biological realization. Their biologically implausible dynamics stem from the backpropagation technique, which changes a weight in a non-local manner: the number of routes between an output unit and a given weight is typically large. Here, offline and online learning of the CIFAR-10 database on 3-layer tree architectures, inspired by experimentally observed dendritic tree adaptations, outperforms the achievable success rates of the 5-layer convolutional LeNet. The corresponding highly pruned tree backpropagation procedure, in which a single route connects an output unit and a weight, represents an efficient form of dendritic deep learning.
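To make the single-route idea concrete, here is a minimal NumPy sketch (not the authors' code; all names, shapes, and the toy regression task are assumptions) of a 3-layer tree network in which the input is split into disjoint branches, so each weight influences the output through exactly one route and its gradient flows along a single path:

```python
import numpy as np

# Hypothetical sketch of single-route tree backpropagation.
# The input is partitioned into disjoint branches; each hidden unit sees
# only its own branch, so every weight in W1 reaches the output through
# exactly one route (its branch's hidden unit).

rng = np.random.default_rng(0)

n_branches, branch_size = 4, 8                       # assumed toy sizes
W1 = rng.normal(0, 0.5, (n_branches, branch_size))   # one weight row per branch
w2 = rng.normal(0, 0.5, n_branches)                  # branch -> single output

def forward(x):
    # x has shape (n_branches, branch_size): a tree, not a full mesh.
    h = np.tanh((W1 * x).sum(axis=1))   # one hidden activation per branch
    y = w2 @ h                          # scalar output
    return h, y

def backward(x, h, y, target, lr=0.1):
    # Squared-error loss; W1[i, j] affects y only via h[i], so its
    # gradient is computed along a single output-to-weight route.
    global W1, w2
    dy = y - target
    w2 -= lr * dy * h
    dh = dy * w2 * (1.0 - h**2)         # through each branch's tanh
    W1 -= lr * dh[:, None] * x

# Fit one toy sample to a target to show the updates work.
x = rng.normal(size=(n_branches, branch_size))
for _ in range(500):
    h, y = forward(x)
    backward(x, h, y, target=1.0)
_, y = forward(x)
```

In a fully connected layer, by contrast, `dh` would mix every hidden unit's contribution to every weight; here the pruned tree structure keeps each weight update local to its branch.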


