FastHebb: Scaling Hebbian Training of Deep Neural Networks to ImageNet Level

07/07/2022
by Gabriele Lagani, et al.

Learning algorithms for Deep Neural Networks are typically based on supervised end-to-end training with Stochastic Gradient Descent (SGD) and error backpropagation (backprop). Backprop algorithms require a large number of labelled training samples to achieve high performance. However, in many realistic applications, even when image samples are plentiful, very few of them are labelled, so semi-supervised, sample-efficient training strategies must be used. Hebbian learning is a possible approach towards sample-efficient training; however, current solutions do not scale well to large datasets. In this paper, we present FastHebb, an efficient and scalable solution for Hebbian learning that achieves higher efficiency by 1) merging update computation and aggregation over a batch of inputs, and 2) leveraging efficient matrix multiplication algorithms on GPU. We validate our approach on several computer vision benchmarks in a semi-supervised learning scenario. FastHebb outperforms previous solutions by up to 50 times in terms of training speed, and, notably, for the first time we are able to bring Hebbian algorithms to ImageNet scale.
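The core efficiency idea can be illustrated with a minimal sketch. A naive Hebbian implementation computes one outer-product update per sample and then sums them; merging the two steps collapses the whole batch into a single matrix multiplication, which maps directly onto GPU GEMM kernels. The sketch below uses a plain (unnormalized) Hebbian rule for clarity; the actual FastHebb rule and function names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fasthebb_update(W, X, lr=0.01):
    """Batched Hebbian update sketch (assumed interface, plain Hebb rule).

    W: (out_dim, in_dim) weight matrix
    X: (batch, in_dim) batch of inputs
    """
    Y = X @ W.T  # forward pass for the whole batch: (batch, out_dim)
    # Per-sample Hebbian updates are outer products y_i x_i^T; summing
    # them over the batch is exactly the single matrix product Y^T X,
    # so update computation and aggregation happen in one GEMM call.
    dW = (Y.T @ X) / X.shape[0]
    return W + lr * dW

# Usage: one fused update over a batch of 8 inputs
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
X = rng.standard_normal((8, 3))
W_new = fasthebb_update(W, X, lr=0.1)
```

On GPU, the same fused formulation would be expressed with a framework's batched matmul (e.g. `torch.matmul`), which is where the reported speedups come from.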


Related research

04/08/2020
Empirical Perspectives on One-Shot Semi-supervised Learning
One of the greatest obstacles in the adoption of deep neural networks fo...

05/28/2019
Local Label Propagation for Large-Scale Semi-Supervised Learning
A significant issue in training deep neural networks to solve supervised...

03/16/2021
Hebbian Semi-Supervised Learning in a Sample Efficiency Setting
We propose to address the issue of sample efficiency, in Deep Convolutio...

10/03/2016
Semi-supervised Learning with Sparse Autoencoders in Phone Classification
We propose the application of a semi-supervised learning method to impro...

09/12/2022
SELTO: Sample-Efficient Learned Topology Optimization
We present a sample-efficient deep learning strategy for topology optimi...

10/19/2022
An Optimization-Based Supervised Learning Algorithm for PXRD Phase Fraction Estimation
In powder diffraction data analysis, phase identification is the process...

07/26/2019
Scalable Semi-Supervised SVM via Triply Stochastic Gradients
Semi-supervised learning (SSL) plays an increasingly important role in t...
