Hardware-efficient on-line learning through pipelined truncated-error backpropagation in binary-state networks

06/15/2017
by Hesham Mostafa et al.

Artificial neural networks (ANNs) trained using backpropagation are powerful learning architectures that have achieved state-of-the-art performance in various benchmarks. Significant effort has been devoted to developing custom silicon devices to accelerate inference in ANNs; accelerating the training phase, however, has attracted relatively little attention. In this paper, we describe a hardware-efficient on-line learning technique for feedforward multi-layer ANNs that is based on pipelined backpropagation. Learning is performed in parallel with inference in the forward pass, removing the need for an explicit backward pass and requiring no extra weight lookup. By using binary state variables in the feedforward network and ternary errors in truncated-error backpropagation, the need for any multiplications in the forward and backward passes is removed, and the memory requirements of the pipelining are drastically reduced. Sparsity in the forward neural and backpropagating error signal paths further reduces the number of addition operations, contributing to a highly efficient hardware implementation. As a proof-of-concept validation, we demonstrate on-line learning of MNIST handwritten digit classification on a Spartan-6 FPGA interfacing with an external 1 Gb DDR2 DRAM; the system shows only a small degradation in test error compared to an equivalently sized binary ANN trained off-line using standard backpropagation and exact errors. Our results highlight an attractive synergy between pipelined backpropagation and binary-state networks: together they substantially reduce computation and memory requirements, making pipelined on-line learning practical in deep networks.
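To make the arithmetic concrete, below is a minimal single-example sketch of the binary-state/ternary-error idea in NumPy. It is not the paper's pipelined FPGA implementation: the pipelining itself (updating each layer concurrently with the forward pass using delayed activations) is omitted, floating-point values stand in for the hardware's fixed-point arithmetic, and the layer sizes, truncation threshold theta, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 784, 256, 10          # layer sizes (assumed)
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))   # input -> hidden weights
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))  # hidden -> output weights
lr, theta = 0.01, 0.05                     # learning rate and truncation threshold (assumed)

def binarize(v):
    # Hard-threshold activation: binary neuron states in {0, 1}.
    return (v > 0).astype(float)

def ternarize(e, thresh):
    # Truncated error: keep only the sign of components outside a
    # dead zone, giving ternary values in {-1, 0, +1}.
    return np.sign(e) * (np.abs(e) > thresh)

x = (rng.random(n_in) > 0.5).astype(float)  # a binary input pattern
t = np.eye(n_out)[3]                        # one-hot target

# Forward pass: with binary x, W1 @ x reduces to summing the weight
# columns where x == 1 -- additions only, and sparsity skips terms.
h = binarize(W1 @ x)
y = W2 @ h

# Ternary output error: the datapath needs no multipliers from here on.
e_out = ternarize(y - t, theta)

# Backpropagated error: with ternary e_out, W2.T @ e_out is just signed
# sums of weight rows; re-truncating keeps the error ternary per layer.
e_hid = ternarize(W2.T @ e_out, theta)

# Updates are outer products of ternary errors and binary activations,
# so each weight either stays put or changes by exactly +-lr.
W2 -= lr * np.outer(e_out, h)
W1 -= lr * np.outer(e_hid, x)
```

Because every factor in the update is in {-1, 0, +1} or {0, 1}, the outer products degenerate in hardware to incrementing or decrementing selected weights by lr, which is why the scheme needs no multipliers in either pass.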

Related research

04/04/2022 | Forward Signal Propagation Learning
We propose a new learning algorithm for propagating a learning signal an...

11/17/2017 | Deep supervised learning using local errors
Error backpropagation is a highly effective mechanism for learning high-...

09/03/2019 | Learning without feedback: Direct random target projection as a feedback-alignment algorithm with layerwise feedforward training
While the backpropagation of error algorithm allowed for a rapid rise in...

07/20/2022 | A Temporally and Spatially Local Spike-based Backpropagation Algorithm to Enable Training in Hardware
Spiking Neural Networks (SNNs) have emerged as a hardware efficient arch...

03/25/2021 | Enabling Incremental Training with Forward Pass for Edge Devices
Deep Neural Networks (DNNs) are commonly deployed on end devices that ex...

03/31/2022 | Stochastic Backpropagation: A Memory Efficient Strategy for Training Video Models
We propose a memory efficient method, named Stochastic Backpropagation (...

01/22/2016 | Bitwise Neural Networks
Based on the assumption that there exists a neural network that efficien...
