Towards Zero Memory Footprint Spiking Neural Network Training

08/16/2023
by Bin Lei, et al.

Biologically inspired Spiking Neural Networks (SNNs), which process information through discrete-time events known as spikes rather than continuous values, have garnered significant attention for their hardware-friendly, energy-efficient characteristics. However, training SNNs requires a considerably large memory footprint, owing to the additional storage needed for spikes or events on top of the network's complex structure and dynamics. In this paper, to address the memory constraints of SNN training, we introduce a framework with a remarkably low memory footprint. We (i) design a reversible SNN node that retains a high level of accuracy; our design achieves a 58.65× reduction in memory usage compared to the current SNN node. We (ii) propose an algorithm that streamlines the backpropagation of our reversible SNN node, significantly trimming the backward floating-point operations (FLOPs) and thereby accelerating training relative to the current reversible-layer backpropagation method. Using our algorithm, training time is reduced by 23.8% compared to existing reversible-layer architectures.
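The abstract does not detail the node's mechanism, but the memory saving of any reversible layer comes from one property: inputs can be reconstructed exactly from outputs, so intermediate activations need not be cached for backpropagation. Below is a minimal PyTorch sketch of that general idea (a RevNet-style additive coupling wrapped around surrogate-gradient spiking functions). It is an illustration under stated assumptions, not the paper's actual reversible SNN node; the names `SurrogateSpike` and `ReversibleBlock`, the linear sub-functions, and the surrogate window are all hypothetical.

```python
# Minimal sketch of a reversible (RevNet-style) block with spiking nonlinearities.
# Illustrative only: not the paper's node design. All names and constants here
# (SurrogateSpike, ReversibleBlock, the 0.5 surrogate window) are assumptions.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate gradient backward."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()  # emit a spike where the potential exceeds threshold 0

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Let gradients through only near the firing threshold (window 0.5, assumed).
        return grad_out * (v.abs() < 0.5).float()


class ReversibleBlock(nn.Module):
    """Additive coupling: y1 = x1 + F(x2), y2 = x2 + G(y1).

    Because the mapping is exactly invertible, (x1, x2) can be rebuilt from
    (y1, y2) during the backward pass instead of being stored, which is the
    source of the activation-memory savings the abstract reports.
    """

    def __init__(self, dim):
        super().__init__()
        self.F = nn.Linear(dim, dim)
        self.G = nn.Linear(dim, dim)

    def forward(self, x1, x2):
        y1 = x1 + SurrogateSpike.apply(self.F(x2))
        y2 = x2 + SurrogateSpike.apply(self.G(y1))
        return y1, y2

    @torch.no_grad()
    def inverse(self, y1, y2):
        # Reconstruct the inputs from the outputs by running the coupling backwards.
        x2 = y2 - SurrogateSpike.apply(self.G(y1))
        x1 = y1 - SurrogateSpike.apply(self.F(x2))
        return x1, x2
```

In a memory-efficient training loop, `inverse` would be called during backpropagation to regenerate each block's inputs on demand, trading extra recomputation FLOPs for near-constant activation memory; the paper's second contribution, the streamlined backward pass, targets exactly that recomputation overhead.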


Related research

10/24/2019 · Reversible designs for extreme memory cost reduction of CNN training
Training Convolutional Neural Networks (CNN) is a resource intensive tas...

06/14/2019 · A Partially Reversible U-Net for Memory-Efficient Volumetric Image Segmentation
One of the key drawbacks of 3D convolutional neural networks for segment...

07/15/2022 · An Exact Bitwise Reversible Integrator
At a fundamental level most physical equations are time reversible. In t...

12/13/2021 · Efficient Training of Spiking Neural Networks with Temporally-Truncated Local Backpropagation through Time
Directly training spiking neural networks (SNNs) has remained challengin...

06/15/2023 · PaReprop: Fast Parallelized Reversible Backpropagation
The growing size of datasets and deep learning models has made faster an...

02/28/2023 · Towards Memory- and Time-Efficient Backpropagation for Training Spiking Neural Networks
Spiking Neural Networks (SNNs) are promising energy-efficient models for...

04/28/2022 · Schrödinger's FP: Dynamic Adaptation of Floating-Point Containers for Deep Learning Training
We introduce a software-hardware co-design approach to reduce memory tra...
