Optimizing the Consumption of Spiking Neural Networks with Activity Regularization

04/04/2022
by Simon Narduzzi, et al.

Reducing energy consumption is a critical concern for neural network models running on edge devices. In particular, lowering the number of multiply-accumulate (MAC) operations that Deep Neural Networks (DNNs) perform on edge hardware accelerators directly lowers the energy consumed during inference. Spiking Neural Networks (SNNs) are a bio-inspired alternative that can save further energy by using binary activations and by consuming no energy when neurons do not spike. DNN-to-SNN conversion frameworks can configure such networks to match the accuracy of their DNN counterparts on a task, but because the conversion relies on rate coding, the number of synaptic operations can remain high. In this work, we investigate different techniques for enforcing sparsity on neural network activation maps and compare the effect of different training regularizers on the efficiency of the resulting optimized DNNs and SNNs.
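As a rough illustration of activity regularization, the sketch below attaches an L1 penalty to the activations of a small Keras convolutional network, which pushes activation maps toward sparsity during training. The architecture, penalty weight, and input shape are illustrative assumptions, not the configuration used by the authors.

```python
import tensorflow as tf

# Illustrative sketch (not the authors' exact setup): an L1 penalty on
# layer activations encourages sparse activation maps, which in turn
# reduces synaptic operations after DNN-to-SNN conversion.
# The penalty weight 1e-4 and the architecture below are assumptions.
l1_activity = tf.keras.regularizers.l1(1e-4)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu",
                           activity_regularizer=l1_activity,
                           input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu",
                           activity_regularizer=l1_activity),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# The activity penalties are added to the training loss automatically,
# so the model is trained as usual.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Tuning the penalty weight trades off task accuracy against activation sparsity; stronger penalties yield sparser maps (and fewer spikes after conversion) at some cost in accuracy.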


