Quantization in Spiking Neural Networks

05/13/2023
by Bernhard A. Moser, et al.

In spiking neural networks (SNNs), each node converts an incoming sequence of weighted Dirac pulses into an outgoing sequence of weighted Dirac pulses via a leaky integrate-and-fire (LIF) neuron model based on spike aggregation and thresholding. We show that this mapping can be understood as a quantization operator and state a corresponding formula for the quantization error in terms of the Alexiewicz norm. This analysis has implications for rethinking re-initialization in the LIF model and leads to the proposal of 'reset-to-mod', a modulo-based reset variant.
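
As a concrete illustration of this quantization view, the following is a minimal sketch, assuming a discrete-time, event-aligned LIF update in which every output spike weight is an integer multiple of the threshold. The discrete Alexiewicz-style norm used here (the maximum absolute partial sum of spike weights), the leak parameter, the threshold value, and all function names are illustrative assumptions rather than the paper's definitions; the sketch only mirrors the reset variants described above.

```python
"""
Minimal sketch (not the authors' reference implementation): a discrete-time
LIF neuron viewed as a quantization operator on weighted spike trains,
with the proposed 'reset-to-mod' re-initialization compared against
reset-by-subtraction and reset-to-zero.
"""
import numpy as np


def alexiewicz_norm(weights):
    """Discrete Alexiewicz-style norm: maximum absolute partial sum of spike weights."""
    return float(np.max(np.abs(np.cumsum(weights)))) if len(weights) else 0.0


def lif_quantize(weights, theta=1.0, leak=1.0, reset="to-mod"):
    """Map an input spike-weight sequence to an output spike-weight sequence.

    Every output weight is an integer multiple of the threshold theta, so the
    neuron acts as a quantizer on the (leaky) running sum of its input.
    """
    u = 0.0                                   # membrane potential
    out = []
    for a in weights:
        u = leak * u + a                      # aggregate the incoming weighted pulse
        if reset == "to-mod":
            k = np.trunc(u / theta)           # number of threshold crossings (signed)
            out.append(k * theta)             # spike carries the full multiple of theta
            u -= k * theta                    # modulo-style re-initialization
        elif reset == "by-subtraction":
            if abs(u) >= theta:
                s = np.sign(u) * theta
                out.append(s)                 # a single spike of weight +/- theta
                u -= s
            else:
                out.append(0.0)
        else:                                 # reset-to-zero
            if abs(u) >= theta:
                out.append(np.sign(u) * theta)
                u = 0.0
            else:
                out.append(0.0)
    return np.array(out)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eta = rng.normal(0.0, 1.0, size=200)      # input spike weights
    for mode in ("to-mod", "by-subtraction", "to-zero"):
        q = lif_quantize(eta, theta=1.0, leak=1.0, reset=mode)
        err = alexiewicz_norm(eta - q)        # quantization error in the norm above
        print(f"reset-{mode:15s} ||eta - LIF(eta)||_A = {err:.3f}")
```

In the non-leaky case (leak = 1.0), the residual membrane potential after each 'reset-to-mod' step stays strictly below the threshold, so the norm of the quantization error is bounded by the threshold in this toy setting, whereas reset-by-subtraction and reset-to-zero can accumulate larger deviations.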

Related research

05/09/2023
Spiking Neural Networks in the Alexiewicz Topology: A New Perspective on Analysis and Error Bounds
In order to ease the analysis of error propagation in neuromorphic compu...

10/29/2017
Training Probabilistic Spiking Neural Networks with First-to-spike Decoding
Third-generation neural networks, or Spiking Neural Networks (SNNs), aim...

08/13/2023
RMP-Loss: Regularizing Membrane Potential Distribution for Spiking Neural Networks
Spiking Neural Networks (SNNs) as one of the biology-inspired models hav...

05/31/2022
Linear Leaky-Integrate-and-Fire Neuron Model Based Spiking Neural Networks and Its Mapping Relationship to Deep Neural Networks
Spiking neural networks (SNNs) are brain-inspired machine learning algor...

07/05/2021
Q-SpiNN: A Framework for Quantizing Spiking Neural Networks
A prominent technique for reducing the memory footprint of Spiking Neura...

07/10/2023
InfLoR-SNN: Reducing Information Loss for Spiking Neural Networks
The Spiking Neural Network (SNN) has attracted more and more attention r...

06/22/2021
Backpropagated Neighborhood Aggregation for Accurate Training of Spiking Neural Networks
While backpropagation (BP) has been applied to spiking neural networks (...
