MINT: Multiplier-less Integer Quantization for Spiking Neural Networks

05/16/2023
by   Ruokai Yin, et al.

We propose Multiplier-less INTeger (MINT) quantization, an efficient uniform quantization scheme for the weights and membrane potentials in spiking neural networks (SNNs). Unlike prior SNN quantization works, MINT quantizes the memory-hungry membrane potentials to extremely low bit-width (2-bit), significantly reducing the total memory footprint. Additionally, MINT shares the quantization scale between the weights and membrane potentials, eliminating the multipliers and floating-point arithmetic units required by standard uniform quantization. Experimental results demonstrate that our proposed method matches the accuracy of other state-of-the-art SNN quantization works while outperforming them in total memory footprint and hardware cost at deployment time. For instance, 2-bit MINT VGG-16 achieves 48.6% accuracy (on par with the full-precision baseline) with approximately a 93.8% reduction in total memory footprint from the full-precision model; meanwhile, our model reduces area by 93% at deployment.
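The key idea above (a single quantization scale shared by weights and membrane potentials, so the LIF update needs no multipliers) can be sketched as follows. This is a minimal illustration, not the paper's implementation: all names, the 2-bit symmetric quantizer, and the hard-reset LIF dynamics are assumptions made for the example. Because input spikes are binary, the weight accumulation `W @ spikes` is a sum of selected integers, and the firing threshold can be folded into the shared integer domain once, offline.

```python
import numpy as np

def quantize(x, bits, scale):
    """Uniform symmetric quantization to signed integers of the given width."""
    qmax = 2 ** (bits - 1) - 1
    return np.clip(np.round(x / scale), -qmax, qmax).astype(np.int32)

rng = np.random.default_rng(0)
w_fp = rng.normal(0.0, 0.5, size=(4, 8))        # full-precision weights (illustrative)
s = np.abs(w_fp).max() / (2 ** (2 - 1) - 1)     # one shared scale for weights AND potentials

w_q = quantize(w_fp, bits=2, scale=s)           # 2-bit integer weights, values in {-1, 0, 1}
u_q = np.zeros(4, dtype=np.int32)               # integer membrane potentials (MINT also
                                                # quantizes these; clipping omitted here)
theta_q = int(round(1.0 / s))                   # threshold pre-scaled into the integer domain

spikes_in = rng.integers(0, 2, size=8)          # binary input spikes
# Multiplier-less update: spikes are 0/1, so this matmul is pure integer addition.
u_q = u_q + w_q @ spikes_in
spikes_out = (u_q >= theta_q).astype(np.int32)  # fire where potential crosses threshold
u_q = np.where(spikes_out == 1, 0, u_q)         # hard reset on firing
print(spikes_out)
```

Because the scale `s` cancels on both sides of the threshold comparison, no dequantization (and hence no multiplier) is needed anywhere in the inference loop; a vanilla uniform quantizer with separate scales would instead need a rescaling multiply on every membrane-potential update.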

