Two-Step Spike Encoding Scheme and Architecture for Highly Sparse Spiking-Neural-Network
This paper proposes a two-step spike encoding scheme, consisting of source encoding and process encoding, for highly energy-efficient spiking neural network (SNN) acceleration. Eigen-train generation and its superposition produce spike trains that achieve high accuracy with a low spike ratio. Sparsity boosting (SB) and spike generation skipping (SGS) reduce the number of operations required by the SNN. Time-shrinking multi-level encoding (TS-MLE) compresses the number of spikes in a train along the time axis, and spike-level clock skipping (SLCS) shortens the processing time. Eigen-train generation achieves a spike ratio of 90.3% for CIFAR-10 classification. SB reduces the spike ratio by 0.49x with only 0.1% accuracy loss. TS-MLE and SLCS increase the throughput of the SNN by 2.8x while reducing the hardware resources of the spike generator by 75% compared with previous generators.
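As a rough illustration of the time-shrinking idea, the sketch below assumes TS-MLE collapses groups of consecutive binary timesteps into single multi-level symbols, shortening the spike train along the time axis. The function name ts_mle, the shrink_factor parameter, and the grouping-by-summation rule are assumptions for illustration, not the paper's exact definition.

```python
import numpy as np

def ts_mle(spike_train, shrink_factor=4):
    """Illustrative time-shrinking multi-level encoding (assumed behaviour):
    every `shrink_factor` consecutive binary timesteps are collapsed into one
    multi-level value, shrinking the train along the time axis."""
    T = len(spike_train)
    pad = (-T) % shrink_factor                 # pad so T divides evenly
    padded = np.pad(spike_train, (0, pad))
    # Each multi-level symbol counts the spikes inside its time window.
    return padded.reshape(-1, shrink_factor).sum(axis=1)

# Example: a 12-step binary train becomes a 3-symbol multi-level train.
binary_train = np.array([1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 0, 1])
print(ts_mle(binary_train, shrink_factor=4))   # -> [3 1 3]
```

Under this assumption, fewer symbols need to be processed per train, which is consistent with the throughput gain attributed to TS-MLE together with SLCS.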