Temporal Contrastive Learning for Spiking Neural Networks

by Haonan Qiu et al.

Biologically inspired spiking neural networks (SNNs) have garnered considerable attention due to their low energy consumption and spatio-temporal information processing capabilities. Most existing SNN training methods first integrate output information across time steps and then adopt the cross-entropy (CE) loss to supervise the prediction of the averaged representations. In this work, however, we find that this approach is not ideal for SNN training: it omits the temporal dynamics of SNNs, and its performance degrades quickly as the number of inference time steps decreases. One tempting way to model temporal correlations is to apply the same label supervision at each time step and treat all time steps identically. Although this yields relatively consistent performance across various time steps, it still struggles to produce high-performance SNNs. Motivated by these observations, we propose the Temporal-domain supervised Contrastive Learning (TCL) framework, a novel method for obtaining SNNs with low latency and high performance by incorporating contrastive supervision with temporal-domain information. Contrastive learning (CL) prompts the network to discern both consistency and variability in the representation space, enabling it to learn more discriminative and generalizable features. We extend this concept to the temporal domain of SNNs, allowing us to flexibly and fully exploit the correlation between representations at different time steps. Furthermore, we propose a Siamese Temporal-domain supervised Contrastive Learning (STCL) framework that enhances SNNs via augmentation, temporal, and class constraints simultaneously. Extensive experimental results demonstrate that SNNs trained with our TCL and STCL achieve both high performance and low latency, reaching state-of-the-art results on a variety of datasets (e.g., CIFAR-10, CIFAR-100, and DVS-CIFAR10).
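The abstract does not spell out the exact form of the TCL objective, but the core idea of supervising time-step representations with a contrastive loss can be illustrated. The following is a minimal, hypothetical sketch (not the authors' implementation): given per-time-step features of shape (T, B, D) and class labels, it treats all representations sharing a label, across any pair of time steps, as positives in a standard supervised contrastive loss. The function name `temporal_supcon_loss` and the temperature `tau` are illustrative assumptions.

```python
import numpy as np

def temporal_supcon_loss(feats, labels, tau=0.1):
    """Hypothetical sketch of a temporal supervised contrastive loss.

    feats:  array of shape (T, B, D), one representation per time step
            (illustrative; the paper's exact formulation may differ).
    labels: array of shape (B,), one class label per sample.
    Representations from any time step that share a label are positives.
    """
    T, B, D = feats.shape
    z = feats.reshape(T * B, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize features
    y = np.tile(labels, T)                             # label for each anchor
    sim = z @ z.T / tau                                # temperature-scaled cosine sims
    np.fill_diagonal(sim, -1e9)                        # exclude self-comparisons
    # log-softmax over all other anchors (denominator of InfoNCE)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos_mask = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(pos_mask, 0.0)                    # self is not a positive
    # negative mean log-probability over positives, averaged over anchors
    loss = -(pos_mask * log_prob).sum(axis=1) / pos_mask.sum(axis=1)
    return loss.mean()
```

With T >= 2, every anchor has at least T - 1 positives (the same sample at other time steps), so the per-anchor average is always well defined; this is the property that lets the loss exploit cross-time-step correlations even for classes with a single sample in the batch.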


