Probabilistic Modeling: Proving the Lottery Ticket Hypothesis in Spiking Neural Network

05/20/2023
by Man Yao, et al.

The Lottery Ticket Hypothesis (LTH) states that a randomly initialized large neural network contains a small sub-network (a "winning ticket") which, when trained in isolation, can achieve performance comparable to that of the large network. LTH opens up a new path for network pruning. Existing proofs of LTH in Artificial Neural Networks (ANNs) rely on continuous activation functions, such as ReLU, which satisfy the Lipschitz condition. These theoretical methods are not applicable to Spiking Neural Networks (SNNs), however, because the spiking function is discontinuous. We argue that the scope of LTH can be extended by eliminating the Lipschitz condition. Specifically, we propose a novel probabilistic modeling approach for spiking neurons with complicated spatio-temporal dynamics. We then prove, both theoretically and experimentally, that LTH holds in SNNs. Our theorem implies that pruning existing SNNs directly by weight magnitude is clearly suboptimal. Building on this theory, we design a new pruning criterion that achieves better results than the baseline.
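For context, the magnitude-based baseline that the abstract argues is suboptimal for SNNs can be sketched as follows. This is a generic illustration only; the function name `magnitude_prune` and the dense-array setup are assumptions for the example, and the paper's probabilistic criterion is not reproduced here.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights.

    This is the standard magnitude-pruning baseline, not the
    probabilistic criterion proposed in the paper.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, sparsity=0.5)  # half the weights set to zero
```

Magnitude pruning ranks weights only by their current size; the paper's argument is that for spiking neurons, whose output depends on discontinuous spatio-temporal dynamics, this ranking does not reflect a weight's true contribution.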


Related research

- Data-dependent Pruning to find the Winning Lottery Ticket (06/25/2020)
- Spiking Neural Networks – Part II: Detecting Spatio-Temporal Patterns (10/27/2020)
- Lottery Ticket Hypothesis for Spiking Neural Networks (07/04/2022)
- Logarithmic Pruning is All You Need (06/22/2020)
- Pruning of Deep Spiking Neural Networks through Gradient Rewiring (05/11/2021)
- Workload-Balanced Pruning for Sparse Spiking Neural Networks (02/13/2023)
- Artificial to Spiking Neural Networks Conversion for Scientific Machine Learning (08/31/2023)
