Topological Insights in Sparse Neural Networks

06/24/2020
by   Shiwei Liu, et al.

Sparse neural networks are an effective approach to reducing the resource requirements of deploying deep neural networks. Recently, the concept of adaptive sparse connectivity has emerged, allowing sparse neural networks to be trained from scratch by optimizing the sparse structure during training. However, comparing different sparse topologies and determining how sparse topologies evolve during training, especially when the sparse structure itself is being optimized, remain challenging open questions. This comparison becomes increasingly complex because the number of possible topological comparisons grows exponentially with network size. In this work, we introduce an approach to understanding and comparing sparse neural network topologies from the perspective of graph theory. We first propose the Neural Network Sparse Topology Distance (NNSTD) to measure the distance between different sparse neural networks. We then demonstrate that sparse neural networks can outperform over-parameterized models, even without any further structure optimization. To this end, by quantifying and comparing their topological evolutionary processes, we also show that adaptive sparse connectivity consistently unveils a plenitude of sparse sub-networks with very different topologies that outperform the dense model. The latter finding complements the Lottery Ticket Hypothesis by showing that there is a much more efficient and robust way to find "winning tickets". Altogether, our results start enabling a better theoretical understanding of sparse neural networks, and demonstrate the utility of using graph theory to analyze them.
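To make the idea of a topology distance concrete, here is a minimal toy sketch: each sparse layer is represented by a binary connectivity mask, and two layers are compared by the fraction of connection positions where they disagree (a normalized Hamming distance). Note that this is an illustrative assumption, not the paper's NNSTD definition, which must additionally account for the fact that hidden neurons can be permuted without changing the topology.

```python
import numpy as np

def mask_distance(mask_a, mask_b):
    """Toy topology distance between two sparse layers.

    Each mask is a binary matrix (inputs x outputs) whose nonzero
    entries mark existing connections. Returns the fraction of
    positions where the two topologies disagree, a value in [0, 1].

    CAVEAT: this illustration compares masks position-by-position and
    ignores neuron permutations; a measure like NNSTD must match
    equivalent neurons before comparing.
    """
    mask_a = np.asarray(mask_a, dtype=bool)
    mask_b = np.asarray(mask_b, dtype=bool)
    if mask_a.shape != mask_b.shape:
        raise ValueError("layers must have the same shape")
    return float(np.mean(mask_a != mask_b))

# Two 2x2 sparse layers differing in a single connection.
a = np.array([[1, 0], [0, 1]])
b = np.array([[1, 0], [1, 1]])
print(mask_distance(a, a))  # 0.0  (identical topologies)
print(mask_distance(a, b))  # 0.25 (1 of 4 positions differs)
```

A distance of 0 means identical connectivity patterns; values near 1 mean the two sub-networks share almost no connections, which is the regime the paper reports for sub-networks found by adaptive sparse connectivity.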


Related research

09/30/2018
Pruned and Structurally Sparse Neural Networks
Advances in designing and training deep neural networks have led to the ...

03/17/2019
Evolving and Understanding Sparse Deep Neural Networks using Cosine Similarity
Training sparse neural networks with adaptive connectivity is an active ...

08/19/2020
Learning Connectivity of Neural Networks from a Topological Perspective
Seeking effective neural networks is a critical and practical field in d...

06/27/2019
On improving deep learning generalization with adaptive sparse connectivity
Large neural networks are very successful in various tasks. However, wit...

02/19/2020
NeuroFabric: Identifying Ideal Topologies for Training A Priori Sparse Networks
Long training times of deep neural networks are a bottleneck in machine ...

03/14/2023
Vision-based route following by an embodied insect-inspired sparse neural network
We compared the efficiency of the FlyHash model, an insect-inspired spar...