RadiX-Net: Structured Sparse Matrices for Deep Neural Networks

04/30/2019
by Ryan A. Robinett, et al.

The sizes of deep neural networks (DNNs) are rapidly outgrowing the capacity of hardware to store and train them. Research over the past few decades has explored the prospect of sparsifying DNNs before, during, and after training by pruning edges from the underlying topology. The resulting neural network is known as a sparse neural network. More recent work has demonstrated the remarkable result that certain sparse DNNs can train to the same precision as dense DNNs at lower runtime and storage cost. An intriguing class of these sparse DNNs is the X-Nets, which are initialized and trained on a sparse topology with neither reference to a parent dense DNN nor subsequent pruning. We present an algorithm that deterministically generates RadiX-Nets: sparse DNN topologies that, as a whole, are much more diverse than X-Net topologies while preserving X-Nets' desired characteristics. We further present a functional-analytic conjecture based on the longstanding observation that sparse neural network topologies can attain the same expressive power as dense counterparts.
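
The abstract names a deterministic generation algorithm but does not spell it out. As a rough illustration of how a radix-based sparse topology might be generated, here is a minimal Python sketch assuming a mixed-radix, butterfly-style connection rule: given per-layer radices (r_1, ..., r_L) and N = r_1 * ... * r_L neurons per layer, layer l connects neurons i and j exactly when their mixed-radix digit expansions agree in every position except digit l. The function name radix_butterfly_masks and this specific rule are assumptions made for illustration, not the paper's definitive construction.

    import numpy as np

    def radix_butterfly_masks(radices):
        """Build one boolean connectivity mask per layer for a mixed-radix
        butterfly topology on N = prod(radices) neurons per layer.
        Illustrative sketch; the connection rule is an assumption."""
        N = int(np.prod(radices))
        L = len(radices)
        # Mixed-radix digit expansion of every neuron index
        # (least significant digit first).
        digits = np.zeros((N, L), dtype=int)
        for i in range(N):
            x = i
            for pos, r in enumerate(radices):
                digits[i, pos] = x % r
                x //= r
        masks = []
        for l in range(L):
            mask = np.zeros((N, N), dtype=bool)
            for i in range(N):
                for j in range(N):
                    # Connect i -> j iff every digit except digit l agrees.
                    mask[i, j] = all(digits[i, p] == digits[j, p]
                                     for p in range(L) if p != l)
            masks.append(mask)
        return masks

    masks = radix_butterfly_masks([2, 2, 2])  # N = 8 neurons per layer
    for l, m in enumerate(masks):
        print("layer", l, ":", int(m.sum()), "of", m.size, "possible edges kept")
    # Each layer keeps N * r_l = 16 of 64 possible edges.

Under this rule, each layer stores only N * r_l edges instead of N^2, yet composing all L layers connects every input neuron to every output neuron. That path-connectedness without reference to a parent dense network is the kind of property the abstract attributes to X-Nets and RadiX-Nets.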
