AC/DC: Alternating Compressed/DeCompressed Training of Deep Neural Networks

06/23/2021
by Alexandra Peşte, et al.

The increasing computational requirements of deep neural networks (DNNs) have led to significant interest in obtaining DNN models that are sparse, yet accurate. Recent work has investigated the even harder case of sparse training, in which the DNN weights are kept sparse for as much of training as possible, to reduce computational costs. Existing sparse training methods are mainly empirical and often have lower accuracy relative to the dense baseline. In this paper, we present a general approach called Alternating Compressed/DeCompressed (AC/DC) training of DNNs, demonstrate convergence for a variant of the algorithm, and show that AC/DC outperforms existing sparse training methods in accuracy at similar computational budgets; at high sparsity levels, AC/DC even outperforms existing methods that rely on accurate pre-trained dense models. An important property of AC/DC is that it allows co-training of dense and sparse models, yielding accurate sparse-dense model pairs at the end of the training process. This is useful in practice, where compressed variants may be desirable for deployment in resource-constrained settings without re-doing the entire training flow, and it also provides insights into the accuracy gap between dense and compressed models.
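
The alternating schedule at the core of AC/DC can be illustrated in a few lines of PyTorch. Below is a minimal sketch, assuming magnitude-based top-k pruning at the start of each compressed phase and a fixed phase length; the toy model, data, sparsity target, and phase schedule are illustrative stand-ins, not the paper's exact recipe.

```python
import torch
import torch.nn as nn

def magnitude_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Binary mask keeping the largest-magnitude (1 - sparsity) fraction of entries."""
    k = max(1, int(weight.numel() * (1.0 - sparsity)))
    threshold = weight.abs().flatten().topk(k).values.min()
    return (weight.abs() >= threshold).to(weight.dtype)

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

sparsity = 0.9                      # target weight sparsity (assumed)
phase_steps = 5                     # steps per phase (assumed; tuned in practice)
schedule = ["dense", "sparse"] * 3  # alternate phases, ending compressed

x = torch.randn(256, 32)            # toy stand-ins for real training data
y = torch.randint(0, 10, (256,))

for phase in schedule:
    masks = {}
    if phase == "sparse":
        # Compressed phase: project weights onto their top-k support.
        for name, p in model.named_parameters():
            if p.dim() > 1:         # prune weight matrices, leave biases dense
                masks[name] = magnitude_mask(p.data, sparsity)
                p.data.mul_(masks[name])
    for _ in range(phase_steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
        # Keep pruned coordinates at zero throughout the compressed phase;
        # decompressed phases train all weights freely.
        for name, p in model.named_parameters():
            if name in masks:
                p.data.mul_(masks[name])

print("final loss:", loss_fn(model(x), y).item())
```

The loop above ends in a compressed phase, yielding the sparse model; per the abstract's co-training property, a short additional decompressed phase would recover the dense counterpart of the pair.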

Related research

- Hierarchical Block Sparse Neural Networks (08/10/2018): Sparse deep neural networks (DNNs) are efficient in both memory and compu...
- Neural Network Topologies for Sparse Training (09/14/2018): The sizes of deep neural networks (DNNs) are rapidly outgrowing the capa...
- RadiX-Net: Structured Sparse Matrices for Deep Neural Networks (04/30/2019): The sizes of deep neural networks (DNNs) are rapidly outgrowing the capa...
- Selfish Sparse RNN Training (01/22/2021): Sparse neural networks have been widely applied to reduce the necessary ...
- DNNShifter: An Efficient DNN Pruning System for Edge Computing (09/13/2023): Deep neural networks (DNNs) underpin many machine learning applications....
- Layer Freezing & Data Sieving: Missing Pieces of a Generic Framework for Sparse Training (09/22/2022): Recently, sparse training has emerged as a promising paradigm for effici...
- X-DC: Explainable Deep Clustering based on Learnable Spectrogram Templates (09/18/2020): Deep neural networks (DNNs) have achieved substantial predictive perform...
