Growing Efficient Deep Networks by Structured Continuous Sparsification

07/30/2020
by Xin Yuan, et al.

We develop an approach to training deep networks while dynamically adjusting their architecture, driven by a principled combination of accuracy and sparsity objectives. Unlike conventional pruning approaches, our method applies a gradual continuous relaxation to the discrete network structure optimization and then samples sparse subnetworks, enabling efficient deep networks to be trained by jointly growing and pruning their structure during optimization. Extensive experiments across CIFAR-10, ImageNet, PASCAL VOC, and Penn Treebank, with convolutional models for image classification and semantic segmentation, and recurrent models for language modeling, show that our training scheme yields efficient networks that are smaller and more accurate than those produced by competing pruning methods.
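For concreteness, below is a minimal PyTorch sketch of one way a continuous relaxation of discrete channel selection can be set up: each channel gets a soft gate sigmoid(beta * alpha) whose temperature beta is annealed upward during training, so the gate approaches a hard 0/1 mask that can be discretized at the end. The module name `ChannelGate`, the exponential temperature schedule, and the sparsity surrogate are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    """Per-channel soft gate g = sigmoid(beta * alpha).

    As the temperature beta grows over training, the soft gate
    approaches a hard 0/1 indicator, so the final network can be
    discretized by keeping only channels with alpha > 0.
    """
    def __init__(self, num_channels: int):
        super().__init__()
        # Learnable gate logits; positive alpha -> channel tends to stay on.
        self.alpha = nn.Parameter(torch.zeros(num_channels))
        # Temperature, annealed externally over the course of training.
        self.register_buffer("beta", torch.tensor(1.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(self.beta * self.alpha)  # soft mask in (0, 1)
        return x * gate.view(1, -1, 1, 1)             # broadcast over NCHW

    def sparsity_penalty(self) -> torch.Tensor:
        # Differentiable surrogate for the number of active channels;
        # added to the task loss to drive the accuracy/sparsity trade-off.
        return torch.sigmoid(self.beta * self.alpha).sum()

def anneal(gate: ChannelGate, step: int, total_steps: int,
           beta_max: float = 100.0) -> None:
    # Exponential schedule from 1 to beta_max (an assumed schedule);
    # larger beta sharpens the sigmoid toward a discrete mask.
    gate.beta.fill_(beta_max ** (step / total_steps))
```

In such a scheme, channels whose logits end up negative are removed after training, while gradients flowing through the soft gates during training allow pruned channels to be revived, which is what permits both growing and pruning.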

Related research:

- Differentiable Channel Pruning Search (10/28/2020): In this paper, we propose the differentiable channel pruning search (DCP...
- A Generalization of Continuous Relaxation in Structured Pruning (08/28/2023): Deep learning harnesses massive parallel floating-point processing to tr...
- C2S2: Cost-aware Channel Sparse Selection for Progressive Network Pruning (04/06/2019): This paper describes a channel-selection approach for simplifying deep n...
- DSA: More Efficient Budgeted Pruning via Differentiable Sparsity Allocation (04/05/2020): Budgeted pruning is the problem of pruning under resource constraints. I...
- Max-Affine Spline Insights Into Deep Network Pruning (01/07/2021): In this paper, we study the importance of pruning in Deep Networks (DNs)...
- Magnitude Attention-based Dynamic Pruning (06/08/2023): Existing pruning methods utilize the importance of each weight based on ...
- Deep Rewiring: Training very sparse deep networks (11/14/2017): Neuromorphic hardware tends to pose limits on the connectivity of deep n...
