Weight Pruning via Adaptive Sparsity Loss

06/04/2020
by George Retsinas, et al.

Pruning neural networks has regained interest in recent years as a means to compress state-of-the-art deep neural networks and enable their deployment on resource-constrained devices. In this paper, we propose a robust compressive learning framework that efficiently prunes network parameters during training with minimal computational overhead. We incorporate fast mechanisms to prune individual layers and build upon these to automatically prune the entire network under a user-defined budget constraint. Key to our end-to-end network pruning approach is the formulation of an intuitive and easy-to-implement adaptive sparsity loss that is used to explicitly control sparsity during training, enabling efficient budget-aware optimization. Extensive experiments demonstrate the effectiveness of the proposed framework for image classification on the CIFAR and ImageNet datasets using different architectures, including AlexNet, ResNets and Wide ResNets.
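
To make the budget-aware idea concrete, below is a minimal PyTorch sketch of a sparsity penalty in the spirit of the abstract. It is an illustration, not the authors' implementation: the sigmoid relaxation of the hard pruning indicator, the helper names soft_sparsity and adaptive_sparsity_loss, and the threshold, temperature, and target_sparsity parameters are all assumptions introduced for this example.

    import torch
    import torch.nn as nn

    def soft_sparsity(weight, threshold=1e-2, temperature=1e-3):
        # Differentiable estimate of the fraction of weights whose magnitude
        # falls below `threshold`, i.e., the fraction that would be pruned.
        # The sigmoid relaxes the hard indicator |w| < threshold so that
        # gradients can flow back into the weights. (Illustrative choice,
        # not necessarily the estimator used in the paper.)
        return torch.sigmoid((threshold - weight.abs()) / temperature).mean()

    def adaptive_sparsity_loss(model, target_sparsity, threshold=1e-2):
        # Quadratic penalty on the gap between each prunable layer's soft
        # sparsity and the user-defined budget, averaged over layers.
        gaps = [
            (soft_sparsity(m.weight, threshold) - target_sparsity) ** 2
            for m in model.modules()
            if isinstance(m, (nn.Conv2d, nn.Linear))
        ]
        return torch.stack(gaps).mean()

    # Usage: add the sparsity term to the task loss for one training step.
    model = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU(),
                          nn.Linear(256, 10))
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
    loss = criterion(model(x), y) \
        + 0.1 * adaptive_sparsity_loss(model, target_sparsity=0.9)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because the penalty targets a sparsity level directly rather than adding a generic L1 term whose weight must be hand-tuned, the budget (here 90%) is controlled explicitly during training; afterwards, weights below the threshold can be pruned outright.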

Related research:

- Differentiable Transportation Pruning (07/17/2023). Deep learning algorithms are increasingly employed at the edge. However,...
- SASL: Saliency-Adaptive Sparsity Learning for Neural Network Acceleration (03/12/2020). Accelerating the inference speed of CNNs is critical to their deployment...
- Controlled Sparsity via Constrained Optimization or: How I Learned to Stop Tuning Penalties and Love Constraints (08/08/2022). The performance of trained neural networks is robust to harsh levels of...
- Pruning Deep Neural Networks from a Sparsity Perspective (02/11/2023). In recent years, deep network pruning has attracted significant attentio...
- SECS: Efficient Deep Stream Processing via Class Skew Dichotomy (09/07/2018). Despite that accelerating convolutional neural network (CNN) receives an...
- Adaptive Sharpness-Aware Pruning for Robust Sparse Networks (06/25/2023). Robustness and compactness are two essential components of deep learning...
- ChipNet: Budget-Aware Pruning with Heaviside Continuous Approximations (02/14/2021). Structured pruning methods are among the effective strategies for extrac...
