ESPN: Extremely Sparse Pruned Networks

06/28/2020
by   Minsu Cho, et al.

Deep neural networks are often highly overparameterized, prohibiting their use in compute-limited systems. However, a line of recent work has shown that the size of deep networks can be considerably reduced by identifying, prior to training, a subset of neuron indicators (or a mask) corresponding to significant weights. We demonstrate that a simple iterative mask discovery method can achieve state-of-the-art compression of very deep networks. Our algorithm represents a hybrid between single-shot network pruning methods (such as SNIP) and Lottery Ticket-style approaches. We validate our approach on several datasets and outperform several existing pruning approaches in both test accuracy and compression ratio.
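As an illustration of the general idea, iterative magnitude-based mask discovery can be sketched as below. This is a minimal sketch, not the paper's exact ESPN procedure: the function name `iterative_mask_discovery`, the geometric sparsity schedule, and the round count are illustrative assumptions.

```python
import numpy as np

def iterative_mask_discovery(weights, target_sparsity=0.99, n_rounds=5):
    """Iteratively prune the smallest-magnitude weights until the
    target sparsity is reached. Illustrative sketch only; the actual
    ESPN algorithm differs in how the mask is scored and updated."""
    mask = np.ones(weights.shape, dtype=bool)
    for r in range(1, n_rounds + 1):
        # Shrink the kept fraction geometrically toward the target.
        keep_frac = (1.0 - target_sparsity) ** (r / n_rounds)
        k = max(1, int(round(keep_frac * weights.size)))
        # k-th largest magnitude among surviving weights is the cutoff.
        magnitudes = np.sort(np.abs(weights * mask).ravel())
        threshold = magnitudes[-k]
        mask = np.abs(weights) >= threshold
        # In a full pipeline the surviving weights would be (re)trained
        # here before the next pruning round.
    return mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
m = iterative_mask_discovery(w, target_sparsity=0.99, n_rounds=5)
```

Because each round's cutoff is at least as large as the previous one, weights pruned in earlier rounds stay pruned, mimicking the monotone mask refinement of iterative pruning schemes.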


