Network Compression via Recursive Bayesian Pruning

12/02/2018
by   Yuefu Zhou, et al.

There is a critical need for the compression and acceleration of deep neural networks, and Bayesian generalizations of structured pruning represent an important research direction toward this goal. However, existing Bayesian methods ignore the dependency among neurons and filters for computational simplicity. In this study, we explore, under the Bayesian framework, a structured pruning method that assumes layer-wise sequential dependency, a more general learning setting. Based on a property of the Dirac distribution, we derive a new dropout noise that makes it possible to approximate the posterior of a layer's dropout noise given that of the previous layer. With this Dirac-like dropout noise, we propose a recursive strategy, named Recursive Bayesian Pruning (RBP), to train and prune networks in a layer-by-layer fashion. Unimportant neurons and filters are directly targeted and removed, taking into account the influence of the previous layer. Experiments on the typical neural networks LeNet-300-100, LeNet-5, and VGG-16 demonstrate that the proposed method is competitive with, or even outperforms, state-of-the-art methods on several compression and acceleration metrics.
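To make the layer-by-layer idea concrete, below is a minimal sketch of sequential structured pruning driven by per-unit multiplicative noise scales, loosely in the spirit of RBP. This is not the authors' implementation: the function `prune_layer`, the `keep_ratio` heuristic, and the random `scale1` stand-in for a learned noise posterior are all illustrative assumptions.

```python
# Sketch: prune a layer by its (assumed learned) multiplicative dropout-noise
# scales, then propagate the decision into the next layer's input columns,
# mirroring the layer-wise sequential dependency described in the abstract.
import torch
import torch.nn as nn

def prune_layer(linear: nn.Linear, noise_scale: torch.Tensor,
                keep_ratio: float):
    """Drop output units whose noise scale is small (assumed unimportant)."""
    n_keep = max(1, int(keep_ratio * linear.out_features))
    keep = torch.topk(noise_scale, n_keep).indices.sort().values
    pruned = nn.Linear(linear.in_features, n_keep)
    with torch.no_grad():
        pruned.weight.copy_(linear.weight[keep])
        pruned.bias.copy_(linear.bias[keep])
    return pruned, keep

# Toy two-layer network; in RBP the scales would be inferred variationally,
# conditioned on the previous layer. Here scale1 is a random placeholder.
fc1, fc2 = nn.Linear(784, 300), nn.Linear(300, 100)
scale1 = torch.rand(300)

fc1, kept = prune_layer(fc1, scale1, keep_ratio=0.5)
fc2_new = nn.Linear(len(kept), fc2.out_features)
with torch.no_grad():
    fc2_new.weight.copy_(fc2.weight[:, kept])   # drop matching input columns
    fc2_new.bias.copy_(fc2.bias)
print(fc1, fc2_new)
```

In the actual method, the pruning decision for each layer would come from the approximate posterior of its dropout noise rather than a fixed keep ratio, and the recursion continues layer by layer through the network.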


