Bayesian Compression for Deep Learning

05/24/2017
by Christos Louizos, et al.

Compression and computational efficiency in deep learning have become a problem of great significance. In this work, we argue that the most principled and effective way to attack this problem is by adopting a Bayesian point of view, where through sparsity inducing priors we prune large parts of the network. We introduce two novelties in this paper: 1) we use hierarchical priors to prune nodes instead of individual weights, and 2) we use the posterior uncertainties to determine the optimal fixed point precision to encode the weights. Both factors significantly contribute to achieving the state of the art in terms of compression rates, while still staying competitive with methods designed to optimize for speed or energy efficiency.
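To make the two ideas concrete, here is a minimal NumPy sketch of how a node-level prune and a posterior-driven bit-width choice could look. It is illustrative only and not the paper's variational method: the function name prune_and_quantize, the signal-to-noise threshold, and the bit-width rule (quantization step no larger than the smallest posterior standard deviation) are assumptions made for this example.

```python
import numpy as np

# Illustrative sketch (not the paper's exact variational scheme): given a
# factorized Gaussian posterior (mu, sigma) over a layer's weight matrix,
# (1) prune whole output nodes whose weights are dominated by posterior noise,
# (2) choose a fixed-point bit width so that the quantization step stays below
# the posterior noise floor of the surviving weights.

def prune_and_quantize(mu, sigma, snr_threshold=1.0):
    """mu, sigma: arrays of shape (out_nodes, in_features)."""
    # Node-level signal-to-noise ratio: keep a node only if at least one of
    # its weights is clearly distinguishable from zero under the posterior.
    snr = np.abs(mu) / sigma
    keep = snr.max(axis=1) > snr_threshold          # boolean mask over output nodes

    mu_kept, sigma_kept = mu[keep], sigma[keep]

    # Uniform quantizer over [-max|mu|, +max|mu|]: step = range / 2^bits.
    # Requiring step <= min(sigma) gives the bit width below.
    dyn_range = 2.0 * np.abs(mu_kept).max()
    bits = int(np.ceil(np.log2(dyn_range / sigma_kept.min())))

    step = dyn_range / (2 ** bits)
    quantized = np.round(mu_kept / step) * step     # fixed-point encoding of the posterior means
    return keep, bits, quantized

# Example: a 4-node layer where two nodes carry signal and two are noise.
mu = np.array([[0.5, -0.3], [0.01, 0.02], [0.4, 0.1], [0.0, 0.01]])
sigma = np.full_like(mu, 0.05)
keep, bits, q = prune_and_quantize(mu, sigma)
print(keep, bits)   # [ True False  True False ] and a small bit width
```

Under this illustrative rule, weights with posterior standard deviations around 0.05 and a unit dynamic range need only about 5 bits, while tighter posteriors would demand more; this mirrors the paper's point that posterior uncertainty can drive the choice of fixed-point precision.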


Related research

08/17/2023 · A comprehensive study of spike and slab shrinkage priors for structurally sparse Bayesian neural networks
Network complexity and computational efficiency have become increasingly...

02/13/2017 · Soft Weight-Sharing for Neural Network Compression
The success of deep learning in numerous application domains created the...

09/21/2023 · Bayesian sparsification for deep neural networks with Bayesian model reduction
Deep learning's immense capabilities are often constrained by the comple...

01/21/2022 · APack: Off-Chip, Lossless Data Compression for Efficient Deep Learning Inference
Data accesses between on- and off-chip memories account for a large frac...

05/13/2022 · Fast Conditional Network Compression Using Bayesian HyperNetworks
We introduce a conditional compression problem and propose a fast framew...

06/21/2021 · Unsupervised Deep Learning by Injecting Low-Rank and Sparse Priors
What if deep neural networks can learn from sparsity-inducing priors? Wh...

12/02/2021 · Invariant Priors for Bayesian Quadrature
Bayesian quadrature (BQ) is a model-based numerical integration method t...
