Self-Compression in Bayesian Neural Networks

11/10/2021
by Giuseppina Carannante et al.

Machine learning models have achieved human-level performance on various tasks. This success comes at a high cost in computation and storage, which makes machine learning algorithms difficult to deploy on edge devices. Typically, accuracy is partially sacrificed in exchange for reduced memory usage and energy consumption. Current methods compress networks by reducing the precision of the parameters or by eliminating redundant ones. In this paper, we offer a new perspective on network compression through the Bayesian framework. We show that Bayesian neural networks automatically discover redundancy in model parameters, thus enabling self-compression, which is linked to the propagation of uncertainty through the layers of the network. Our experimental results show that the network architecture can be successfully compressed by deleting parameters identified by the network itself, while retaining the same level of accuracy.
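
A minimal sketch of the idea of uncertainty-driven pruning in a Bayesian layer, assuming a mean-field variational posterior (a mean and a standard deviation per weight) and an illustrative signal-to-noise criterion; the abstract does not specify the paper's exact selection rule, so the threshold and the SNR test below are assumptions for illustration only:

# Sketch: prune weights whose posterior signal-to-noise ratio |mu| / sigma is low,
# i.e. weights the posterior itself marks as indistinguishable from noise.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical variational posterior for one fully connected layer (256 -> 128).
mu = rng.normal(0.0, 0.1, size=(256, 128))       # posterior means
sigma = rng.uniform(0.01, 0.2, size=(256, 128))  # posterior standard deviations

snr = np.abs(mu) / sigma   # per-weight signal-to-noise ratio
threshold = 1.0            # illustrative cutoff, tuned to trade accuracy vs. size
mask = snr > threshold     # keep only weights the posterior is confident about

print(f"pruned {1.0 - mask.mean():.1%} of weights")

def forward(x, mu, mask):
    # Pruned forward pass; zeroed weights can later be stored in a sparse format.
    return x @ (mu * mask)

Thresholding the posterior signal-to-noise ratio is one simple proxy for "parameters identified by the network itself": a weight whose uncertainty dwarfs its mean contributes little beyond noise to the propagated activations and can be removed with minimal accuracy loss.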

research
06/08/2020

EDCompress: Energy-Aware Model Compression with Dataflow

Edge devices demand low energy consumption, cost and small form factor. ...
research
05/06/2022

Online Model Compression for Federated Learning with Large Models

This paper addresses the challenges of training large neural network mod...
research
11/12/2021

Nonlinear Tensor Ring Network

The state-of-the-art deep neural networks (DNNs) have been widely applie...
research
06/28/2019

RECURSIA-RRT: Recursive translatable point-set pattern discovery with removal of redundant translators

Two algorithms, RECURSIA and RRT, are presented, designed to increase th...
research
06/23/2020

On Compression Principle and Bayesian Optimization for Neural Networks

Finding methods for making generalizable predictions is a fundamental pr...
research
02/01/2019

Efficient Hybrid Network Architectures for Extremely Quantized Neural Networks Enabling Intelligence at the Edge

The recent advent of the 'Internet of Things' (IoT) has increased the demand...
research
11/25/2019

Structured Multi-Hashing for Model Compression

Despite the success of deep neural networks (DNNs), state-of-the-art mod...
