Fast Conditional Network Compression Using Bayesian HyperNetworks

05/13/2022
by Phuoc Nguyen, et al.

We introduce a conditional compression problem and propose a fast framework for tackling it. The problem is how to quickly compress a pretrained large neural network into optimal smaller networks given target contexts, e.g., a context involving only a subset of classes, or a context where only limited compute resources are available. To solve this, we propose an efficient Bayesian framework that compresses a given large network into a much smaller network tailored to meet each contextual requirement. We employ a hypernetwork to parameterize the posterior distribution of weights given the conditional inputs, and minimize a variational objective of this Bayesian neural network. To further reduce the network size, we propose a new input-output group sparsity factorization of the weights that encourages greater sparsity in the generated weights. Our method can quickly generate compressed networks with significantly smaller sizes than baseline methods.
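The core idea above, a hypernetwork that maps a context vector to the weights of a smaller target network, with whole input/output groups pruned, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the single linear hypernetwork, the dimensions, and the norm-threshold pruning rule are all assumptions standing in for the learned Bayesian posterior and sparsity factorization described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a 4-dim context vector (e.g., which classes are
# needed) and a target linear layer mapping 8 inputs to 4 outputs.
CTX_DIM, IN_DIM, OUT_DIM = 4, 8, 4

# Hypernetwork parameters: here just one linear map from context to weights.
H = rng.normal(scale=0.1, size=(CTX_DIM, IN_DIM * OUT_DIM))

def generate_weights(context, threshold=0.05):
    """Map a context vector to target-layer weights, then apply input-output
    group sparsity by zeroing entire rows (input groups) and columns (output
    groups) whose L2 norm falls below `threshold`. The threshold rule is a
    stand-in for the learned sparsity factorization in the paper."""
    w = (context @ H).reshape(IN_DIM, OUT_DIM)
    row_mask = np.linalg.norm(w, axis=1) >= threshold   # keep input groups
    col_mask = np.linalg.norm(w, axis=0) >= threshold   # keep output groups
    return w * row_mask[:, None] * col_mask[None, :]

# Example context: only a subset of classes is needed at deployment time.
ctx = np.array([1.0, 0.0, 1.0, 0.0])
w = generate_weights(ctx)
sparsity = 1.0 - np.count_nonzero(w) / w.size
print("generated weight shape:", w.shape, "fraction zeroed:", round(sparsity, 2))
```

In practice the hypernetwork would be trained by minimizing the variational objective over contexts, so that a single forward pass yields a compressed network for any new context, which is what makes the conditional compression fast.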


Related research

07/13/2018 · Parametric generation of conditional geological realizations using generative neural networks
We introduce a method for parametric generation of conditional geologica...

11/08/2018 · Practical Bayesian Learning of Neural Networks via Adaptive Subgradient Methods
We introduce a novel framework for the estimation of the posterior distr...

12/06/2021 · Fast Test Input Generation for Finding Deviated Behaviors in Compressed Deep Neural Network
Model compression can significantly reduce sizes of deep neural network ...

12/01/1996 · Exploiting Causal Independence in Bayesian Network Inference
A new method is proposed for exploiting causal independencies in exact B...

06/11/2022 · A Theoretical Understanding of Neural Network Compression from Sparse Linear Approximation
The goal of model compression is to reduce the size of a large neural ne...

05/24/2017 · Bayesian Compression for Deep Learning
Compression and computational efficiency in deep learning have become a ...

01/11/2021 · Preconditioned training of normalizing flows for variational inference in inverse problems
Obtaining samples from the posterior distribution of inverse problems wi...
