Efficient Stochastic Inference of Bitwise Deep Neural Networks

11/20/2016
by Sebastian Vogel, et al.

Recently published methods enable training of bitwise neural networks, which allow weight representations reduced down to a single bit per weight. We present a method that exploits ensemble decisions based on multiple stochastically sampled network models to increase the classification accuracy of bitwise neural networks at inference. Our experiments with the CIFAR-10 and GTSRB datasets show that the performance of such network ensembles surpasses that of the high-precision base model. With this technique we achieve a 5.81% gain in classification accuracy over a single bitwise network. Concerning inference on embedded systems, we evaluate these bitwise networks using a hardware-efficient stochastic rounding procedure. Our work contributes to efficient embedded bitwise neural networks.
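To make the approach concrete, here is a minimal sketch (not the authors' implementation) of the two ingredients the abstract names: stochastic rounding of high-precision weights to single-bit values, and an ensemble decision over several such sampled bitwise models. The toy two-layer network, the ±1 weight range, and all names and sizes are illustrative assumptions.

```python
import numpy as np

def stochastic_binarize(w, rng):
    # P(+1) = (1 + w) / 2, so the sampled binary weight is an
    # unbiased estimate of the real-valued weight w in [-1, 1].
    p_plus = (1.0 + np.clip(w, -1.0, 1.0)) / 2.0
    return np.where(rng.random(w.shape) < p_plus, 1.0, -1.0)

def forward(x, weights):
    # Tiny MLP with sign activations; stands in for the real base model.
    h = np.sign(x @ weights[0])
    return h @ weights[1]  # raw output-layer scores

def ensemble_predict(x, real_weights, n_models=8, seed=0):
    # Sample n_models bitwise networks from the same high-precision
    # base model, average their scores, and take the argmax.
    rng = np.random.default_rng(seed)
    scores = np.zeros((x.shape[0], real_weights[-1].shape[1]))
    for _ in range(n_models):
        bin_weights = [stochastic_binarize(w, rng) for w in real_weights]
        scores += forward(x, bin_weights)
    return scores.argmax(axis=1)

# Toy usage: 16-dim inputs, 32 hidden units, 10 classes.
rng = np.random.default_rng(42)
weights = [rng.uniform(-1, 1, (16, 32)), rng.uniform(-1, 1, (32, 10))]
x = rng.normal(size=(4, 16))
print(ensemble_predict(x, weights))
```

On an embedded target, the random draw in `stochastic_binarize` can be replaced by a cheap pseudo-random bit source; a linear-feedback shift register is a standard choice for this kind of hardware-efficient rounding. The specific register below (a 16-bit Fibonacci LFSR) is an assumption for illustration, not necessarily the paper's circuit:

```python
def lfsr16_step(state):
    # One step of a 16-bit Fibonacci LFSR (taps 16, 14, 13, 11):
    # a pseudo-random bit stream from pure shifts and XORs.
    bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return ((state >> 1) | (bit << 15)) & 0xFFFF

state = 0xACE1  # any nonzero seed
bits = []
for _ in range(8):
    state = lfsr16_step(state)
    bits.append(state & 1)
print(bits)
```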


Related research

02/23/2018: Training wide residual networks for deployment using a single bit for each weight
For fast and energy-efficient deployment of trained deep neural networks...

03/24/2021: A Simple and Efficient Stochastic Rounding Method for Training Neural Networks in Low Precision
Conventional stochastic rounding (CSR) is widely employed in the trainin...

10/24/2022: OLLA: Decreasing the Memory Usage of Neural Networks by Optimizing the Lifetime and Location of Arrays
The size of deep neural networks has grown exponentially in recent years...

09/19/2017: Compressing Low Precision Deep Neural Networks Using Sparsity-Induced Regularization in Ternary Networks
A low precision deep neural network training technique for producing spa...

02/19/2012: Classification by Ensembles of Neural Networks
We introduce a new procedure for training of artificial neural networks ...

06/12/2017: Confident Multiple Choice Learning
Ensemble methods are arguably the most trustworthy techniques for boosti...

07/19/2016: Runtime Configurable Deep Neural Networks for Energy-Accuracy Trade-off
We present a novel dynamic configuration technique for deep neural netwo...
