Binary Ensemble Neural Network: More Bits per Network or More Networks per Bit?

06/20/2018
by Shilin Zhu, et al.

Binary neural networks (BNNs) have been studied extensively because they run dramatically faster and consume far less memory and power than floating-point networks, thanks to the efficiency of bit operations. However, contemporary BNNs, whose weights and activations are each a single bit, suffer from severe accuracy degradation. To understand why, we investigate the representation ability, speed, and bias/variance of BNNs through extensive experiments. We conclude that the error of BNNs is predominantly caused by intrinsic instability (at training time) and non-robustness (at both training and test time). Inspired by this investigation, we propose the Binary Ensemble Neural Network (BENN), which leverages ensemble methods to improve the performance of BNNs at a limited cost in efficiency. While ensemble techniques have been widely believed to provide only marginal benefits for strong classifiers such as deep neural networks, our analyses and experiments show that they are a natural fit for boosting BNNs. We find that our BENN, which is faster and much more robust than state-of-the-art binary networks, can even surpass the accuracy of a full-precision floating-point network with the same architecture.
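To make the two ideas in the abstract concrete, here is a minimal, illustrative sketch (not the authors' implementation): (1) a layer whose weights and activations are binarized to +/-1 with the sign function and trained through a straight-through estimator, and (2) a bagging-style ensemble that averages the logits of several independently trained binary networks. The class names (BinarizeSTE, BinaryLinear, BinaryMLP, BinaryEnsemble) and the averaging rule are assumptions chosen for illustration; BENN itself also studies boosting-style ensembles.

```python
# Minimal sketch of a binarized network and an ensemble of such networks.
# Names and hyperparameters are illustrative assumptions, not the BENN code.
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through gradient estimator."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients through only where |x| <= 1 (hard-tanh clipping).
        return grad_output * (x.abs() <= 1).float()


class BinaryLinear(nn.Module):
    """Linear layer whose weights and inputs are binarized to +/-1."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)

    def forward(self, x):
        wb = BinarizeSTE.apply(self.weight)   # binary weights
        xb = BinarizeSTE.apply(x)             # binary activations
        # At inference time this matmul could be replaced by XNOR + popcount.
        return xb @ wb.t()


class BinaryMLP(nn.Module):
    """A tiny binary network; a single weak, unstable BNN member."""

    def __init__(self, in_dim=784, hidden=256, classes=10):
        super().__init__()
        self.fc1 = BinaryLinear(in_dim, hidden)
        self.bn = nn.BatchNorm1d(hidden)
        self.fc2 = BinaryLinear(hidden, classes)

    def forward(self, x):
        return self.fc2(self.bn(self.fc1(x)))


class BinaryEnsemble(nn.Module):
    """Average the logits of K independently trained binary networks."""

    def __init__(self, k=5, **kwargs):
        super().__init__()
        self.members = nn.ModuleList(BinaryMLP(**kwargs) for _ in range(k))

    def forward(self, x):
        return torch.stack([m(x) for m in self.members]).mean(dim=0)


if __name__ == "__main__":
    model = BinaryEnsemble(k=5)
    logits = model(torch.randn(8, 784))
    print(logits.shape)  # torch.Size([8, 10])
```

Each ensemble member costs only bit-level storage and bit operations at inference, so aggregating K of them remains far cheaper than one full-precision network while reducing the variance that a single BNN suffers from.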
