Adjustable Bounded Rectifiers: Towards Deep Binary Representations

11/19/2015
by   Zhirong Wu, et al.

Binary representation is desirable for its memory efficiency, computation speed, and robustness. In this paper, we propose adjustable bounded rectifiers to learn binary representations for deep neural networks. Because hard-constraining representations across layers to be binary makes training unreasonably difficult, we instead softly encourage activations to move from real values toward binary values by approximating step functions. Our final representation is completely binary. We test our approach on the MNIST, CIFAR10, and ILSVRC2012 datasets, and systematically study the training dynamics of the binarization process. Our approach can binarize the last-layer representation without loss of performance, and all layers with reasonably small degradation. The memory this saves may allow more sophisticated models to be deployed, compensating for the loss. To the best of our knowledge, this is the first work to report results on current deep network architectures using completely binary intermediate representations. Given the learned representations, we find that the firing or inhibition of a binary neuron is usually associated with a meaningful interpretation across different classes. This suggests that the semantic structure of a neural network may be manifested through a guided binarization process.
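To make the idea concrete, here is a minimal sketch of a bounded rectifier that approaches a step function as its slope grows. The exact parameterization used in the paper may differ; the form `clamp(slope * x, 0, 1)` and the parameter name `slope` are assumptions for illustration only.

```python
import numpy as np

def adjustable_bounded_rectifier(x, slope=1.0):
    """Bounded rectifier clamp(slope * x, 0, 1).

    Illustrative sketch, not the paper's exact formulation: as `slope`
    increases, the function approaches the unit step, pushing
    activations toward the binary values {0, 1}.
    """
    return np.clip(slope * x, 0.0, 1.0)

x = np.linspace(-1.0, 1.0, 5)                      # [-1, -0.5, 0, 0.5, 1]
soft = adjustable_bounded_rectifier(x, slope=1.0)    # graded outputs
hard = adjustable_bounded_rectifier(x, slope=100.0)  # nearly binary outputs
```

With a small slope the outputs remain graded (e.g. 0.5 for input 0.5); with a large slope every nonzero positive input saturates at 1, which matches the paper's strategy of gradually tightening activations toward binary rather than imposing a hard constraint from the start.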


