ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions

03/07/2020
by   Zechun Liu, et al.

In this paper, we propose several ideas for enhancing a binary network to close its accuracy gap to real-valued networks without incurring any additional computational cost. We first construct a baseline network by modifying and binarizing a compact real-valued network with parameter-free shortcuts, bypassing all the intermediate convolutional layers including the downsampling layers. This baseline network strikes a good trade-off between accuracy and efficiency, outperforming most existing binary networks at approximately half the computational cost. Through extensive experiments and analysis, we observe that the performance of binary networks is sensitive to variations in the activation distribution. Based on this important observation, we propose to generalize the traditional Sign and PReLU functions, denoted RSign and RPReLU respectively, to enable explicit learning of the distribution reshape and shift at near-zero extra cost. Lastly, we adopt a distributional loss to further enforce the binary network to learn output distributions similar to those of a real-valued network. We show that after incorporating all these ideas, the proposed ReActNet outperforms all the state-of-the-art methods by a large margin. Specifically, it outperforms Real-to-Binary Net and MeliusNet29 by 4.0% and 3.6% top-1 accuracy respectively, and reduces the gap to its real-valued counterpart to within 3.0% top-1 accuracy on ImageNet.
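The key idea behind RSign and RPReLU is adding learnable shift parameters to the standard Sign and PReLU activations so the network can reshape and shift the activation distribution. A minimal pure-Python sketch of how such learnable-shift activations might look (the parameter names `alpha`, `beta`, `gamma`, `zeta` are illustrative; in the paper these are learned per channel):

```python
def rsign(x, alpha):
    """Sketch of a generalized Sign: binarize around a learnable
    threshold shift alpha instead of a fixed threshold at zero."""
    return [1.0 if v >= alpha else -1.0 for v in x]

def rprelu(x, beta, gamma, zeta):
    """Sketch of a generalized PReLU: shift the input by gamma before
    the kink and shift the output by zeta after it; beta is the
    negative-side slope, as in a standard PReLU."""
    out = []
    for v in x:
        s = v - gamma
        out.append((s if s >= 0 else beta * s) + zeta)
    return out

x = [-1.5, -0.2, 0.3, 2.0]
print(rsign(x, alpha=0.1))                           # threshold moved off zero
print(rprelu(x, beta=0.25, gamma=0.1, zeta=-0.05))
```

With `alpha = 0`, `gamma = 0`, and `zeta = 0` these reduce to the ordinary Sign and PReLU, which is what makes the generalization nearly free: only a handful of extra scalars per channel are learned.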


Related research

08/01/2018 · Bi-Real Net: Enhancing the Performance of 1-bit CNNs With Improved Representational Capability and Advanced Training Algorithm
  In this work, we study the 1-bit convolutional neural networks (CNNs), o...

10/07/2020 · High-Capacity Expert Binary Networks
  Network binarization is a promising hardware-aware direction for creatin...

03/25/2020 · Training Binary Neural Networks with Real-to-Binary Convolutions
  This paper shows how to train binary networks to within a few percent po...

09/13/2017 · Flexible Network Binarization with Layer-wise Priority
  How to effectively approximate real-valued parameters with binary codes ...

11/04/2018 · Bi-Real Net: Binarizing Deep Network Towards Real-Network Performance
  In this paper, we study 1-bit convolutional neural networks (CNNs), of w...

10/28/2021 · Learning Aggregations of Binary Activated Neural Networks with Probabilities over Representations
  Considering a probability distribution over parameters is known as an ef...

11/13/2018 · Domain Agnostic Real-Valued Specificity Prediction
  Sentence specificity quantifies the level of detail in a sentence, chara...
