MeliusNet: Can Binary Neural Networks Achieve MobileNet-level Accuracy?

01/16/2020
by Joseph Bethge, et al.

Binary Neural Networks (BNNs) are neural networks that use binary weights and activations instead of the typical 32-bit floating point values. They have smaller model sizes and allow efficient inference on mobile or embedded devices with limited power and computational resources. However, binarizing weights and activations yields feature maps of lower quality and capacity, and thus a drop in accuracy compared to traditional networks. Previous work has alleviated these problems by increasing the number of channels or using multiple binary bases. In this paper, we instead present MeliusNet, which alternates two block designs: one increases the number of features and the other then improves the quality of these features. In addition, we propose a redesign of the layers that use 32-bit values in previous approaches, which reduces the required number of operations. Experiments on the ImageNet dataset demonstrate the superior performance of MeliusNet over a variety of popular binary architectures with regard to both computation savings and accuracy. Furthermore, with our method we trained BNN models that, for the first time, match the popular compact network MobileNet in terms of model size and accuracy. Our code is published online: https://github.com/hpi-xnor/BMXNet-v2
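The alternating-block idea can be illustrated with a minimal PyTorch-style sketch. This is not the authors' BMXNet-v2 implementation; the class names (BinarySign, BinaryConv, DenseBlock, ImprovementBlock) and the growth rate of 64 channels are illustrative assumptions based only on the description in the abstract.

```python
import torch
import torch.nn as nn


class BinarySign(torch.autograd.Function):
    """Sign binarization with a straight-through estimator (STE) gradient."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        # Note: torch.sign maps 0 to 0; real BNN code usually maps 0 to +1.
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients only where the input lies in [-1, 1] (clipped STE).
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)


class BinaryConv(nn.Module):
    """3x3 convolution whose inputs and weights are binarized on the fly."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, 3, 3) * 0.01)

    def forward(self, x):
        xb = BinarySign.apply(x)
        wb = BinarySign.apply(self.weight)
        return nn.functional.conv2d(xb, wb, padding=1)


class DenseBlock(nn.Module):
    """Increases the number of features by concatenating newly computed channels."""

    def __init__(self, in_ch, growth=64):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_ch)
        self.conv = BinaryConv(in_ch, growth)

    def forward(self, x):
        return torch.cat([x, self.conv(self.bn(x))], dim=1)


class ImprovementBlock(nn.Module):
    """Improves the quality of the newest features via a residual connection."""

    def __init__(self, in_ch, growth=64):
        super().__init__()
        self.growth = growth
        self.bn = nn.BatchNorm2d(in_ch)
        self.conv = BinaryConv(in_ch, growth)

    def forward(self, x):
        residual = self.conv(self.bn(x))
        improved = x[:, -self.growth:] + residual  # refine only the latest channels
        return torch.cat([x[:, :-self.growth], improved], dim=1)


if __name__ == "__main__":
    x = torch.randn(1, 128, 28, 28)
    block = nn.Sequential(DenseBlock(128), ImprovementBlock(192))
    print(block(x).shape)  # torch.Size([1, 192, 28, 28])
```

In this sketch the DenseBlock grows the representation by 64 channels and the ImprovementBlock refines only those newest 64 channels with a residual, so each pair first adds capacity and then quality, mirroring the description above.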


Related research

- Training Compact Neural Networks with Binary Weights and Low Precision Activations (08/08/2018): In this paper, we propose to train a network with binary weights and low...
- Bi-Real Net V2: Rethinking Non-linearity for 1-bit CNNs and Going Beyond (10/19/2020): Binary neural networks (BNNs), where both weights and activations are bi...
- Searching for Accurate Binary Neural Architectures (09/16/2019): Binary neural networks have attracted tremendous attention due to the ef...
- IR2Net: Information Restriction and Information Recovery for Accurate Binary Neural Networks (10/06/2022): Weight and activation binarization can efficiently compress deep neural ...
- Soft Threshold Ternary Networks (04/04/2022): Large neural networks are difficult to deploy on mobile devices because ...
- Distillation Guided Residual Learning for Binary Convolutional Neural Networks (07/10/2020): It is challenging to bridge the performance gap between Binary CNN (BCNN...
- Parameter-Efficient Masking Networks (10/13/2022): A deeper network structure generally handles more complicated non-linear...
