Boosting Binary Neural Networks via Dynamic Thresholds Learning

11/04/2022
by Jiehua Zhang, et al.

Developing lightweight Deep Convolutional Neural Networks (DCNNs) and Vision Transformers (ViTs) has become a focus of vision research, since low computational cost is essential for deploying vision models on edge devices. Recently, researchers have explored highly computationally efficient Binary Neural Networks (BNNs), which binarize the weights and activations of full-precision networks. However, the binarization process leads to an enormous accuracy gap between a BNN and its full-precision counterpart. One of the primary reasons is that the Sign function with predefined or learned static thresholds limits the representation capacity of binarized architectures, since single-threshold binarization fails to exploit activation distributions. To overcome this issue, we introduce channel-wise statistics into explicit threshold learning for the Sign function, dubbed DySign, which generates varying thresholds based on the input distribution. DySign is a straightforward method to reduce information loss and boost the representative capacity of BNNs, and it can be flexibly applied to both DCNNs and ViTs (i.e., DyBCNN and DyBinaryCCT) to achieve promising performance improvements, as shown in our extensive experiments. For DCNNs, DyBCNNs based on two backbones (MobileNetV1 and ResNet18) achieve up to 71.2% top-1 accuracy, outperforming baselines by a large margin (i.e., 1.8%). For ViTs, DyBinaryCCT demonstrates the superiority of the convolutional embedding layer in fully binarized ViTs and achieves 56.1%, which is nearly 9% higher than the baseline.
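The core idea of dynamic-threshold binarization can be illustrated with a minimal sketch. The abstract does not specify DySign's exact formulation, so the statistic (global average pooling per channel) and the affine map from statistics to thresholds below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def dysign(x, w, b):
    """Hypothetical DySign sketch: binarize activations against
    input-dependent, channel-wise thresholds instead of a single
    static threshold.

    x : ndarray of shape (C, H, W), the activation tensor
    w, b : ndarrays of shape (C,), parameters of a tiny per-channel
           affine map from channel statistics to thresholds
           (an assumption for illustration)
    """
    # Channel statistics via global average pooling: shape (C,)
    stats = x.mean(axis=(1, 2))
    # Derive a threshold per channel from the input distribution
    tau = w * stats + b
    # Binarize each channel against its own dynamic threshold
    return np.where(x >= tau[:, None, None], 1.0, -1.0)
```

With `w = 1` and `b = 0` each channel is simply thresholded at its own mean, so channels with shifted distributions still produce a balanced mix of +1/-1 outputs, which is the information-loss problem a single static threshold cannot address.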


Related research

10/08/2021
Dynamic Binary Neural Network by learning channel-wise thresholds
Binary neural networks (BNNs) constrain weights and activations to +1 or...

03/03/2021
Self-Distribution Binary Neural Networks
In this work, we study the binary neural networks (BNNs) of which both t...

03/01/2021
Learning Frequency Domain Approximation for Binary Neural Networks
Binary neural networks (BNNs) represent original full-precision weights ...

03/25/2011
Using Variable Threshold to Increase Capacity in a Feedback Neural Network
The article presents new results on the use of variable thresholds to in...

05/24/2023
BinaryViT: Towards Efficient and Accurate Binary Vision Transformers
Vision Transformers (ViTs) have emerged as the fundamental architecture ...

11/14/2022
BiViT: Extremely Compressed Binary Vision Transformer
Model binarization can significantly compress model size, reduce energy ...

05/09/2021
Binarized Weight Error Networks With a Transition Regularization Term
This paper proposes a novel binarized weight network (BT) for a resource...
