How Do Adam and Training Strategies Help BNNs Optimization?

06/21/2021
by   Zechun Liu, et al.

The best-performing Binary Neural Networks (BNNs) are usually attained using Adam optimization and its multi-step training variants. However, to the best of our knowledge, few studies explore the fundamental reasons why Adam is superior to other optimizers like SGD for BNN optimization or provide analytical explanations that support specific training strategies. To address this, in this paper we first investigate the trajectories of gradients and weights in BNNs during the training process. We show that the regularization effect of second-order momentum in Adam is crucial for revitalizing weights that are dead due to activation saturation in BNNs. We find that Adam, through its adaptive learning rate strategy, is better equipped to handle the rugged loss surface of BNNs and reaches a better optimum with higher generalization ability. Furthermore, we inspect the intriguing role of the real-valued weights in binary networks, and reveal the effect of weight decay on the stability and sluggishness of BNN optimization. Through extensive experiments and analysis, we derive a simple training scheme, building on existing Adam-based optimization, which achieves 70.5% top-1 accuracy, using the same architecture as the state-of-the-art ReActNet while achieving 1.1% higher accuracy. Code and models are available at https://github.com/liuzechun/AdamBNN.
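The mechanism described above — Adam's second moment normalizing the update so that weights receiving persistently tiny gradients (e.g. behind a saturated binary activation) still move — can be illustrated with a minimal toy sketch. This is not the paper's code; the gradient magnitude, learning rate, and step count below are arbitrary assumptions chosen only to make the contrast visible:

```python
import math

def sgd_step(w, g, lr=0.1):
    # Plain SGD: the update is proportional to the raw gradient, so a
    # near-zero gradient (a "dead" latent weight behind a saturated
    # binary activation) barely moves the weight at all.
    return w - lr * g

def adam_step(w, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: the second moment v rescales the update by the gradient's
    # own magnitude, so even a consistently tiny gradient produces
    # steps on the order of the learning rate.
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)  # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)  # bias-corrected second moment
    return w - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

# A latent real-valued weight that receives a persistent but tiny gradient.
g = 1e-6
w_sgd = w_adam = 1.0
m = v = 0.0
for t in range(1, 101):
    w_sgd = sgd_step(w_sgd, g)
    w_adam, m, v = adam_step(w_adam, g, m, v, t)

print(f"SGD  moved the weight by {1.0 - w_sgd:.2e}")   # ~1e-5: effectively frozen
print(f"Adam moved the weight by {1.0 - w_adam:.2e}")  # ~1e1: learning-rate-scale steps
```

Because the per-step Adam update is roughly `lr * g / sqrt(g^2) ≈ lr`, the "dead" weight escapes in a handful of iterations under Adam while remaining essentially frozen under SGD — the revitalization effect the abstract attributes to second-order momentum.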

