Random Bias Initialization Improving Binary Neural Network Training

09/30/2019
by Xinlin Li, et al.

Edge intelligence, and binary neural networks (BNNs) in particular, has recently attracted considerable attention from the artificial intelligence community. BNNs significantly reduce computational cost, model size, and memory footprint. However, a performance gap remains between successful full-precision neural networks with ReLU activation and their binarized counterparts. We argue that this accuracy drop is due to the geometry of BNNs. We analyze the behaviour of full-precision networks with ReLU activation and compare it with that of their binarized counterparts. The comparison suggests random bias initialization as a remedy to activation saturation in full-precision networks and leads us toward improved BNN training. Our numerical experiments confirm our geometric intuition.
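As a concrete illustration of the remedy described above, the sketch below replaces the usual zero bias initialization with small random biases in the layers of a network before BNN-style training. The abstract does not specify the distribution or scale the authors use, so the uniform range, the random_bias_init helper, and the toy architecture are illustrative assumptions rather than the paper's exact procedure.

```python
# Minimal sketch of random bias initialization (PyTorch), assuming a small
# uniform distribution; the paper's exact scheme is not given in the abstract.
import torch.nn as nn


def random_bias_init(model: nn.Module, scale: float = 0.1) -> None:
    """Initialize biases of conv/linear layers with small random values
    instead of the conventional zeros."""
    for module in model.modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)) and module.bias is not None:
            nn.init.uniform_(module.bias, -scale, scale)  # assumed range


# Usage: build the network to be binarized, then re-initialize its biases
# before training.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1, bias=True),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10, bias=True),
)
random_bias_init(model, scale=0.1)
```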

Related research

03/31/2021 · Fast Certified Robust Training via Better Initialization and Shorter Warmup
Recently, bound propagation based certified adversarial defense have bee...

09/18/2019 · A Study on Binary Neural Networks Initialization
Initialization plays a crucial role in training neural models. Binary Ne...

12/02/2020 · Improving Accuracy of Binary Neural Networks using Unbalanced Activation Distribution
Binarization of neural network models is considered as one of the promis...

06/23/2021 · Numerical influence of ReLU'(0) on backpropagation
In theory, the choice of ReLU'(0) in [0, 1] for a neural network has a n...

01/05/2020 · Cooperative Initialization based Deep Neural Network Training
Researchers have proposed various activation functions. These activation...

02/08/2021 · Enabling Binary Neural Network Training on the Edge
The ever-growing computational demands of increasingly complex machine l...

03/27/2019 · A Sober Look at Neural Network Initializations
Initializing the weights and the biases is a key part of the training pr...