Random Bias Initialization Improving Binary Neural Network Training

09/30/2019
by Xinlin Li, et al.

Edge intelligence, and binary neural networks (BNNs) in particular, has recently attracted considerable attention from the artificial intelligence community. BNNs significantly reduce computational cost, model size, and memory footprint. However, a performance gap remains between successful full-precision neural networks with ReLU activation and their binarized counterparts. We argue that the accuracy drop of BNNs is due to their geometry. We analyze the behaviour of full-precision networks with ReLU activation and compare it with their binarized counterparts. This comparison suggests random bias initialization as a remedy for activation saturation in full-precision networks and leads us toward improved BNN training. Our numerical experiments confirm our geometric intuition.
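As a rough illustration of the idea, the sketch below initializes layer biases from a random distribution instead of zeros in a binarized linear layer. This is not the authors' code: the PyTorch-style `BinaryLinear` class, the straight-through estimator, the uniform distribution, and the `bias_scale` hyperparameter are all assumptions made for illustration only.

```python
# Minimal sketch of random bias initialization in a binarized layer.
# Assumptions (not from the paper): PyTorch, uniform bias distribution,
# an illustrative bias_scale, and a sign/STE weight binarizer.
import torch
import torch.nn as nn


def sign_binarize(w: torch.Tensor) -> torch.Tensor:
    """Binarize weights to {-1, +1}, passing gradients straight through."""
    return (torch.sign(w) - w).detach() + w


class BinaryLinear(nn.Linear):
    """Linear layer with binarized weights and randomly initialized biases."""

    def __init__(self, in_features: int, out_features: int, bias_scale: float = 0.1):
        super().__init__(in_features, out_features, bias=True)
        # Random (rather than zero) bias initialization; the scale is a
        # hypothetical hyperparameter chosen only for this example.
        nn.init.uniform_(self.bias, -bias_scale, bias_scale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return nn.functional.linear(x, sign_binarize(self.weight), self.bias)


if __name__ == "__main__":
    layer = BinaryLinear(64, 32)
    out = layer(torch.randn(8, 64))
    print(out.shape)  # torch.Size([8, 32])
```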


