RPR: Random Partition Relaxation for Training Binary and Ternary Weight Neural Networks

01/04/2020
by Lukas Cavigelli, et al.

We present Random Partition Relaxation (RPR), a method for strong quantization of neural network weights to binary (+1/-1) and ternary (+1/0/-1) values. Starting from a pre-trained model, we quantize the weights, then relax a random partition of them to their continuous values for retraining, before re-quantizing them and switching to another weight partition for further adaptation. We demonstrate binary- and ternary-weight networks with accuracies beyond the state of the art for GoogLeNet and competitive performance for ResNet-18 and ResNet-50, using an SGD-based training method that can easily be integrated into existing frameworks.
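The alternating quantize/relax/retrain cycle can be captured in a few lines. Below is a minimal PyTorch sketch of the idea, assuming a simple threshold-based ternary quantizer with a per-tensor scale; the relaxed fraction, threshold, toy model, and round/step counts are illustrative assumptions, not the paper's settings.

```python
# Sketch of Random Partition Relaxation (RPR) for ternary weights.
# Assumptions (not from the paper): threshold-based quantizer,
# 25% relaxed fraction, toy linear model and random data.
import torch
import torch.nn as nn

def ternarize(w: torch.Tensor, frac: float = 0.05) -> torch.Tensor:
    """Map a weight tensor to {-s, 0, +s} with a per-tensor scale s."""
    t = frac * w.abs().max()          # illustrative threshold choice
    s = w.abs().mean()                # illustrative scale choice
    q = torch.zeros_like(w)
    q[w > t] = s
    q[w < -t] = -s
    return q

def repartition(weight: nn.Parameter, relaxed_fraction: float = 0.25) -> torch.Tensor:
    """Quantize all weights, then relax a random partition back to its
    continuous values; returns the mask of relaxed (trainable) entries."""
    relaxed = torch.rand_like(weight) < relaxed_fraction
    with torch.no_grad():
        weight.copy_(torch.where(relaxed, weight, ternarize(weight)))
    return relaxed

torch.manual_seed(0)
model = nn.Linear(32, 4)
opt = torch.optim.SGD(model.parameters(), lr=0.05)  # no momentum: frozen weights stay put
loss_fn = nn.CrossEntropyLoss()
x = torch.randn(256, 32)
y = torch.randint(0, 4, (256,))

for _ in range(10):                          # each round: re-quantize, draw a new partition
    relaxed = repartition(model.weight)
    for _ in range(50):                      # retrain only the relaxed weights
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        model.weight.grad *= relaxed.float() # zero gradients on the quantized partition
        opt.step()

with torch.no_grad():                        # final model: all weights ternary
    model.weight.copy_(ternarize(model.weight))
```

In this sketch the bias remains full precision throughout, and plain SGD (no momentum) ensures the quantized partition does not drift between re-partitions; how the paper handles biases, batch-norm parameters, and the optimizer state is not specified here.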

