On Periodic Functions as Regularizers for Quantization of Neural Networks

11/24/2018
by Maxim Naumov, et al.

Deep learning models have been successfully used in computer vision and many other fields. We propose an unorthodox algorithm for quantizing the model parameters. In contrast with popular quantization schemes based on thresholds, we use a novel technique based on periodic functions, such as the continuous trigonometric sine or cosine, as well as non-continuous hat functions. We apply these functions component-wise and add their sum over the model parameters as a regularizer to the model loss during training. The frequency and amplitude hyper-parameters of these functions can be adjusted during training. The regularization pushes the weights toward discrete points that can be encoded as integers. We show that, using this technique, the resulting quantized models exhibit the same accuracy as the original ones on the CIFAR-10 and ImageNet datasets.
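A minimal sketch of how such a periodic regularizer might look in PyTorch, assuming a squared-sine penalty sin^2(pi * frequency * w) summed over all weights; the function name, hyper-parameter handling, and exact functional form here are illustrative assumptions, not the authors' implementation.

import math
import torch
import torch.nn as nn

def periodic_regularizer(model: nn.Module, amplitude: float = 1.0, frequency: float = 1.0):
    # sin^2(pi * frequency * w) is zero exactly when w is an integer multiple of
    # 1/frequency, so minimizing this term pulls every weight toward a uniform
    # discrete grid that can later be encoded as integers.
    reg = 0.0
    for param in model.parameters():
        reg = reg + torch.sum(torch.sin(math.pi * frequency * param) ** 2)
    return amplitude * reg

# Hypothetical usage inside a training step:
#   loss = criterion(model(x), y) + periodic_regularizer(model, amplitude=0.01, frequency=4.0)

In this sketch the frequency hyper-parameter sets the spacing of the discrete grid (and hence the quantization step), while the amplitude trades off the task loss against how strongly weights are pulled onto that grid; a hat-function variant would replace the sine with a piecewise-linear triangle wave.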


