CBP: Backpropagation with constraint on weight precision using a pseudo-Lagrange multiplier method

10/06/2021
by Guhyun Kim, et al.

Backward propagation of errors (backpropagation) is a method to minimize objective functions (e.g., loss functions) of deep neural networks by identifying optimal sets of weights and biases. Imposing constraints on weight precision is often required to alleviate prohibitive workloads on hardware. Despite the remarkable success of backpropagation, the algorithm itself cannot consider such constraints unless additional algorithms are applied simultaneously. To address this issue, we propose the constrained backpropagation (CBP) algorithm, based on a pseudo-Lagrange multiplier method, to obtain the optimal set of weights that satisfies a given set of constraints. The defining characteristic of the proposed CBP algorithm is the use of a Lagrangian function (loss function plus constraint function) as its objective function. We considered various types of constraints: binary, ternary, one-bit shift, and two-bit shift weight constraints. As a post-training method, CBP was applied to AlexNet, ResNet-18, ResNet-50, and GoogLeNet on ImageNet, each pre-trained using conventional backpropagation. For all cases, the proposed algorithm outperforms the state-of-the-art methods on ImageNet, e.g., 66.6% and 74.4% top-1 accuracy with binary weights. This highlights CBP as a learning algorithm that can address diverse constraints with minimal performance loss by employing appropriate constraint functions.
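To make the idea concrete, here is a minimal sketch of a pseudo-Lagrange update on a toy least-squares problem. It is not the authors' code: the quartic binarization constraint g(w) = (1 - w^2)^2, the simultaneous gradient descent on the weights and gradient ascent on the multiplier, and the learning rates are all illustrative assumptions; the paper's exact constraint functions and schedule may differ.

```python
# Sketch of a pseudo-Lagrange multiplier update that nudges weights toward
# binary values {-1, +1} while minimizing a task loss. Assumed constraint:
# g(w) = (1 - w^2)^2, which is zero exactly when w is binary. Ternary
# ({-1, 0, +1}) or shift ({+/- 2^k}) constraints would use a different g.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: loss(w) = ||X w - y||^2 with a binary "teacher".
X = rng.normal(size=(32, 4))
w_true = np.sign(rng.normal(size=4))
y = X @ w_true

w = rng.normal(size=4)        # continuous initial weights
lam = 0.0                     # pseudo-Lagrange multiplier
lr_w, lr_lam = 1e-3, 1e-2

for step in range(2000):
    # Lagrangian: L(w, lam) = loss(w) + lam * sum(g(w))
    grad_loss = 2 * X.T @ (X @ w - y)
    g = (1 - w**2) ** 2              # per-weight binarization constraint
    grad_g = -4 * w * (1 - w**2)     # d g / d w
    w -= lr_w * (grad_loss + lam * grad_g)  # descent on the weights
    lam += lr_lam * g.sum()                 # ascent on the multiplier

print("learned:", np.round(w, 3), "target:", w_true)
```

As the multiplier grows, the constraint term dominates and the weights settle onto the binary set; with the task loss still in the objective, they settle onto the binary point that best fits the data rather than an arbitrary one.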

Related research

05/02/2017
Ternary Neural Networks with Fine-Grained Quantization
We propose a novel fine-grained quantization (FGQ) method to ternarize p...

07/07/2021
S^3: Sign-Sparse-Shift Reparametrization for Effective Training of Low-bit Shift Networks
Shift neural networks reduce computation complexity by removing expensiv...

09/25/2019
Accurate and Compact Convolutional Neural Networks with Trained Binarization
Although convolutional neural networks (CNNs) are now widely used in var...

03/11/2021
Norm Loss: An efficient yet effective regularization method for deep neural networks
Convolutional neural network training can suffer from diverse issues lik...

03/29/2023
Backpropagation and F-adjoint
This paper presents a concise mathematical framework for investigating b...

07/03/2021
Exact Backpropagation in Binary Weighted Networks with Group Weight Transformations
Quantization based model compression serves as high performing and fast ...

01/16/2021
Slot Machines: Discovering Winning Combinations of Random Weights in Neural Networks
In contrast to traditional weight optimization in a continuous space, we...
