Weight Compander: A Simple Weight Reparameterization for Regularization

06/29/2023
by   Rinor Cakaj, et al.

Regularization is a set of techniques used to improve the generalization ability of deep neural networks. In this paper, we introduce weight compander (WC), a novel and effective method to improve generalization by reparameterizing each weight in a deep neural network using a nonlinear function. It is a general, intuitive, cheap, and easy-to-implement method that can be combined with various other regularization techniques. Large weights in deep neural networks are a sign of a more complex network that is overfitted to the training data. Moreover, regularized networks tend to have a greater range of weights around zero, with fewer weights centered at zero. We introduce a weight reparameterization function that is applied to each weight and implicitly reduces overfitting by restricting the magnitude of the weights while simultaneously forcing them away from zero. This leads to more democratic decision-making in the network. First, individual weights cannot have too much influence on the prediction because their magnitude is restricted. Second, more weights take part in the prediction, since they are forced away from zero during training. This promotes the extraction of more features from the input data and increases the level of weight redundancy, which makes the network less sensitive to statistical differences between training and test data. We extend our method to learn the hyperparameters of the introduced weight reparameterization function. This avoids hyperparameter search and gives the network the opportunity to align the weight reparameterization with the training progress. We show experimentally that using weight compander in addition to standard regularization methods improves the performance of neural networks.
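The abstract specifies only the properties of the reparameterization (bounded growth of large weights, small weights pushed away from zero, learnable hyperparameters), not its closed form. The sketch below is therefore an illustrative assumption, not the paper's published formula: it substitutes a mu-law-style companding curve with a learned shape parameter, and the names CompandedLinear, raw_weight, and log_mu are hypothetical. It shows the described mechanism in PyTorch: weights are stored unconstrained, and every forward pass applies a nonlinearity to them before they are used.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class CompandedLinear(nn.Module):
    """Linear layer whose weights pass through a companding function.

    Illustrative sketch only: the mu-law-style curve used here is an
    assumption standing in for the paper's reparameterization function.
    It compresses large weight magnitudes while amplifying small ones,
    and its shape parameter mu is learned jointly with the weights, as
    the abstract describes.
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.raw_weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.kaiming_uniform_(self.raw_weight, a=math.sqrt(5))
        # Learnable shape parameter, stored in log-space so that
        # mu = exp(log_mu) stays positive during training.
        self.log_mu = nn.Parameter(torch.tensor(1.0))

    def compand(self, w: torch.Tensor) -> torch.Tensor:
        mu = self.log_mu.exp()
        # Odd and monotone; concave on [0, 1], so small raw weights are
        # amplified (pushed away from zero) while large raw weights grow
        # only logarithmically (their magnitude is compressed).
        return torch.sign(w) * torch.log1p(mu * w.abs()) / torch.log1p(mu)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The effective weight is recomputed from the raw weight on every
        # forward pass, so gradients flow through the compander and shape
        # both the raw weights and log_mu.
        return F.linear(x, self.compand(self.raw_weight), self.bias)
```

In this sketch the layer is a drop-in replacement for nn.Linear, so the reparameterization composes with other regularizers (weight decay, dropout) applied to the model as usual; the effective weights after training are compand(raw_weight), not raw_weight itself.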


