Regularizing CNNs with Locally Constrained Decorrelations

11/07/2016
by Pau Rodríguez et al.

Regularization is key for deep learning since it allows the training of more complex models while keeping overfitting low. However, the most prevalent regularizers do not exploit the full capacity of the models, since they rely on reducing the effective number of parameters. Feature decorrelation is an alternative that uses the full capacity of the models, but its overfitting-reduction margins are narrow given the overhead it introduces. In this paper, we show that regularizing negatively correlated features is an obstacle to effective decorrelation, and we present OrthoReg, a novel regularization technique that locally enforces feature orthogonality. Imposing locality constraints on feature decorrelation removes interference between negatively correlated feature weights, allowing the regularizer to reach higher decorrelation bounds and to reduce overfitting more effectively. In particular, we show that models regularized with OrthoReg reach higher accuracy bounds even when batch normalization and dropout are present. Moreover, since our regularization is performed directly on the weights, it is especially suitable for fully convolutional neural networks, where the weight space is constant compared to the feature-map space. As a result, we are able to reduce the overfitting of state-of-the-art CNNs on CIFAR-10, CIFAR-100, and SVHN.
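The core idea — penalize only *positively* correlated feature weights so that negatively correlated pairs do not interfere — can be sketched as a simple loss over flattened filter weights. This is a minimal NumPy illustration under our own assumptions (the function name `orthoreg_penalty`, the filter layout, and the sharpness parameter `gamma` are ours), not the authors' implementation:

```python
import numpy as np

def orthoreg_penalty(W, gamma=10.0):
    """Locally constrained decorrelation penalty (illustrative sketch).

    W: array of shape (n_filters, fan_in), each row a flattened filter.
    gamma: sharpness of the soft penalty on positive correlations (assumed).
    """
    # Normalize each filter so dot products become cosine similarities.
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    cos = Wn @ Wn.T

    # Consider each unordered pair of distinct filters once.
    iu = np.triu_indices(W.shape[0], k=1)
    c = cos[iu]

    # Locality constraint: ignore negatively correlated pairs entirely,
    # so the regularizer never pushes anti-correlated filters together.
    c = c[c > 0]

    # Soft penalty that grows with positive correlation.
    return np.log1p(np.exp(gamma * c)).sum()
```

For example, a set of mutually orthogonal filters incurs zero penalty, while duplicated filters are penalized heavily; a training loop would add this term (scaled by a regularization coefficient) to the task loss, or apply its gradient directly to the weights as the paper does.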


