Exploiting the Full Capacity of Deep Neural Networks while Avoiding Overfitting by Targeted Sparsity Regularization

02/21/2020
by Karim Huesmann, et al.

Overfitting is one of the most common problems when training deep neural networks on comparatively small datasets. Here, we demonstrate that neural network activation sparsity is a reliable indicator of overfitting, and we use it to propose novel targeted sparsity visualization and regularization strategies. These strategies allow us to understand and counteract overfitting caused by activation sparsity and filter correlation in a targeted, layer-by-layer manner. Our results demonstrate that targeted sparsity regularization efficiently regularizes well-known architectures on standard datasets, yielding a significant increase in image classification performance and outperforming both dropout and batch normalization. Ultimately, our study offers novel insights into the seemingly contradictory concepts of activation sparsity and network capacity by demonstrating that targeted sparsity regularization enables salient and discriminative feature learning while exploiting the full capacity of deep models without overfitting, even when trained excessively.
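The abstract does not include code, but the core mechanics it describes (monitoring per-layer activation sparsity and penalizing it in targeted layers) can be sketched in a few lines. The following is a minimal PyTorch illustration, not the authors' published method: the `zero_fraction` diagnostic, the differentiable Hoyer surrogate, the `TargetedSparsityRegularizer` class, and the `weight` coefficient are all illustrative assumptions.

```python
import math

import torch
import torch.nn as nn


def zero_fraction(x: torch.Tensor, eps: float = 1e-6) -> float:
    """Diagnostic for layer-wise sparsity visualization: the
    fraction of activations that are (near-)zero."""
    return (x.abs() < eps).float().mean().item()


def hoyer_sparsity(x: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Differentiable Hoyer sparsity measure in [0, 1] (1 = maximally
    sparse), used here as a trainable surrogate for the zero-fraction."""
    x = x.flatten()
    sqrt_n = math.sqrt(x.numel())
    return (sqrt_n - x.abs().sum() / (x.norm(2) + eps)) / (sqrt_n - 1.0)


class TargetedSparsityRegularizer:
    """Collects activations of selected ("targeted") layers via forward
    hooks and returns a penalty that pushes those layers toward denser
    activations."""

    def __init__(self, model: nn.Module, layer_names, weight: float = 1e-3):
        self.weight = weight
        self._acts = []
        for name, module in model.named_modules():
            if name in layer_names:
                module.register_forward_hook(
                    lambda mod, inp, out: self._acts.append(out))

    def penalty(self) -> torch.Tensor:
        # Higher sparsity in a targeted layer -> higher penalty, so
        # gradient descent counteracts dying/sparse activations there.
        p = self.weight * torch.stack(
            [hoyer_sparsity(a) for a in self._acts]).mean()
        self._acts = []  # reset for the next forward pass
        return p


# Hypothetical usage inside a standard training step:
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
reg = TargetedSparsityRegularizer(model, layer_names={"1"})  # target the ReLU
criterion = nn.CrossEntropyLoss()
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
loss = criterion(model(x), y) + reg.penalty()
loss.backward()
```

In a real training loop the hooks would be registered once and `penalty()` added to the task loss every step; the Hoyer measure is just one possible differentiable stand-in for the hard zero-fraction statistic that the paper visualizes.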


Related research

- 04/13/2021 · The Impact of Activation Sparsity on Overfitting in Convolutional Neural Networks
  "Overfitting is one of the fundamental challenges when training convoluti..."
- 03/07/2020 · AL2: Progressive Activation Loss for Learning General Representations in Classification Neural Networks
  "The large capacity of neural networks enables them to learn complex func..."
- 04/17/2019 · Sparseout: Controlling Sparsity in Deep Networks
  "Dropout is commonly used to help reduce overfitting in deep neural netwo..."
- 06/28/2020 · Layer Sparsity in Neural Networks
  "Sparsity has become popular in machine learning, because it can save com..."
- 11/07/2016 · Regularizing CNNs with Locally Constrained Decorrelations
  "Regularization is key for deep learning since it allows training more co..."
- 11/29/2018 · On Implicit Filter Level Sparsity in Convolutional Neural Networks
  "We investigate filter level sparsity that emerges in convolutional neura..."
- 11/11/2022 · Multilevel-in-Layer Training for Deep Neural Network Regression
  "A common challenge in regression is that for many problems, the degrees ..."
