GraphConnect: A Regularization Framework for Neural Networks

12/21/2015
by Jiaji Huang, et al.

Deep neural networks have proved very successful in domains where large training sets are available, but when the number of training samples is small their performance suffers from overfitting. Prior methods for reducing overfitting, such as weight decay, Dropout, and DropConnect, are data-independent. This paper proposes a new, data-dependent method, GraphConnect, motivated by the observation that data of interest typically lie close to a manifold. The method encourages the relationships between the learned decisions to resemble a graph representing the manifold structure. In essence, GraphConnect is designed to learn attributes that are present in the data samples, in contrast to weight decay, Dropout, and DropConnect, which simply make it harder to fit random error or noise. Empirical Rademacher complexity is used to connect the generalization error of the network to spectral properties of the graph learned from the input data, and this framework is used to show that GraphConnect is superior to weight decay. Experimental results on several benchmark datasets validate the theoretical analysis and show that, when the number of training samples is small, GraphConnect significantly improves performance over weight decay.
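The abstract does not spell out the penalty, but data-dependent regularizers of this kind are commonly realized as a graph-Laplacian term on hidden activations: build an affinity graph W over the training inputs and add lambda * sum_ij W_ij ||z_i - z_j||^2 to the loss, which pulls representations of graph-adjacent samples together. The sketch below illustrates that idea in PyTorch; the Gaussian-affinity graph, the two-layer model, the helper names gaussian_graph_laplacian and graph_penalty, and the weight lam are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

def gaussian_graph_laplacian(x, sigma=1.0):
    """Unnormalized graph Laplacian L = D - W of a Gaussian-affinity
    graph over the samples in batch x (one row per sample). Assumed
    construction; the paper may build its graph differently."""
    d2 = torch.cdist(x, x).pow(2)            # pairwise squared distances
    w = torch.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian affinities W_ij
    w.fill_diagonal_(0.0)                    # no self-loops
    return torch.diag(w.sum(dim=1)) - w      # L = D - W

def graph_penalty(z, lap):
    """sum_ij W_ij ||z_i - z_j||^2, written compactly as 2 * tr(Z^T L Z)."""
    return 2.0 * torch.trace(z.t() @ lap @ z)

# Hypothetical training step: penalize hidden activations so that inputs
# that are close on the data graph receive similar representations.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
lam = 1e-3                                   # regularization strength (assumed)

x = torch.randn(32, 784)                     # stand-in batch of inputs
y = torch.randint(0, 10, (32,))              # stand-in labels
lap = gaussian_graph_laplacian(x)            # graph built from raw inputs

hidden = model[:2](x)                        # hidden-layer activations Z
logits = model[2:](hidden)
loss = criterion(logits, y) + lam * graph_penalty(hidden, lap)

opt.zero_grad()
loss.backward()
opt.step()
```

Note the contrast with weight decay, which penalizes parameter magnitudes regardless of the data: this term vanishes only when samples that are adjacent on the input graph get similar representations, which is the data-dependence the abstract emphasizes.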

Related research

11/16/2017 - LDMNet: Low Dimensional Manifold Regularized Neural Networks
Deep neural networks have proved very successful on archetypal tasks for...

05/30/2016 - Stochastic Function Norm Regularization of Deep Networks
Deep neural networks have had an enormous impact on image analysis. Stat...

04/06/2023 - Spectral Gap Regularization of Neural Networks
We introduce Fiedler regularization, a novel approach for regularizing n...

07/04/2023 - Deconstructing Data Reconstruction: Multiclass, Weight Decay and General Losses
Memorization of training data is an active research area, yet our unders...

02/20/2018 - Do deep nets really need weight decay and dropout?
The impressive success of modern deep neural networks on computer vision...

04/16/2011 - Adding noise to the input of a model trained with a regularized objective
Regularization is a well studied problem in the context of neural networ...
