LDMNet: Low Dimensional Manifold Regularized Neural Networks

11/16/2017
by Wei Zhu et al.

Deep neural networks have proved very successful on archetypal tasks for which large training sets are available, but when training data are scarce, their performance suffers from overfitting. Many existing methods for reducing overfitting are data-independent, and their efficacy is often limited when the training set is very small. Data-dependent regularizations are mostly motivated by the observation that data of interest lie close to a manifold, which is typically hard to parametrize explicitly and often requires human input of tangent vectors. These methods typically focus only on the geometry of the input data and do not necessarily encourage the networks to produce geometrically meaningful features. To resolve this, we propose a new framework, the Low-Dimensional-Manifold-regularized neural Network (LDMNet), which incorporates a feature regularization method that focuses on the geometry of both the input data and the output features. In LDMNet, we regularize the network by encouraging the combination of the input data and the output features to sample a collection of low-dimensional manifolds, which are searched efficiently without explicit parametrization. To achieve this, we use the manifold dimension directly as a regularization term in a variational functional. The resulting Euler-Lagrange equation is a Laplace-Beltrami equation over a point cloud, which is solved by the point integral method without increasing the computational complexity. We demonstrate two benefits of LDMNet in the experiments. First, we show that LDMNet significantly outperforms widely used network regularizers such as weight decay and dropout. Second, we show that LDMNet can be designed to extract common features of an object imaged via different modalities, which proves very useful in real-world applications such as cross-spectral face recognition.
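For concreteness, the variational formulation described above can be sketched as follows, in notation reconstructed from the abstract (J(θ) is the empirical loss, f_θ the network's feature map, P the point cloud of concatenated input-feature pairs, M the collection of manifolds they sample, and α_j the coordinate functions on M):

\min_{\theta,\,\mathcal{M}} \; J(\theta) + \frac{\lambda}{|P|} \sum_{p \in P} \dim\big(\mathcal{M}(p)\big)
\quad \text{s.t.} \quad \big\{ (x_i,\, f_\theta(x_i)) \big\}_{i=1}^{N} \subset \mathcal{M}

The dimension term becomes tractable via the standard identity for a smooth embedded manifold, \dim(\mathcal{M}(p)) = \sum_j \| \nabla_{\mathcal{M}}\, \alpha_j(p) \|^2, so the regularizer is a sum of Dirichlet energies of the coordinate functions; its Euler-Lagrange equation is the Laplace-Beltrami equation over the point cloud mentioned above.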

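In training, this formulation suggests alternating between a geometry step (updating the manifold/features by solving the Laplace-Beltrami equation with the point integral method) and a network step (updating θ by SGD). Below is a minimal, hypothetical PyTorch sketch of the network step only, assuming a variable-splitting penalty (μ/2)·||f_θ(x) − ξ||² against target features ξ produced by the geometry step; SmallNet, ldmnet_step, and the penalty form are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn as nn

class SmallNet(nn.Module):
    """Hypothetical two-headed model: `body` produces the features that
    LDMNet would regularize, `head` produces class logits."""
    def __init__(self, d_in=784, d_feat=64, n_classes=10):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(d_in, 128), nn.ReLU(),
            nn.Linear(128, d_feat), nn.ReLU(),
        )
        self.head = nn.Linear(d_feat, n_classes)

    def forward(self, x):
        feats = self.body(x)
        return self.head(feats), feats

def ldmnet_step(net, opt, x, y, xi, mu=1.0):
    """One SGD step on the split objective
    J(theta) + (mu/2) * ||f_theta(x) - xi||^2,
    where xi are fixed target features assumed to come from the
    geometry step (the point integral solve, not shown here)."""
    opt.zero_grad()
    logits, feats = net(x)
    loss = nn.functional.cross_entropy(logits, y) \
        + 0.5 * mu * (feats - xi).pow(2).sum(dim=1).mean()
    loss.backward()
    opt.step()
    return loss.item()

# Hypothetical usage with placeholder targets:
# net = SmallNet()
# opt = torch.optim.SGD(net.parameters(), lr=0.1)
# x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
# xi = torch.zeros(32, 64)  # would come from the Laplace-Beltrami solve
# print(ldmnet_step(net, opt, x, y, xi))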
Related research

12/21/2015 · GraphConnect: A Regularization Framework for Neural Networks
Deep neural networks have proved very successful in domains where large ...

12/01/2022 · The Effect of Data Dimensionality on Neural Network Prunability
Practitioners prune neural networks for efficiency gains and generalizat...

06/26/2023 · Effective Minkowski Dimension of Deep Nonparametric Regression: Function Approximation and Statistical Theories
Existing theories on deep nonparametric regression have shown that when ...

05/30/2016 · Stochastic Function Norm Regularization of Deep Networks
Deep neural networks have had an enormous impact on image analysis. Stat...

12/17/2021 · A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes
In a previous work, we proposed a geometric framework to study a deep ne...

05/18/2018 · Stop memorizing: A data-dependent regularization framework for intrinsic pattern learning
Deep neural networks (DNNs) typically have enough capacity to fit random...

05/02/2023 · The Training Process of Many Deep Networks Explores the Same Low-Dimensional Manifold
We develop information-geometric techniques to analyze the trajectories ...
