Topologically Densified Distributions

02/12/2020
by Christoph D. Hofer, et al.
We study regularization in the context of small sample-size learning with over-parameterized neural networks. Specifically, we shift the focus from architectural properties, such as norms on the network weights, to properties of the internal representations before a linear classifier: we impose a topological constraint on samples drawn from the probability measure induced in that space. This provably leads to mass-concentration effects around the representations of training instances, a property beneficial for generalization. By leveraging previous work on imposing topological constraints in a neural network setting, we provide empirical evidence across various vision benchmarks to support our claim of better generalization.
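To make the idea of a topological constraint on internal representations concrete, the sketch below shows one way such a regularizer is commonly realized: the 0-dimensional persistence (death times) of a per-class mini-batch of latents is computed from the minimum spanning tree of their pairwise distances, and deviations from a target connectivity scale are penalized. This is a minimal illustration, not the authors' reference implementation; the function name `topological_regularizer`, the target scale `beta`, and the weighting `lam` in the usage note are assumptions introduced here for illustration.

```python
# Minimal sketch (not the authors' reference code) of a 0-dimensional
# persistent-homology regularizer on a mini-batch of latent representations.
# Assumption: latents z of shape (n, d) from one class; beta is a
# hypothetical target connectivity scale. MST edge lengths equal the death
# times of 0-dim persistence of the Vietoris-Rips filtration of the batch.

import torch
from scipy.sparse.csgraph import minimum_spanning_tree


def topological_regularizer(z: torch.Tensor, beta: float) -> torch.Tensor:
    """Penalize deviation of 0-dim persistence death times from beta."""
    # Differentiable pairwise Euclidean distances.
    dists = torch.cdist(z, z)                      # (n, n)

    # MST edge selection is done without gradients ...
    mst = minimum_spanning_tree(dists.detach().cpu().numpy()).tocoo()
    rows = torch.as_tensor(mst.row, dtype=torch.long, device=z.device)
    cols = torch.as_tensor(mst.col, dtype=torch.long, device=z.device)

    # ... but the selected edge lengths are gathered from the differentiable
    # distance matrix, so gradients flow back into the encoder.
    death_times = dists[rows, cols]                # (n-1,) death times

    # Encourage all connected components to die near scale beta,
    # i.e., concentrate representation mass around training instances.
    return (death_times - beta).abs().mean()


# Usage sketch: add the penalty per class to the task loss, e.g.
#   loss = cross_entropy(logits, y) + lam * sum(
#       topological_regularizer(z[y == c], beta) for c in y.unique())
```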


