Learning with Hyperspherical Uniformity

03/02/2021
by Weiyang Liu, et al.

Owing to their over-parameterized nature, neural networks are a powerful tool for nonlinear function approximation. To achieve good generalization on unseen data, a suitable inductive bias is of great importance for neural networks, and one of the most straightforward ways to impose it is to regularize the network with additional objectives. L2 regularization serves as a standard regularization for neural networks. Despite its popularity, it essentially regularizes only one dimension of each individual neuron, which is not strong enough to control the capacity of highly over-parameterized neural networks. Motivated by this, hyperspherical uniformity is proposed as a novel family of relational regularizations that impact the interaction among neurons. We consider several geometrically distinct ways to achieve hyperspherical uniformity. The effectiveness of hyperspherical uniformity is justified by theoretical insights and empirical evaluations.
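The abstract does not spell out a concrete objective, but as an illustration of what a relational, direction-only regularizer can look like, the following is a minimal PyTorch sketch of a Riesz-energy term over a layer's neuron directions, in the spirit of the minimum hyperspherical energy line of work listed below. The function name hyperspherical_energy, the exponent s, and the weighting coefficient are illustrative assumptions, not the paper's exact formulation.

import torch

def hyperspherical_energy(weight: torch.Tensor, s: float = 1.0, eps: float = 1e-6) -> torch.Tensor:
    # weight: (num_neurons, dim); each row is one neuron's weight vector.
    # Project the neurons onto the unit hypersphere so that only their
    # directions (not their norms) interact with each other.
    w = weight / (weight.norm(dim=1, keepdim=True) + eps)
    # Pairwise Euclidean distances between the projected neurons.
    dist = torch.cdist(w, w, p=2)
    n = w.shape[0]
    off_diag = ~torch.eye(n, dtype=torch.bool, device=w.device)
    # Riesz s-energy: a mutually repulsive potential that shrinks as the
    # neuron directions spread out (approximately) uniformly on the sphere.
    energy = (1.0 / (dist[off_diag] + eps) ** s).sum() / (n * (n - 1))
    return energy

# Illustrative usage: add the energy of a layer's weight matrix to the task loss.
# layer = torch.nn.Linear(128, 64)
# loss = task_loss + 1e-1 * hyperspherical_energy(layer.weight)

Unlike L2 regularization, which penalizes each neuron's norm independently, such a term couples all neurons of a layer through their pairwise directional distances, which is what makes it a relational regularizer.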


04/09/2020 · Orthogonal Over-Parameterized Training
The inductive bias of a neural network is largely determined by the arch...

10/02/2020 · The Efficacy of L_1 Regularization in Two-Layer Neural Networks
A crucial problem in neural networks is to select the most appropriate n...

11/30/2021 · Neuron with Steady Response Leads to Better Generalization
Regularization can mitigate the generalization gap between training and ...

05/23/2018 · Learning towards Minimum Hyperspherical Energy
Neural networks are a powerful class of nonlinear functions that can be ...

03/13/2021 · Conceptual capacity and effective complexity of neural networks
We propose a complexity measure of a neural network mapping function bas...

06/11/2020 · Deep Learning Requires Explicit Regularization for Reliable Predictive Probability
From the statistical learning perspective, complexity control via explic...

05/29/2019 · Learning the Non-linearity in Convolutional Neural Networks
We propose the introduction of nonlinear operation into the feature gene...