Deep Learning Requires Explicit Regularization for Reliable Predictive Probability

06/11/2020
by Taejong Joo et al.

From the statistical learning perspective, complexity control via explicit regularization is necessary to improve the generalization of over-parameterized models: it deters the memorization of intricate patterns that exist only in the training data. However, the impressive generalization of over-parameterized neural networks trained with only implicit regularization challenges this traditional role of explicit regularization. Moreover, explicit regularization does not prevent neural networks from memorizing unnatural patterns, such as random labels, that cannot generalize. In this work, we revisit the role and importance of explicit regularization methods for generalizing the predictive probability, not merely the 0-1 loss. Specifically, we present extensive empirical evidence that explicit regularization techniques consistently improve the reliability of the predictive probability, enabling better uncertainty representation and mitigating the overconfidence problem. Our findings suggest a new direction for improving the quality of predictive probability in deterministic neural networks, in contrast to the mainstream approaches that concentrate on building stochastic representations with Bayesian neural networks, ensemble methods, and hybrid models.
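To make the kind of comparison the abstract describes concrete, below is a minimal sketch, not the authors' exact protocol, that trains the same classifier with and without two common explicit regularizers (weight decay and label smoothing) and scores each model's predictive probability with the expected calibration error (ECE) in addition to accuracy. The synthetic data, architecture, and hyperparameters are illustrative assumptions; it assumes PyTorch 1.10+ for the label_smoothing argument.

```python
# Illustrative sketch: does explicit regularization improve the reliability
# of the predictive probability, not just the 0-1 loss? Compare held-out
# accuracy together with expected calibration error (ECE).
import torch
import torch.nn as nn

def expected_calibration_error(probs, labels, n_bins=15):
    # Bin predictions by confidence; ECE is the bin-weighted average
    # gap between accuracy and mean confidence within each bin.
    conf, pred = probs.max(dim=1)
    correct = pred.eq(labels).float()
    edges = torch.linspace(0, 1, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            gap = (correct[in_bin].mean() - conf[in_bin].mean()).abs()
            ece += in_bin.float().mean().item() * gap.item()
    return ece

def train_and_eval(weight_decay=0.0, label_smoothing=0.0, epochs=300):
    torch.manual_seed(0)
    # Synthetic 3-class data; illustrative stand-in for a real benchmark.
    X = torch.randn(2000, 20)
    y = (X @ torch.randn(20, 3)).argmax(dim=1)
    X_tr, y_tr, X_te, y_te = X[:1000], y[:1000], X[1000:], y[1000:]

    # Over-parameterized relative to the task, trained to fit the data.
    model = nn.Sequential(nn.Linear(20, 256), nn.ReLU(), nn.Linear(256, 3))
    opt = torch.optim.SGD(model.parameters(), lr=0.1,
                          weight_decay=weight_decay)
    # label_smoothing requires PyTorch >= 1.10.
    loss_fn = nn.CrossEntropyLoss(label_smoothing=label_smoothing)

    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X_tr), y_tr).backward()
        opt.step()

    with torch.no_grad():
        probs = model(X_te).softmax(dim=1)
    acc = probs.argmax(dim=1).eq(y_te).float().mean().item()
    return acc, expected_calibration_error(probs, y_te)

for name, (wd, ls) in {"no regularization": (0.0, 0.0),
                       "weight decay + label smoothing": (5e-4, 0.1)}.items():
    acc, ece = train_and_eval(weight_decay=wd, label_smoothing=ls)
    print(f"{name}: accuracy={acc:.3f}, ECE={ece:.3f}")
```

The evaluation recipe is the point of the sketch: two models with similar accuracy can differ sharply in ECE, which is exactly the gap between generalizing the 0-1 loss and generalizing the predictive probability that the paper studies.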


Related research

03/05/2019
Implicit Regularization in Over-parameterized Neural Networks
Over-parameterized neural networks generalize well in practice without a...

11/01/2018
Implicit Regularization of Stochastic Gradient Descent in Natural Language Processing: Observations and Implications
Deep neural networks with remarkably strong generalization performances ...

10/22/2021
The Equilibrium Hypothesis: Rethinking implicit regularization in Deep Neural Networks
Modern Deep Neural Networks (DNNs) exhibit impressive generalization pro...

10/02/2020
The Efficacy of L_1 Regularization in Two-Layer Neural Networks
A crucial problem in neural networks is to select the most appropriate n...

02/12/2020
Topologically Densified Distributions
We study regularization in the context of small sample-size learning wit...

06/06/2023
Bayesian post-hoc regularization of random forests
Random Forests are powerful ensemble learning algorithms widely used in ...

03/02/2021
Learning with Hyperspherical Uniformity
Due to the over-parameterization nature, neural networks are a powerful ...
