Understanding Deep Learning via Decision Boundary

06/03/2022
by Shiye Lei, et al.

This paper finds that neural networks with lower decision boundary (DB) variability have better generalizability. Two new notions, algorithm DB variability and (ϵ, η)-data DB variability, are proposed to measure decision boundary variability from the algorithm and data perspectives, respectively. Extensive experiments show significant negative correlations between decision boundary variability and generalizability. From the theoretical view, two lower bounds based on algorithm DB variability are proposed that do not explicitly depend on the sample size. We also prove an upper bound of order 𝒪(1/√m + ϵ + η log(1/η)) based on data DB variability. This bound is convenient to estimate because it requires no labels, and it does not explicitly depend on the network size, which is usually prohibitively large in deep learning.
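To make the algorithm-side notion concrete, the sketch below estimates DB variability as the average pairwise prediction disagreement among classifiers trained independently from different random seeds on unlabeled inputs. This is a minimal illustrative proxy, not the paper's exact estimator: the `train_model` setup, the one-epoch training loop, and the disagreement-based measure are all assumptions made for illustration. It does, however, reflect the abstract's point that the data-side quantity can be estimated without labels.

```python
# Minimal sketch: decision-boundary (DB) variability approximated as the
# average pairwise prediction disagreement across independently trained
# models. Illustrative only; not the paper's exact estimator.
import itertools

import torch
import torch.nn as nn


def train_model(seed: int, loader) -> nn.Module:
    """Train one small classifier from a given random seed (assumed setup)."""
    torch.manual_seed(seed)
    model = nn.Sequential(
        nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10)
    )
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()
    for x, y in loader:  # single pass for brevity
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    return model


@torch.no_grad()
def estimate_db_variability(models, unlabeled_x: torch.Tensor) -> float:
    """Average pairwise label disagreement on unlabeled data (no labels needed)."""
    preds = [m(unlabeled_x).argmax(dim=1) for m in models]
    pairs = list(itertools.combinations(range(len(preds)), 2))
    disagree = sum(
        (preds[i] != preds[j]).float().mean().item() for i, j in pairs
    )
    return disagree / len(pairs)


# Usage (assuming `loader` yields (x, y) batches and `unlabeled_x` is a tensor):
# models = [train_model(seed, loader) for seed in range(5)]
# print(estimate_db_variability(models, unlabeled_x))
```

Under this proxy, a lower score means the learned decision boundaries agree more across runs, which the paper's experiments associate with better generalization.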

