
What does a deep neural network confidently perceive? The effective dimension of high certainty class manifolds and their low confidence boundaries

10/11/2022 · by Stanislav Fort, et al.

Deep neural network classifiers partition input space into high-confidence regions for each class. The geometry of these class manifolds (CMs) is widely studied and intimately related to model performance; for example, the margin depends on CM boundaries. We exploit the notions of Gaussian width and Gordon's escape theorem to tractably estimate the effective dimension of CMs and their boundaries through tomographic intersections with random affine subspaces of varying dimension. We show several connections between the dimension of CMs, generalization, and robustness. In particular, we investigate how CM dimension depends on 1) the dataset, 2) architecture (including ResNet, WideResNet & Vision Transformer), 3) initialization, 4) stage of training, 5) class, 6) network width, 7) ensemble size, 8) label randomization, 9) training set size, and 10) robustness to data corruption. Together, a picture emerges that higher-performing and more robust models have higher-dimensional CMs. Moreover, we offer a new perspective on ensembling via intersections of CMs. Our code is at https://github.com/stanislavfort/slice-dice-optimize/
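To make the subspace-intersection probe concrete, below is a minimal PyTorch sketch of the underlying idea: sample points in a random k-dimensional affine subspace through an input and check whether any of them lands in a high-confidence region of a target class. By Gordon's escape theorem, the smallest k at which such intersections become typical is informative about the effective dimension of the class manifold. The function subspace_hits_class_manifold, the stand-in linear model, and all parameter values are illustrative assumptions rather than the authors' procedure; the actual implementation is in the linked repository.

# Hypothetical sketch (not the paper's implementation): probe whether a random
# k-dimensional affine subspace through an anchor point intersects the
# high-confidence region of a given class.
import torch
import torch.nn.functional as F

def subspace_hits_class_manifold(model, x, target_class, k,
                                 n_samples=512, radius=5.0, conf_thresh=0.9):
    """Sample points in a random k-dim affine subspace centred at x and report
    whether any is assigned target_class with confidence >= conf_thresh."""
    d = x.numel()
    # Orthonormal basis for a random k-dimensional subspace of input space.
    basis, _ = torch.linalg.qr(torch.randn(d, k))
    # Random directions within the subspace, scaled to the probe radius.
    coeffs = torch.randn(n_samples, k)
    coeffs = radius * coeffs / coeffs.norm(dim=1, keepdim=True)
    probes = x.flatten() + coeffs @ basis.T              # shape (n_samples, d)
    with torch.no_grad():
        logits = model(probes.view(n_samples, *x.shape))
        probs = F.softmax(logits, dim=1)
    return bool((probs[:, target_class] >= conf_thresh).any())

# Toy usage with a stand-in linear classifier (replace with a trained network):
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x = torch.randn(3, 32, 32)
for k in (1, 2, 4, 8, 16, 32):
    hit = subspace_hits_class_manifold(model, x, target_class=3, k=k)
    print(f"k={k:2d}  intersects high-confidence region: {hit}")

In this sketch, sweeping k and recording the hit rate over many random subspaces and anchor points would give the threshold dimension that the paper relates to the Gaussian width of the class manifold.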
