Self-Supervised Gaussian Regularization of Deep Classifiers for Mahalanobis-Distance-Based Uncertainty Estimation

05/23/2023
by Aishwarya Venkataramanan, et al.

Recent works show that the data distribution in a network's latent space is useful for estimating classification uncertainty and detecting out-of-distribution (OOD) samples. To obtain a well-regularized latent space that is conducive to uncertainty estimation, existing methods require significant changes to model architectures and training procedures. In this paper, we present a lightweight, fast, and high-performing regularization method for Mahalanobis-distance-based uncertainty prediction that requires minimal changes to the network's architecture. To derive Gaussian latent representations favourable for Mahalanobis distance calculation, we introduce a self-supervised representation learning method that separates in-class representations into multiple Gaussians. Classes with non-Gaussian representations are automatically identified and dynamically clustered into multiple new classes that are approximately Gaussian. Evaluation on standard OOD benchmarks shows that our method achieves state-of-the-art results on OOD detection with minimal inference time and is highly competitive on predictive probability calibration. Finally, we demonstrate the applicability of our method to a real-world computer vision use case: microorganism classification.
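For context, the sketch below illustrates the standard class-conditional Mahalanobis scoring that the abstract builds on (as popularized by Lee et al., 2018): fit one Gaussian per class to latent features with a tied covariance, then score a test sample by its minimum Mahalanobis distance to the class means. This is a minimal, generic illustration, not the paper's full method (it omits the self-supervised regularization and the dynamic splitting of non-Gaussian classes); the function names and interfaces are hypothetical.

```python
import numpy as np

def fit_class_gaussians(features, labels):
    """Fit a per-class Gaussian with a tied (shared) covariance.

    features: (N, D) array of penultimate-layer activations.
    labels:   (N,) integer class labels.
    Returns per-class means and the shared precision matrix.
    """
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    # Pool the class-centered features to estimate one shared covariance.
    centered = np.concatenate(
        [features[labels == c] - means[c] for c in classes], axis=0
    )
    cov = centered.T @ centered / len(features)
    precision = np.linalg.pinv(cov)
    return means, precision

def mahalanobis_score(x, means, precision):
    """Negative minimum Mahalanobis distance to any class mean.

    Higher scores indicate the sample looks in-distribution;
    lower scores flag likely OOD samples.
    """
    dists = [float((x - mu) @ precision @ (x - mu)) for mu in means.values()]
    return -min(dists)
```

In practice, `features` would come from the regularized latent space described above, and the OOD decision is made by thresholding the score on a held-out validation set.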


