Loss Function Entropy Regularization for Diverse Decision Boundaries

04/30/2022
by Chong Sue Sin, et al.

Is it possible to train several classifiers to perform meaningful crowd-sourcing and produce a better prediction label set without any ground-truth annotation? In this paper, we modify the contrastive learning objectives to automatically train a self-complementing ensemble that produces state-of-the-art predictions on the CIFAR10 and CIFAR100-20 tasks. We present a remarkably simple method for modifying a single unsupervised classification pipeline so that it automatically generates an ensemble of neural networks with varied decision boundaries, which together learn a larger set of class features. Loss Function Entropy Regularization (LFER) consists of regularization terms added to the pre-training and contrastive learning objective functions; it gives us a knob for adjusting the entropy of the output space of unsupervised learning, thereby diversifying the latent decision boundaries of the resulting networks. Ensembles trained with LFER achieve higher prediction accuracy on samples near decision boundaries. LFER is an effective mechanism for perturbing decision boundaries and has been shown to produce classifiers that beat the state of the art at the contrastive learning stage. Experiments show that LFER yields an ensemble in which each member has accuracy comparable to the state of the art while their latent decision boundaries remain varied. This, in essence, allows meaningful verification of samples near decision boundaries and encourages their correct classification. By compounding the probability of a correct prediction for a single sample across the trained ensemble, our method improves upon a single classifier by denoising and affirming correct feature mappings.
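The abstract does not give the exact form of the LFER terms, but the idea can be illustrated with a minimal sketch: an entropy term computed on the softmax output of the classification head is added to a standard contrastive or clustering loss, and its weight (and possibly sign) is varied across ensemble members so that their decision boundaries diverge. The names `output_entropy`, `lfer_loss`, and `lambda_ent` below are illustrative assumptions, not the paper's notation.

```python
import torch
import torch.nn.functional as F

def output_entropy(logits):
    """Mean Shannon entropy of the softmax output distribution."""
    probs = F.softmax(logits, dim=1)
    log_probs = F.log_softmax(logits, dim=1)
    return -(probs * log_probs).sum(dim=1).mean()

def lfer_loss(base_loss, logits, lambda_ent):
    """Hypothetical LFER-style objective: a base contrastive/clustering loss
    plus a weighted entropy term on the output space. Varying lambda_ent
    across ensemble members perturbs their latent decision boundaries."""
    return base_loss + lambda_ent * output_entropy(logits)

# Illustrative usage (assumed workflow, not the paper's exact recipe):
# each ensemble member k is trained with a different entropy weight, so the
# members converge to complementary decision boundaries.
# lambdas = [-0.5, 0.0, 0.5]
# loss_k = lfer_loss(contrastive_loss(z_i, z_j), logits_k, lambdas[k])
```

At inference time, the compounded prediction described in the abstract would then amount to combining the per-member output probabilities (e.g., averaging or voting) so that agreement among members affirms correct feature mappings for near-boundary samples.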


