Diversified Ensemble of Independent Sub-Networks for Robust Self-Supervised Representation Learning

08/28/2023
by Amirhossein Vahidi, et al.

Ensembling neural networks is a widely recognized approach to enhancing model performance, estimating uncertainty, and improving robustness in deep supervised learning. However, deep ensembles often come with high computational costs and memory demands. In addition, the effectiveness of a deep ensemble depends on diversity among its members, which is difficult to achieve for large, over-parameterized deep neural networks. Moreover, ensemble learning has not yet seen widespread adoption in self-supervised or unsupervised representation learning, where it remains a challenging endeavor. Motivated by these challenges, we present a novel self-supervised training regime that leverages an ensemble of independent sub-networks, complemented by a new loss function designed to encourage diversity. Our method efficiently builds a sub-model ensemble with high diversity, leading to well-calibrated estimates of model uncertainty, all achieved with minimal computational overhead compared to traditional deep self-supervised ensembles. To evaluate the effectiveness of our approach, we conducted extensive experiments across a variety of tasks, including in-distribution generalization, out-of-distribution detection, dataset corruption, and semi-supervised settings. The results demonstrate that our method significantly improves prediction reliability. Our approach not only achieves excellent accuracy but also enhances calibration, surpassing baseline performance across a wide range of self-supervised architectures on computer vision, natural language processing, and genomics data.
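The abstract describes combining a per-sub-network self-supervised loss with a term that encourages diversity among ensemble members. The paper's actual loss is not given here; the sketch below is a hypothetical illustration of the general idea, using mean pairwise cosine similarity between sub-network embeddings as the diversity penalty (function names `diversity_penalty` and `ensemble_loss` and the weight `lam` are assumptions, not the authors' notation).

```python
import numpy as np

def diversity_penalty(embeddings):
    """Hypothetical diversity term: mean pairwise cosine similarity
    between L2-normalized sub-network embeddings (lower = more diverse).
    `embeddings` is a list of (batch, dim) arrays, one per sub-network."""
    normed = [e / np.linalg.norm(e, axis=1, keepdims=True) for e in embeddings]
    m = len(normed)
    sims = []
    for i in range(m):
        for j in range(i + 1, m):
            # mean cosine similarity between members i and j over the batch
            sims.append(np.mean(np.sum(normed[i] * normed[j], axis=1)))
    return float(np.mean(sims))

def ensemble_loss(ssl_losses, embeddings, lam=0.1):
    """Average the sub-networks' self-supervised losses and add a
    weighted diversity penalty; minimizing it pushes members apart."""
    return float(np.mean(ssl_losses)) + lam * diversity_penalty(embeddings)
```

Minimizing this objective trades off each member's own self-supervised loss against representational redundancy across members; the weight `lam` would control how strongly diversity is enforced.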

