Evaluating the fairness of fine-tuning strategies in self-supervised learning

10/01/2021
by Jason Ramapuram, et al.

In this work we examine how fine-tuning impacts the fairness of contrastive Self-Supervised Learning (SSL) models. Our findings indicate that Batch Normalization (BN) statistics play a crucial role, and that updating only the BN statistics of a pre-trained SSL backbone improves its downstream fairness (36% relative to supervised learning), while taking 4.4x less time to train and requiring only 0.35% of the parameters to be updated. Relative to supervised learning, we also find that updating BN statistics and training residual skip connections (12.3% of the parameters) performs on par with a fully fine-tuned model, while taking 1.33x less time to train.
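The cheapest variant described above updates only the BN running statistics of a frozen backbone, with no gradient computation at all. A minimal sketch of that idea is below, assuming a simplified NumPy BN layer; all class and method names here are illustrative and not from the paper's code.

```python
# Hypothetical sketch: adapt only the Batch Normalization (BN) running
# statistics of a frozen layer, leaving the learned affine weights untouched.
import numpy as np

class FrozenBNLayer:
    """BN layer whose scale/shift stay frozen; only the running mean and
    variance are updated from downstream data via a momentum EMA."""

    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        self.gamma = np.ones(num_features)    # frozen scale
        self.beta = np.zeros(num_features)    # frozen shift
        self.running_mean = np.zeros(num_features)
        self.running_var = np.ones(num_features)
        self.momentum = momentum
        self.eps = eps

    def adapt(self, x):
        # x: (batch, num_features). Update only the BN statistics;
        # no gradients are needed, so this is cheap.
        batch_mean = x.mean(axis=0)
        batch_var = x.var(axis=0)
        m = self.momentum
        self.running_mean = (1 - m) * self.running_mean + m * batch_mean
        self.running_var = (1 - m) * self.running_var + m * batch_var

    def forward(self, x):
        # Inference normalizes with the adapted running statistics;
        # gamma and beta are unchanged from pre-training.
        x_hat = (x - self.running_mean) / np.sqrt(self.running_var + self.eps)
        return self.gamma * x_hat + self.beta
```

A few gradient-free forward passes over downstream data are enough to shift the statistics toward the new distribution, which is consistent with the large training-time savings reported in the abstract.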


