Student Collaboration Improves Self-Supervised Learning: Dual-Loss Adaptive Masked Autoencoder for Brain Cell Image Analysis

05/10/2022
by Son T. Ly, et al.

Self-supervised learning leverages the underlying data structure as the source of the supervisory signal without the need for human annotation effort. This approach offers a practical solution to learning from large amounts of biomedical data with limited annotation. Unlike other studies that exploit data via multiple views (e.g., augmented images), this study presents a self-supervised Dual-Loss Adaptive Masked Autoencoder (DAMA) algorithm formulated from the viewpoint of information theory. Specifically, our objective function maximizes mutual information by minimizing the conditional entropy in pixel-level reconstruction and feature-level regression. We further introduce an adaptive mask sampling strategy to maximize mutual information. We conduct extensive experiments on brain cell images to validate the proposed method. DAMA significantly outperforms both state-of-the-art self-supervised and supervised methods on brain cell data and demonstrates competitive results on ImageNet-1k. Code: https://github.com/hula-ai/DAMA
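To make the dual-loss objective concrete, the sketch below shows one plausible way to combine a pixel-level reconstruction loss with a feature-level regression loss on masked patches in PyTorch. The module names (DualLossMAE, pixel_decoder, feat_decoder), the small MLP encoder, the fixed random mask, and the weighting factor lambda_feat are illustrative assumptions rather than the authors' DAMA implementation; in particular, the adaptive mask sampling strategy described in the abstract is omitted here, and the official code is available at the repository linked above.

```python
import torch
import torch.nn as nn


class DualLossMAE(nn.Module):
    """Minimal sketch of a dual-loss masked autoencoder objective:
    pixel-level reconstruction plus feature-level regression on masked
    patches. Architecture and names are illustrative only."""

    def __init__(self, dim=256, patch_dim=768, lambda_feat=0.5):
        super().__init__()
        # tiny stand-in encoder; DAMA uses a ViT-style backbone
        self.encoder = nn.Sequential(
            nn.Linear(patch_dim, dim), nn.GELU(), nn.Linear(dim, dim)
        )
        self.pixel_decoder = nn.Linear(dim, patch_dim)  # reconstruct raw patch pixels
        self.feat_decoder = nn.Linear(dim, dim)         # regress target features
        self.lambda_feat = lambda_feat

    def forward(self, patches, mask, target_feats):
        # patches:      (B, N, patch_dim) flattened image patches
        # mask:         (B, N) bool, True where a patch is masked out
        # target_feats: (B, N, dim) features from e.g. a momentum/target encoder
        visible = patches * (~mask).unsqueeze(-1)       # zero out masked patches
        latent = self.encoder(visible)
        recon = self.pixel_decoder(latent)
        pred_feats = self.feat_decoder(latent)

        m = mask.float()
        denom = m.sum().clamp(min=1.0)
        # pixel-level reconstruction loss, averaged over masked positions only
        loss_pixel = (((recon - patches) ** 2).mean(dim=-1) * m).sum() / denom
        # feature-level regression loss, averaged over masked positions only
        loss_feat = (((pred_feats - target_feats) ** 2).mean(dim=-1) * m).sum() / denom
        return loss_pixel + self.lambda_feat * loss_feat


# toy usage with random data and a fixed 60% random mask
B, N, patch_dim, dim = 2, 196, 768, 256
model = DualLossMAE(dim=dim, patch_dim=patch_dim)
patches = torch.randn(B, N, patch_dim)
mask = torch.rand(B, N) < 0.6
target_feats = torch.randn(B, N, dim)
loss = model(patches, mask, target_feats)
```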

