Morphological Classification of Radio Galaxies using Semi-Supervised Group Equivariant CNNs

05/31/2023
by   Mir Sazzat Hossain, et al.

Of the estimated few trillion galaxies, only around a million have been detected at radio frequencies, and only a tiny fraction, approximately a thousand, have been manually classified. We address this disparity between labeled and unlabeled images of radio galaxies by employing a semi-supervised learning approach to classify them into the known Fanaroff-Riley Type I (FRI) and Type II (FRII) categories. A Group Equivariant Convolutional Neural Network (G-CNN) was used as the encoder for two state-of-the-art self-supervised methods: SimCLR (A Simple Framework for Contrastive Learning of Visual Representations) and BYOL (Bootstrap Your Own Latent). The G-CNN preserves equivariance under the Euclidean group E(2), enabling it to effectively learn representations of globally oriented feature maps. After representation learning, we trained a fully-connected classifier and fine-tuned the pre-trained encoder with labeled data. Our findings demonstrate that our semi-supervised approach outperforms existing state-of-the-art methods across several metrics, including cluster quality, convergence rate, accuracy, precision, recall, and the F1-score. Moreover, statistical significance testing via a t-test revealed that our method surpasses the performance of a fully supervised G-CNN. This study emphasizes the importance of semi-supervised learning in radio galaxy classification, where labeled data are still scarce, but the prospects for discovery are immense.
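The SimCLR objective named in the abstract is the NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss: two augmented views of the same galaxy image are pulled together in embedding space while all other images in the batch are pushed apart. The following NumPy sketch is illustrative only — the paper's actual encoder is an E(2)-equivariant G-CNN, and the function name, shapes, and temperature value here are assumptions, not the authors' implementation.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent loss, the contrastive objective used by SimCLR.

    z1, z2: (N, D) arrays of encoder outputs for two augmented views
    of the same N images (e.g. rotated/cropped radio galaxy cutouts).
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)               # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalise rows
    sim = z @ z.T / temperature                        # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    # The positive partner of sample i is its other view at (i + N) mod 2N.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

As expected of a contrastive loss, embeddings from two nearly identical views score lower (better) than embeddings paired with unrelated random vectors, since the positive pair dominates the softmax over the batch.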


