A Knowledge Distillation Framework for Multi-Organ Segmentation of Medaka Fish in Tomographic Images

02/24/2023
by Jwalin Bhatt, et al.

Morphological atlases are an important tool in organismal studies, and modern high-throughput Computed Tomography (CT) facilities can produce hundreds of full-body high-resolution volumetric images of organisms. However, creating an atlas from these volumes requires accurate organ segmentation. In the last decade, machine learning approaches have achieved remarkable results in image segmentation tasks, but they require large amounts of annotated data for training. In this paper, we propose a self-training framework for multi-organ segmentation in tomographic images of Medaka fish. We utilize pseudo-labeled data from a pretrained Teacher model and adopt a Quality Classifier to refine the pseudo-labels. We then introduce a pixel-wise knowledge distillation method that prevents overfitting to the pseudo-labeled data and improves segmentation performance. The experimental results demonstrate that our method improves mean Intersection over Union (IoU) by 5.9% on the dataset and maintains segmentation quality while using three times less annotated data.
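The pixel-wise knowledge distillation mentioned above typically penalizes the divergence between the teacher's and the student's per-pixel class distributions. The abstract does not give the exact loss, so the sketch below is only an illustration of the general technique, not the authors' implementation: a temperature-softened, per-pixel KL divergence between teacher and student logits (function names, the temperature `T`, and the `(H, W, C)` logit layout are all assumptions).

```python
import numpy as np

def softmax(logits, T=1.0, axis=-1):
    """Temperature-scaled softmax; higher T gives softer distributions."""
    z = logits / T
    z = z - z.max(axis=axis, keepdims=True)  # stabilize the exponentials
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def pixelwise_kd_loss(teacher_logits, student_logits, T=2.0):
    """Mean per-pixel KL(teacher || student) over softened class distributions.

    Both logit arrays have shape (H, W, C) with C organ classes.
    The T**2 factor keeps gradient magnitudes comparable across temperatures,
    as is common in distillation losses.
    """
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()

# Toy usage: two 4x4 "images" with 3 classes.
rng = np.random.default_rng(0)
t = rng.normal(size=(4, 4, 3))
s = rng.normal(size=(4, 4, 3))
print(pixelwise_kd_loss(t, s))  # positive when predictions differ
print(pixelwise_kd_loss(t, t))  # ~0 when student matches teacher
```

In a full training loop this term would be added to the supervised cross-entropy loss on the (refined) pseudo-labels, so the student is regularized toward the teacher's soft predictions rather than fitting pseudo-label noise exactly.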

