RepMode: Learning to Re-parameterize Diverse Experts for Subcellular Structure Prediction

12/20/2022
by Donghao Zhou, et al.

In subcellular biological research, fluorescence staining is a key technique for revealing the locations and morphology of subcellular structures. However, fluorescence staining is slow, expensive, and harmful to cells. In this paper, we model it as a deep learning task termed subcellular structure prediction (SSP), which aims to predict the 3D fluorescent images of multiple subcellular structures from a 3D transmitted-light image. Unfortunately, due to the limitations of current biotechnology, each image in SSP is only partially labeled. Moreover, subcellular structures naturally vary considerably in size, which introduces a multi-scale issue in SSP. Traditional solutions cannot address SSP well, since they organize network parameters inefficiently and inflexibly. To overcome these challenges, we propose Re-parameterizing Mixture-of-Diverse-Experts (RepMode), a network that dynamically organizes its parameters with task-aware priors to handle the specified single-label prediction tasks of SSP. In RepMode, the Mixture-of-Diverse-Experts (MoDE) block is designed to learn generalized parameters for all tasks, and gating re-parameterization (GatRep) is performed to generate specialized parameters for each task. In this way, RepMode maintains a compact practical topology exactly like a plain network while achieving a powerful theoretical topology. Comprehensive experiments show that RepMode outperforms existing methods on ten of twelve prediction tasks of SSP and achieves state-of-the-art overall performance.
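To make the gating re-parameterization idea concrete, below is a minimal PyTorch sketch under stated assumptions: task-aware gating weights merge diverse expert convolution kernels into a single equivalent kernel, so each task runs one plain convolution. All names and details here (MoDEBlock, num_tasks, kernel_sizes, the use of only two plain-conv experts) are hypothetical simplifications for illustration, not the paper's actual implementation.

```python
# Minimal sketch of gating re-parameterization (GatRep): per-task gating
# weights combine diverse expert kernels into ONE equivalent convolution
# kernel, so inference uses a single plain conv (compact practical topology).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoDEBlock(nn.Module):  # hypothetical, simplified block
    def __init__(self, in_ch, out_ch, num_tasks, kernel_sizes=(3, 1)):
        super().__init__()
        # Diverse experts: 3D conv kernels with different receptive fields.
        self.experts = nn.ParameterList([
            nn.Parameter(torch.randn(out_ch, in_ch, k, k, k) * 0.01)
            for k in kernel_sizes
        ])
        self.kernel_sizes = kernel_sizes
        # Task-aware gating: one weight per expert for each task.
        self.gate = nn.Embedding(num_tasks, len(kernel_sizes))

    def forward(self, x, task_id):
        # Gating weights for the requested single-label prediction task.
        g = F.softmax(self.gate(task_id), dim=-1)
        k_max = max(self.kernel_sizes)
        # GatRep: pad each expert kernel to the largest size, then take the
        # gated sum, yielding one task-specialized kernel.
        merged = 0.0
        for w, k, gi in zip(self.experts, self.kernel_sizes, g):
            pad = (k_max - k) // 2
            merged = merged + gi * F.pad(w, [pad] * 6)
        # A single plain convolution, exactly like a plain network.
        return F.conv3d(x, merged, padding=k_max // 2)

block = MoDEBlock(in_ch=1, out_ch=8, num_tasks=12)
y = block(torch.randn(2, 1, 16, 32, 32), torch.tensor(0))
print(y.shape)  # torch.Size([2, 8, 16, 32, 32])
```

Since the merged kernel depends only on the task index, it can be computed once per task and cached at inference, so the deployed network carries no extra branches or gating overhead.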


Related Research

05/25/2022 · Eliciting Transferability in Multi-task Learning with Task-level Mixture-of-Experts
Recent work suggests that transformer models are capable of multi-task l...

04/16/2022 · Sparsely Activated Mixture-of-Experts are Robust Multi-Task Learners
Traditional multi-task learning (MTL) methods use dense networks that us...

12/15/2022 · Mod-Squad: Designing Mixture of Experts As Modular Multi-Task Learners
Optimization in multi-task learning (MTL) is more challenging than singl...

05/25/2023 · Modeling Task Relationships in Multi-variate Soft Sensor with Balanced Mixture-of-Experts
Accurate estimation of multiple quality variables is critical for buildi...

09/30/2020 · Restoring Spatially-Heterogeneous Distortions using Mixture of Experts Network
In recent years, deep learning-based methods have been successfully appl...

03/24/2023 · MoWE: Mixture of Weather Experts for Multiple Adverse Weather Removal
Currently, most adverse weather removal tasks are handled independently,...

01/18/2020 · Efficient Neural Architecture Search: A Broad Version
Efficient Neural Architecture Search (ENAS) achieves novel efficiency fo...
