Multi-Stage Prediction Networks for Data Harmonization

07/26/2019
by Stefano B. Blumberg, et al.

In this paper, we introduce multi-task learning (MTL) to data harmonization (DH), where the aim is to harmonize images across different acquisition platforms and sites. This allows us to integrate information from multiple acquisitions and to improve the predictive performance and learning efficiency of the harmonization model. Specifically, we introduce the Multi-Stage Prediction (MSP) Network, an MTL framework that incorporates neural networks of potentially disparate architectures, trained for different individual acquisition platforms, into a larger architecture that is refined in unison. The MSP uses the high-level features of the single-task networks as inputs to additional neural networks that inform the final prediction, thereby exploiting redundancy across tasks to make the most of limited training data. We validate our method on a diffusion MRI (dMRI) harmonization challenge dataset, predicting images from three modern acquisition platforms given images acquired on an older scanner. We show that MTL architectures such as the MSP yield around a 20% improvement in patch-based mean-squared error over current state-of-the-art methods, and that the MSP outperforms off-the-shelf MTL networks. Our code is available at https://github.com/sbb-gh/ .
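Below is a minimal sketch of the multi-stage idea described in the abstract, assuming PyTorch. The PlatformNet and MSPSketch classes, their layer sizes, and the single fusion layer are hypothetical illustrations rather than the authors' exact architecture: each per-platform network returns its prediction together with its high-level features, and a second-stage network consumes the concatenated features to inform the final prediction.

import torch
import torch.nn as nn

class PlatformNet(nn.Module):
    """Single-task network for one acquisition platform (hypothetical backbone)."""
    def __init__(self, in_dim: int, feat_dim: int, out_dim: int):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.head = nn.Linear(feat_dim, out_dim)

    def forward(self, x):
        feats = self.features(x)           # high-level features for this platform
        return self.head(feats), feats     # stage-1 prediction and its features

class MSPSketch(nn.Module):
    """Wraps per-platform networks; their high-level features feed an extra
    network that informs the final (stage-2) prediction."""
    def __init__(self, platform_nets, feat_dim: int, out_dim: int):
        super().__init__()
        self.platform_nets = nn.ModuleList(platform_nets)
        self.final = nn.Linear(feat_dim * len(platform_nets), out_dim)

    def forward(self, x):
        preds, feats = zip(*(net(x) for net in self.platform_nets))
        fused = torch.cat(feats, dim=-1)   # share information across tasks
        return list(preds), self.final(fused)

# Toy usage: patches from one source acquisition mapped toward three target platforms.
nets = [PlatformNet(in_dim=64, feat_dim=32, out_dim=64) for _ in range(3)]
msp = MSPSketch(nets, feat_dim=32, out_dim=64)
stage1_preds, refined = msp(torch.randn(8, 64))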
