Physics-based network fine-tuning for robust quantitative susceptibility mapping from high-pass filtered phase

05/05/2023
by Jinwei Zhang, et al.

Purpose: To improve the generalization of convolutional neural network (CNN) based prediction of quantitative susceptibility mapping (QSM) from high-pass filtered phase (HPFP) images.

Methods: The proposed network addresses two common generalization issues that arise when a pre-trained network is used to predict QSM from HPFP: a) data with unseen voxel sizes, and b) data with unknown high-pass filter parameters. A network fine-tuning step based on a forward model combining dipole convolution with high-pass filtering is proposed to reduce the generalization error of the pre-trained network. A progressive Unet architecture is proposed to improve prediction accuracy without increasing the fine-tuning computational cost.

Results: In retrospective studies using RMSE, PSNR, SSIM, and HFEN as quality metrics, the performance of both Unet and progressive Unet improved after physics-based fine-tuning at all voxel sizes and at most of the high-pass filter cutoff frequencies tested. Progressive Unet slightly outperformed Unet both before and after fine-tuning. In a prospective study, image sharpness improved after physics-based fine-tuning for both Unet and progressive Unet, and progressive Unet showed better agreement of regional susceptibility values with the reference QSM.

Conclusion: The proposed method is more robust than the pre-trained network without fine-tuning when the test data deviate from the training data. Our code is available at https://github.com/Jinwei1209/SWI_to_QSM/
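The abstract names the key ingredient: a forward model that maps a candidate susceptibility map back to HPFP via dipole convolution followed by high-pass filtering, so the pre-trained network can be fine-tuned against the measured test data without labels. The paper's exact formulation is in the linked repository; as a rough orientation, here is a minimal PyTorch sketch under stated assumptions: the function names (`dipole_kernel`, `forward_model`, `finetune`), the isotropic toy high-pass mask, and the plain MSE loss are illustrative choices, not the authors' implementation, and phase-to-field scaling is omitted.

```python
# Hypothetical sketch (not the paper's code) of a physics-based fine-tuning
# loss for HPFP-to-QSM, assuming PyTorch and normalized field units.
import torch

def dipole_kernel(shape, voxel_size, device="cpu"):
    """Unit dipole kernel D(k) = 1/3 - kz^2 / |k|^2 with B0 along z."""
    ks = [torch.fft.fftfreq(n, d=v, device=device)
          for n, v in zip(shape, voxel_size)]
    kx, ky, kz = torch.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    D = 1.0 / 3.0 - kz**2 / k2
    D[k2 == 0] = 0.0  # patch the 0/0 singularity at the k-space origin
    return D

def forward_model(chi, D, hpf):
    """Predict HPFP from susceptibility chi: dipole field, then high-pass."""
    dims = (-3, -2, -1)  # spatial dims; D and hpf broadcast over batch dims
    field = torch.fft.ifftn(D * torch.fft.fftn(chi, dim=dims), dim=dims).real
    return torch.fft.ifftn(hpf * torch.fft.fftn(field, dim=dims), dim=dims).real

def finetune(net, hpfp, D, hpf, steps=100, lr=1e-4):
    """Self-supervised fine-tuning on the test volume itself: push the
    network's QSM output to reproduce the measured HPFP through the
    forward model, so no ground-truth QSM is needed at test time."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        chi = net(hpfp)  # network's susceptibility estimate
        loss = torch.mean((forward_model(chi, D, hpf) - hpfp) ** 2)
        loss.backward()
        opt.step()
    return net

# Illustrative setup for a single 3D volume; `net` is the pre-trained CNN.
shape, vox = (64, 64, 64), (1.0, 1.0, 1.0)        # matrix size, voxel size (mm)
D = dipole_kernel(shape, vox)
ks = [torch.fft.fftfreq(n, d=v) for n, v in zip(shape, vox)]
kx, ky, kz = torch.meshgrid(*ks, indexing="ij")
hpf = ((kx**2 + ky**2 + kz**2).sqrt() > 0.05).float()  # toy high-pass mask
```

Note how the two generalization issues from the Methods enter the sketch: the test voxel size parameterizes the dipole kernel directly, and the unknown high-pass cutoff appears only in `hpf`, which can in principle be estimated or swept during fine-tuning.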


