Motion Artifact Reduction in Quantitative Susceptibility Mapping using Deep Neural Network
We propose a deep learning approach to reduce motion artifacts in Quantitative Susceptibility Mapping (QSM). An affine motion model with randomly generated motion profiles is used to simulate motion-corrupted QSM images. Each simulated image is paired with its motion-free reference to train a neural network in a supervised manner. The trained network is tested on unseen simulated motion-corrupted QSM images, as well as on data from healthy volunteers and Parkinson's disease patients. The results show that motion artifacts, such as ringing and ghosting, are successfully suppressed.
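The abstract describes generating training pairs by corrupting motion-free QSM images with random motion profiles. The sketch below illustrates one common way such corruption can be simulated, assuming a 2D Cartesian acquisition in which each k-space segment is acquired while the object sits in a different, randomly drawn pose (here a rigid rotation-plus-translation special case of the affine model). The function and parameter names (simulate_motion_corruption, n_segments, max_rot_deg, max_shift_px) are illustrative and not taken from the paper.

```python
# Minimal sketch: simulate motion-corrupted images by mixing k-space segments
# acquired at different randomly drawn poses (hypothetical implementation).
import numpy as np
from scipy import ndimage


def random_rigid(image, max_rot_deg=3.0, max_shift_px=2.0, rng=None):
    """Apply a small random rigid (rotation + translation) transform."""
    rng = np.random.default_rng() if rng is None else rng
    angle = rng.uniform(-max_rot_deg, max_rot_deg)
    shift = rng.uniform(-max_shift_px, max_shift_px, size=2)
    moved = ndimage.rotate(image, angle, reshape=False, order=1, mode="nearest")
    return ndimage.shift(moved, shift, order=1, mode="nearest")


def simulate_motion_corruption(image, n_segments=8, rng=None):
    """Build a motion-corrupted image: each block of phase-encode lines is
    taken from the k-space of the image in a different random pose."""
    rng = np.random.default_rng() if rng is None else rng
    ny = image.shape[0]
    bounds = np.linspace(0, ny, n_segments + 1).astype(int)
    corrupted_k = np.zeros(image.shape, dtype=complex)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        # k-space of the image as seen in this segment's pose
        k = np.fft.fftshift(np.fft.fft2(random_rigid(image, rng=rng)))
        corrupted_k[lo:hi, :] = k[lo:hi, :]  # keep only these phase-encode lines
    return np.abs(np.fft.ifft2(np.fft.ifftshift(corrupted_k)))


if __name__ == "__main__":
    # Toy motion-free image; real training would instead use reconstructed
    # motion-free QSM volumes as the supervised targets.
    y, x = np.mgrid[-64:64, -64:64]
    clean = np.exp(-(x**2 + y**2) / (2 * 30.0**2))
    corrupted = simulate_motion_corruption(clean, n_segments=8,
                                           rng=np.random.default_rng(0))
    # (corrupted, clean) forms one input/target pair for supervised training.
    print(corrupted.shape, float(np.abs(corrupted - clean).mean()))
```

Under this assumed acquisition model, the (corrupted, motion-free) pairs would feed a standard supervised image-to-image network; the paper itself does not specify these implementation details in the abstract.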