Block stochastic gradient descent for large-scale tomographic reconstruction in a parallel network

03/28/2019
by Yushan Gao, et al.

Iterative algorithms have many advantages for linear tomographic image reconstruction when compared to back-projection based methods. However, iterative methods tend to have significantly higher computational complexity. To overcome this, parallel processing schemes that can utilise several computing nodes are desirable. Popular choices here are row action methods, which update the entire image simultaneously, and column action methods, which require access to all measurements at each node. In large-scale tomographic reconstruction with limited storage capacity at each node, the data communication overhead between nodes becomes a significant performance-limiting factor. To reduce this overhead, we propose a row action method, BSGD. The method is based on stochastic gradient descent, but it does not update the entire image at each iteration, which reduces between-node communication. To further increase convergence speed, an importance sampling strategy is proposed. We compare BSGD to other existing stochastic methods and show its effectiveness and efficiency. Other properties of BSGD are also explored, including its ability to incorporate total variation (TV) regularization and automatic parameter tuning.
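The core idea of updating only one block of the image with one subset of measurements per iteration can be illustrated with a minimal sketch. Assuming the reconstruction is posed as the linear least-squares problem min_x ||Ax - b||^2, the sketch below partitions the rows of A (measurements) and the entries of x (image pixels) into blocks and takes a gradient step on a single randomly chosen image block per iteration. The function name block_sgd, the uniform block sampling, and all parameter values are illustrative assumptions, not the authors' exact BSGD algorithm.

```python
import numpy as np

def block_sgd(A, b, n_row_blocks=4, n_col_blocks=4,
              step=1e-3, n_iters=5000, rng=None):
    """Illustrative block SGD for min_x ||Ax - b||^2 (not the paper's exact method).

    Rows of A (measurements) and entries of x (image pixels) are split
    into blocks; each iteration touches only one row block and one
    column block, so a node holding that block never needs the full
    image or the full measurement set.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    x = np.zeros(n)
    row_blocks = np.array_split(np.arange(m), n_row_blocks)
    col_blocks = np.array_split(np.arange(n), n_col_blocks)
    for _ in range(n_iters):
        r = row_blocks[rng.choice(n_row_blocks)]  # random measurement subset
        c = col_blocks[rng.choice(n_col_blocks)]  # random image block
        # Residual over the selected measurements only; a distributed
        # version would cache partial products instead of using full x.
        resid = A[r] @ x - b[r]
        # Gradient step restricted to the chosen image block
        # (factor of 2 absorbed into the step size).
        x[c] -= step * (A[np.ix_(r, c)].T @ resid)
    return x

# Toy usage: recover a random image from an overdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((400, 100))
x_true = rng.standard_normal(100)
x_hat = block_sgd(A, A @ x_true)
```

In a distributed implementation, each node could store only its submatrix of A and the corresponding image block, exchanging partial residual vectors rather than the full image; this is the kind of communication saving the abstract refers to.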
