On the convergence and sampling of randomized primal-dual algorithms and their application to parallel MRI reconstruction

07/25/2022
by   Eric B. Gutierrez, et al.

The Stochastic Primal-Dual Hybrid Gradient (SPDHG) is an algorithm proposed by Chambolle et al. to efficiently solve a wide class of nonsmooth, large-scale optimization problems. In this paper we contribute to its theoretical foundations and prove its almost sure convergence for convex functionals that are neither necessarily strongly convex nor smooth, defined on Hilbert spaces of arbitrary dimension. We also prove its convergence under arbitrary sampling, and for some specific samplings we propose theoretically optimal step size parameters that yield faster convergence. In addition, we propose using SPDHG for parallel Magnetic Resonance Imaging (MRI) reconstruction, where data from different coils are randomly selected at each iteration. We apply SPDHG with a wide range of random sampling methods and compare its performance across settings, including mini-batch size, step size parameters, and both convex and strongly convex objective functionals. We show that the sampling can significantly affect the convergence speed of SPDHG, and we conclude that in many cases an optimal sampling method can be identified.
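To make the abstract concrete, here is a minimal Python sketch of the SPDHG iteration in one common form (serial uniform sampling of one dual block per iteration, extrapolation parameter θ = 1), applied to a toy least-squares problem. The problem data, the step-size rule of thumb, and the iteration count are illustrative assumptions, not the paper's experimental setup; the paper studies more general samplings and step-size choices.

```python
import numpy as np

# Illustrative sketch of SPDHG on a toy problem
#   min_x  sum_i 1/2 ||A_i x - b_i||^2
# with uniform sampling of the n dual blocks. All data and step
# sizes below are assumptions for demonstration only.

rng = np.random.default_rng(0)
n, d, m = 4, 10, 5                       # blocks, primal dim, rows per block
A = [rng.standard_normal((m, d)) for _ in range(n)]
x_true = rng.standard_normal(d)
b = [Ai @ x_true for Ai in A]            # consistent data, so x_true is optimal

p = 1.0 / n                              # uniform sampling probability
L = [np.linalg.norm(Ai, 2) for Ai in A]  # per-block operator norms
sigma = [0.9 / Li for Li in L]           # dual step sizes (rule of thumb)
tau = 0.9 * p / max(L)                   # primal step; tau * sigma_i * L_i^2 < p

x = np.zeros(d)
y = [np.zeros(m) for _ in range(n)]
z = np.zeros(d)                          # running aggregate z = sum_i A_i^T y_i
zbar = z.copy()                          # extrapolated dual aggregate

for k in range(3000):
    x = x - tau * zbar                   # prox of g = 0 is the identity
    i = rng.integers(n)                  # sample one dual block uniformly
    v = y[i] + sigma[i] * (A[i] @ x)
    y_new = (v - sigma[i] * b[i]) / (1 + sigma[i])  # prox of sigma_i * f_i^*
    dz = A[i].T @ (y_new - y[i])
    y[i] = y_new
    z = z + dz
    zbar = z + dz / p                    # theta = 1 extrapolation step

print(np.linalg.norm(x - x_true))        # residual shrinks toward zero
```

The dual proximal step uses the closed form for f_i(z) = ½‖z − b_i‖², whose convex conjugate has prox_{σf_i*}(v) = (v − σb_i)/(1 + σ). The step sizes satisfy the standard SPDHG condition τσ_i‖A_i‖² < p_i; the paper's contribution includes identifying samplings and step-size parameters for which such schemes converge faster.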


Related research

12/02/2020
On the Convergence of the Stochastic Primal-Dual Hybrid Gradient for Convex Optimization
Stochastic Primal-Dual Hybrid Gradient (SPDHG) was proposed by Chambolle...

06/15/2017
Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
We propose a stochastic extension of the primal-dual hybrid gradient alg...

06/05/2019
On the Convergence of SARAH and Beyond
The main theme of this work is a unifying algorithm, abbreviated as L2S,...

01/18/2018
Faster Algorithms for Large-scale Machine Learning using Simple Sampling Techniques
Nowadays, the major challenge in machine learning is the 'Big Data' ch...

07/13/2020
Random extrapolation for primal-dual coordinate descent
We introduce a randomly extrapolated primal-dual coordinate descent meth...

01/24/2019
SAGA with Arbitrary Sampling
We study the problem of minimizing the average of a very large number of...

08/30/2019
A Decomposition Method for Large-scale Convex Quadratically Constrained Quadratic Programs
We consider solving a convex quadratically constrained quadratic program...
