Stochastic Primal-Dual Three Operator Splitting with Arbitrary Sampling and Preconditioning

08/02/2022
by Junqi Tang, et al.

In this work we propose a stochastic primal-dual preconditioned three-operator splitting algorithm for solving a class of convex three-composite optimization problems. Our proposed scheme is a direct three-operator splitting extension of the SPDHG algorithm [Chambolle et al. 2018]. We provide a theoretical convergence analysis establishing an ergodic O(1/K) convergence rate, and we demonstrate the effectiveness of our approach on imaging inverse problems.
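The abstract does not spell out the update rules, but the general shape of an SPDHG-style iteration with an additional forward (gradient) step for a smooth third term can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it assumes a problem of the form min_x g(x) + h(x) + sum_i f_i(A_i x) with g and the conjugates f_i^* proximable and h smooth, and all names here (prox_g, grad_h, prox_fconj, A, At, the step sizes tau and sigma, the sampling probabilities probs) are hypothetical placeholders; the paper's actual step sizes, extrapolation, preconditioning, and arbitrary-sampling scheme may differ.

import numpy as np

# Hypothetical problem pieces (illustrative names, not the paper's API):
#   prox_g(v, tau)      -- proximal operator of the proximable term g
#   grad_h(x)           -- gradient of the smooth term h
#   prox_fconj[i](v, s) -- proximal operator of the convex conjugate f_i^*
#   A[i], At[i]         -- forward / adjoint maps of the i-th block of the linear operator

def stochastic_pd_three_operator_sketch(x0, y0, A, At, prox_g, grad_h, prox_fconj,
                                         tau, sigma, probs, n_iter, rng=None):
    """Illustrative stochastic primal-dual iteration with one dual block
    sampled per step (SPDHG-style) and a gradient step on a smooth term."""
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    y = [yi.copy() for yi in y0]
    # Running adjoint variable z = sum_i A_i^T y_i, maintained incrementally.
    z = sum(At[i](y[i]) for i in range(len(y)))
    zbar = z.copy()
    for _ in range(n_iter):
        # Primal step: forward (gradient) step on h, backward (prox) step on g.
        x = prox_g(x - tau * (zbar + grad_h(x)), tau)
        # Sample one dual block index with probability probs[i] (arbitrary sampling).
        i = rng.choice(len(y), p=probs)
        y_old = y[i]
        y[i] = prox_fconj[i](y[i] + sigma[i] * A[i](x), sigma[i])
        # Incremental update of z and the extrapolated zbar.
        dz = At[i](y[i] - y_old)
        z = z + dz
        zbar = z + dz / probs[i]
    return x, y

The per-block step sizes sigma[i] stand in for a diagonal preconditioning, and the 1/probs[i] extrapolation of the running adjoint variable mirrors the stochastic extrapolation used in SPDHG [Chambolle et al. 2018]; both are stated here only as plausible ingredients of such a scheme.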

Related research

12/26/2022 - A stochastic preconditioned Douglas-Rachford splitting method for saddle-point problems
In this article, we propose and study a stochastic preconditioned Dougla...

11/14/2019 - Primal-dual block-proximal splitting for a class of non-convex problems
We develop block structure adapted primal-dual algorithms for non-convex...

03/08/2022 - Mini-batch stochastic three-operator splitting for distributed optimization
We consider a network of agents, each with its own private cost consisti...

02/21/2017 - A Continuum of Optimal Primal-Dual Algorithms for Convex Composite Minimization Problems with Applications to Structured Sparsity
Many statistical learning problems can be posed as minimization of a sum...

06/15/2018 - Primal-dual residual networks
In this work, we propose a deep neural network architecture motivated by...

02/08/2020 - Predictive online optimisation with applications to optical flow
Online optimisation revolves around new data being introduced into a pro...

05/08/2022 - Communication Compression for Decentralized Learning with Operator Splitting Methods
In decentralized learning, operator splitting methods using a primal-dua...
