A Constant Step Stochastic Douglas-Rachford Algorithm with Application to Non Separable Regularizations

04/03/2018
by Adil Salim et al.

The Douglas-Rachford algorithm converges to a minimizer of a sum of two convex functions. It consists of fixed-point iterations that compute the proximity operators of the two functions separately. This paper investigates a stochastic version of the algorithm in which both functions are random and the step size is constant. We establish that, when the step size is small enough, the iterates of the algorithm stay close to the set of solutions with high probability. An application to structured regularization is considered.
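To illustrate the deterministic iteration the abstract refers to, here is a minimal sketch of Douglas-Rachford splitting on a toy problem: minimizing f(x) + g(x) with f(x) = ½‖x − a‖² and g(x) = λ‖x‖₁. The function names, the choice of f and g, and the parameter values are illustrative assumptions, not taken from the paper (which studies a stochastic variant where f and g are random).

```python
import numpy as np

def prox_quad(y, a, gamma):
    # Proximity operator of f(x) = 0.5 * ||x - a||^2:
    # argmin_x 0.5*||x - a||^2 + (1/(2*gamma))*||x - y||^2
    return (y + gamma * a) / (1.0 + gamma)

def prox_l1(y, lam, gamma):
    # Proximity operator of g(x) = lam * ||x||_1 (soft-thresholding)
    return np.sign(y) * np.maximum(np.abs(y) - gamma * lam, 0.0)

def douglas_rachford(a, lam, gamma=1.0, n_iter=200):
    # Fixed-point iteration calling the two proximity operators separately:
    #   x_k = prox_{gamma f}(y_k)
    #   z_k = prox_{gamma g}(2 x_k - y_k)
    #   y_{k+1} = y_k + z_k - x_k
    y = np.zeros_like(a)
    for _ in range(n_iter):
        x = prox_quad(y, a, gamma)
        z = prox_l1(2 * x - y, lam, gamma)
        y = y + z - x
    return prox_quad(y, a, gamma)
```

For this choice of f and g the minimizer has the closed form soft-threshold(a, λ), which makes the sketch easy to check numerically. In the stochastic setting of the paper, each iteration would instead use the proximity operators of random realizations of the two functions, with the step (here `gamma`) held constant.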


