
Variance-Reduced Proximal and Splitting Schemes for Monotone Stochastic Generalized Equations

by Shisheng Cui et al.

We consider monotone inclusion problems where the operators may be expectation-valued. A direct application of proximal and splitting schemes is complicated by the need to resolve subproblems involving expectation-valued maps at each step, a difficulty addressed here via sampling. Accordingly, we propose two avenues for addressing uncertainty in the mapping.

(i) Variance-reduced stochastic proximal point method (vr-SPP). We develop among the first variance-reduced stochastic proximal-point schemes that achieve deterministic rates of convergence in terms of solving proximal-point problems. In addition, the schemes are shown to enjoy either optimal or near-optimal oracle (or sample) complexity guarantees. Finally, the generated sequences converge to a solution in an almost-sure sense in both monotone and strongly monotone regimes.

(ii) Variance-reduced stochastic modified forward-backward splitting scheme (vr-SMFBS). In constrained settings, we consider structured problems in which the map can be decomposed into an expectation-valued map A and a maximal monotone map B with a tractable resolvent. Akin to (i), we show that the proposed schemes are equipped with a.s. convergence guarantees and achieve linear (strongly monotone A) and 𝒪(1/k) (monotone A) rates of convergence while attaining optimal oracle complexity bounds. The rate statements in monotone regimes rely on the Fitzpatrick gap function for monotone inclusions. Furthermore, the schemes impose weaker moment requirements on the noise and allow the unbiasedness requirements on the oracles to be relaxed in strongly monotone regimes. Preliminary numerics reflect these findings and show that the variance-reduced schemes outperform stochastic approximation schemes, stochastic splitting and proximal-point schemes, and sample-average approximation approaches.
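The variance-reduction idea behind vr-SPP can be illustrated on a toy problem. The sketch below, which is an illustrative assumption rather than the authors' actual algorithm, finds the zero of the strongly monotone expectation-valued map A(x) = E[x − ξ] with ξ ~ N(1, 1): at iteration k the map is replaced by a sample average over a geometrically growing batch of size N_k = ⌈ρ^k⌉, and the resulting proximal subproblem is solved exactly (in closed form here, since the sampled map is affine). The names `vr_spp`, `lam`, and `rho` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def vr_spp(x0=5.0, lam=1.0, rho=1.5, iters=30):
    """Toy variance-reduced stochastic proximal-point sketch.

    Target map: A(x) = E[x - xi] = x - 1, xi ~ N(1, 1), whose unique
    zero is x* = 1.  Each step draws N_k = ceil(rho**k) samples (a
    geometrically increasing batch, the variance-reduction device) and
    solves the proximal subproblem  y + lam * (y - b_hat) = x  exactly.
    """
    x = x0
    for k in range(iters):
        n_k = int(np.ceil(rho ** k))                 # growing sample size
        b_hat = rng.normal(1.0, 1.0, size=n_k).mean()  # sampled estimate of E[xi]
        x = (x + lam * b_hat) / (1.0 + lam)          # exact prox on sampled map
    return x

print(vr_spp())  # close to the true zero x* = 1
```

Because the batch size grows geometrically, the sampling error decays at a geometric rate alongside the deterministic contraction of the proximal-point iteration, which is the mechanism behind the deterministic-style rates claimed in the abstract.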




A Tseng type stochastic forward-backward algorithm for monotone inclusions

In this paper, we propose a stochastic version of the classical Tseng's ...

Graph and distributed extensions of the Douglas-Rachford method

In this paper, we propose several graph-based extensions of the Douglas-...

A Unifying Framework for Variance Reduction Algorithms for Finding Zeroes of Monotone Operators

A wide range of optimization problems can be recast as monotone inclusio...

A Stochastic Proximal Point Algorithm for Saddle-Point Problems

We consider saddle point problems which objective functions are the aver...

Frugal Splitting Operators: Representation, Minimal Lifting and Convergence

We consider frugal splitting operators for finite sum monotone inclusion...

Convergence Rates for Projective Splitting

Projective splitting is a family of methods for solving inclusions invol...

Stochastic Projective Splitting: Solving Saddle-Point Problems with Multiple Regularizers

We present a new, stochastic variant of the projective splitting (PS) fa...