
Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization

05/11/2017
by Yue Yu, et al.
Tsinghua University

We consider the stochastic composition optimization problem proposed in [wang2017stochastic], which has applications ranging from estimation to statistical and machine learning. We propose the first ADMM-based algorithm for this problem, named com-SVR-ADMM, and show that com-SVR-ADMM converges linearly for strongly convex and Lipschitz smooth objectives, and has a convergence rate of O(log S / S), which improves upon the O(S^{-4/9}) rate in [wang2016accelerating] when the objective is convex and Lipschitz smooth. Moreover, com-SVR-ADMM possesses a rate of O(1/√S) when the objective is convex but not Lipschitz smooth. We also conduct experiments and show that it outperforms existing algorithms.
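For context, here is a generic statement of the problem class in LaTeX; the splitting variable y, the linear constraint (A, B, c), and the regularizer r are placeholder notation for an ADMM-style formulation, not necessarily the paper's exact setup:

```latex
\min_{x,\,y} \quad \mathbb{E}_{v}\Big[\, f_v\big(\mathbb{E}_{w}[\,g_w(x)\,]\big) \Big] \;+\; r(y)
\qquad \text{s.t.} \quad Ax + By = c
```

The difficulty is that an unbiased sample of the composed gradient \((\nabla g(x))^{\top}\nabla f(g(x))\) is unavailable, since the inner expectation sits inside f; SVRG-style variance-reduced estimates of both the inner value and the composed gradient work around this. Below is a minimal numpy sketch of such an estimator on a toy instance; the instance (A, W, b), the step size, the epoch counts, and the plain gradient loop are illustrative assumptions, not the authors' com-SVR-ADMM updates:

```python
import numpy as np

# Toy composition instance (illustrative; not the paper's experiments):
#   inner map   g(x) = (1/m) sum_j A[j] @ x              (R^d -> R^p)
#   outer loss  f(y) = (1/n) sum_i 0.5 * (W[i] @ y - b[i])^2
# Objective F(x) = f(g(x)), with gradient  Jg(x)^T grad_f(g(x)).
rng = np.random.default_rng(0)
m, n, d, p = 20, 30, 5, 3
A = rng.normal(size=(m, p, d))
W = rng.normal(size=(n, p))
b = rng.normal(size=n)

def g_full(x):      return np.einsum('jpd,d->jp', A, x).mean(axis=0)
def jac_g_full():   return A.mean(axis=0)            # Jacobian of the linear inner map
def grad_f_i(y, i): return (W[i] @ y - b[i]) * W[i]  # gradient of one outer component
def grad_f_full(y): return ((W @ y - b)[:, None] * W).mean(axis=0)

def vr_composed_grad(x, x_snap, g_snap, full_grad):
    """SVRG-style estimate of grad F(x) from one inner sample j and one outer sample i."""
    j, i = rng.integers(m), rng.integers(n)
    g_hat = A[j] @ (x - x_snap) + g_snap        # control-variate estimate of g(x)
    return (A[j].T @ grad_f_i(g_hat, i)         # sampled composed gradient at x
            - A[j].T @ grad_f_i(g_snap, i)      # minus its value at the snapshot
            + full_grad)                        # plus the exact snapshot gradient

x, eta = np.zeros(d), 0.05
for epoch in range(30):
    x_snap = x.copy()
    g_snap = g_full(x_snap)
    full_grad = jac_g_full().T @ grad_f_full(g_snap)   # exact gradient at the snapshot
    for _ in range(2 * m):
        x = x - eta * vr_composed_grad(x, x_snap, g_snap, full_grad)

print("F(x) =", 0.5 * np.mean((W @ g_full(x) - b) ** 2))
```

A full com-SVR-ADMM implementation would feed estimates of this kind into ADMM primal and dual updates under the linear constraint above; the plain gradient loop is used here only to keep the sketch self-contained.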


Related research:

07/11/2017  Accelerated Variance Reduced Stochastic ADMM
    Recently, many variance reduced stochastic alternating direction method ...

04/24/2016  Stochastic Variance-Reduced ADMM
    The alternating direction method of multipliers (ADMM) is a powerful opt...

11/02/2021  Faster Convex Lipschitz Regression via 2-block ADMM
    The task of approximating an arbitrary convex function arises in several...

05/14/2019  Plug-and-Play Methods Provably Converge with Properly Trained Denoisers
    Plug-and-play (PnP) is a non-convex framework that integrates modern den...

11/12/2019  Nonconvex Stochastic Nested Optimization via Stochastic ADMM
    We consider the stochastic nested composition optimization problem where...

07/10/2018  Dual optimization for convex constrained objectives without the gradient-Lipschitz assumption
    The minimization of convex objectives coming from linear supervised lear...

11/24/2020  A New Algorithm for Convex Biclustering and Its Extension to the Compositional Data
    Biclustering is a powerful data mining technique that allows simultaneou...