Mini-Batch Stochastic ADMMs for Nonconvex Nonsmooth Optimization

02/08/2018
by   Feihu Huang, et al.

In this paper, we study mini-batch stochastic ADMMs (alternating direction methods of multipliers) for nonconvex nonsmooth optimization. We prove that, given an appropriate mini-batch size, the mini-batch stochastic ADMM without a variance reduction (VR) technique converges at a rate of O(1/T) to a stationary point of the nonconvex problem, where T denotes the number of iterations. Moreover, we extend the mini-batch stochastic gradient method to the nonconvex SVRG-ADMM and SAGA-ADMM proposed in our initial paper (Huang et al., 2016), and prove that these mini-batch stochastic ADMMs also achieve the O(1/T) convergence rate without any condition on the mini-batch size. In particular, we give a specific parameter selection for the step size η of the stochastic gradients and the penalty parameter ρ of the augmented Lagrangian function. Finally, experimental results demonstrate the effectiveness of our algorithms.
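
For concreteness, below is a minimal Python sketch of a mini-batch stochastic ADMM (without VR) applied to an illustrative ℓ1-regularized least-squares instance, min_x (1/n) Σᵢ ½(aᵢᵀx − bᵢ)² + τ‖Ax‖₁, split as f(x) + g(y) subject to Ax − y = 0. The problem instance, the identity choice of A, and the values of η, ρ, and the mini-batch size are all assumptions for illustration, not the settings analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) problem instance:
#   min_x  (1/n) sum_i 0.5*(a_i^T x - b_i)^2 + tau*||A x||_1,
# split as f(x) + g(y) with linear constraint A x - y = 0.
n, d = 1000, 20
data = rng.normal(size=(n, d))
target = data @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
A = np.eye(d)   # structure matrix; identity chosen only for simplicity
tau = 0.1       # l1 regularization weight (illustrative)

def stoch_grad(x, batch):
    """Mini-batch stochastic gradient of the smooth loss f."""
    Ab, bb = data[batch], target[batch]
    return Ab.T @ (Ab @ x - bb) / len(batch)

def soft_threshold(v, kappa):
    """Proximal operator of kappa*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def minibatch_sadmm(T=2000, batch_size=64, eta=0.05, rho=1.0):
    """Mini-batch stochastic ADMM sketch; eta and rho are
    illustrative constants, not the paper's recommended choices."""
    x = np.zeros(d)
    y = A @ x
    lam = np.zeros(A.shape[0])  # dual variable
    for _ in range(T):
        batch = rng.choice(n, size=batch_size, replace=False)
        # y-step: prox of tau*||.||_1 on (A x + lam/rho), closed form
        y = soft_threshold(A @ x + lam / rho, tau / rho)
        # x-step: linearized stochastic gradient step on the
        # augmented Lagrangian L_rho(x, y, lam)
        grad = stoch_grad(x, batch) + A.T @ (lam + rho * (A @ x - y))
        x = x - eta * grad
        # dual ascent step
        lam = lam + rho * (A @ x - y)
    return x

x_hat = minibatch_sadmm()
```

The y-step exploits that g has a cheap proximal operator, while the x-step replaces the exact minimization with a single linearized mini-batch gradient step; this is what makes the per-iteration cost independent of n.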

