Convergence rates for an inexact ADMM applied to separable convex optimization

01/06/2020
by William W. Hager, et al.

Convergence rates are established for an inexact accelerated alternating direction method of multipliers (I-ADMM) for general separable convex optimization with a linear constraint. Both ergodic and non-ergodic iterates are analyzed. Relative to the iteration number k, the convergence rate is O(1/k) in the convex setting and O(1/k^2) in the strongly convex setting. When an error-bound condition holds, the algorithm is 2-step linearly convergent. The I-ADMM is designed so that the subproblems are solved just accurately enough to preserve the global convergence rates of the exact iteration, which leads to better performance on the test problems.
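For readers unfamiliar with the method, the sketch below shows the classical exact two-block ADMM iteration on a toy consensus problem. It is not the paper's accelerated inexact algorithm; the problem data `a`, `b` and the penalty `rho` are made-up for illustration, chosen so each subproblem has a closed-form minimizer.

```python
# Minimal sketch of classical (exact) scaled-form ADMM on a toy problem,
# NOT the I-ADMM analyzed in the paper.  Hypothetical problem:
#     minimize   (1/2)||x - a||^2 + (1/2)||z - b||^2
#     subject to  x - z = 0
# whose solution is x = z = (a + b) / 2.
import numpy as np

def admm_consensus(a, b, rho=1.0, iters=100):
    x = np.zeros_like(a)
    z = np.zeros_like(b)
    u = np.zeros_like(a)  # scaled dual variable (multiplier / rho)
    for _ in range(iters):
        # x-update: argmin_x (1/2)||x - a||^2 + (rho/2)||x - z + u||^2
        x = (a + rho * (z - u)) / (1.0 + rho)
        # z-update: argmin_z (1/2)||z - b||^2 + (rho/2)||x - z + u||^2
        z = (b + rho * (x + u)) / (1.0 + rho)
        # dual update: ascent step on the scaled multiplier
        u = u + x - z
    return x, z

a = np.array([1.0, 2.0])
b = np.array([3.0, 0.0])
x, z = admm_consensus(a, b)
print(x, z)  # both approach (a + b) / 2 = [2.0, 1.0]
```

In the inexact setting studied in the paper, each argmin above would be computed only approximately, with the allowed subproblem error controlled across iterations so that the O(1/k) and O(1/k^2) rates of the exact method are preserved.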

