## 1 Theoretical guarantees

The Lagrangian dual of (1.2) is

(1.3) |

where is the dual variable, is the convex conjugate of at , and is the convex conjugate of at , i.e.,

(1.4) |

and

(1.5) |

The dual problem can be formulated as

(1.6) |

Assume that Slater’s condition holds, i.e., there exists such that ; then the convexity of problem (0.1) implies that the optimal solution achieves zero duality gap, i.e.,

(1.7) |

By the KKT conditions, the optimal solution must satisfy (1.7) and

(1.8) |

Thus, (1.7) and (1.8) can be used as optimality certificates or stopping criteria in algorithm design. More specifically, we define the primal residual, dual residual, and duality gap with respect to a given tuple as

(1.9) |
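Since the displayed definitions in (1.9) are not reproduced here, the following Python sketch illustrates the standard ADMM-style residuals and a tolerance-based stopping test for a splitting of the form x − z = 0; the variable names and exact formulas are illustrative assumptions, not necessarily the paper's (1.9).

```python
import numpy as np

def admm_residuals(x, z, z_prev, rho):
    """Standard ADMM residuals for the splitting x - z = 0.

    r = x - z              : primal residual
    s = rho * (z_prev - z) : dual residual
    (Illustrative only; the definitions in (1.9) may differ.)
    """
    r_norm = np.linalg.norm(x - z)
    s_norm = rho * np.linalg.norm(z_prev - z)
    return r_norm, s_norm

def converged(r_norm, s_norm, eps_pri=1e-6, eps_dual=1e-6):
    # Stop when both residuals fall below their tolerances.
    return r_norm <= eps_pri and s_norm <= eps_dual
```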

## 2 Algorithm design based on ADMM

We adopt ideas from alternating projection methods and reformulate (1.1) as

(2.1) |

where is defined as

(2.2) |

The augmented Lagrangian of (2.2) becomes

(2.3) |

where is the dual variable, and is a parameter. Define

(2.4) |

and the resulting ADMM iterations are

(2.5) |

More specifically,

(2.6) |

or simply

(2.7) |
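To make the alternating pattern in (2.5)–(2.7) concrete, here is a minimal scaled-form ADMM loop in Python that alternates a proximal step, a projection step, and a dual update; the scaled-dual form and the function names are assumptions for illustration, not the paper's exact iteration.

```python
import numpy as np

def admm(prox_f, project, n, rho=1.0, n_iter=100):
    """Scaled-form ADMM for min f(x) + I_C(z) subject to x = z (illustrative).

    prox_f(v, t): proximator of f at v with parameter t
    project(v):   projection of v onto the constraint set C
    """
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(n_iter):
        x = prox_f(z - u, 1.0 / rho)  # proximal step
        z = project(x + u)            # projection step
        u = u + x - z                 # dual update
    return x, z, u
```

For example, with an ℓ1 proximator and a simple projection, the loop drives x and z together while z remains feasible after every iteration.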

where is the proximator of the function at , which is defined as

(2.8) |
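As a simple worked instance of this definition, for the quadratic f(x) = ½‖x‖² the minimization defining the proximator has the closed form prox(v) = v/(1 + t) (assuming the common scaling with a 1/(2t) quadratic penalty, since the displayed definition is not reproduced here); a quick numerical check:

```python
import numpy as np

def prox_half_sq(v, t):
    # argmin_x 0.5*||x||^2 + (1/(2t))*||x - v||^2  =  v / (1 + t)
    return v / (1.0 + t)
```

The closed form follows from the optimality condition x + (x − v)/t = 0.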

The and are defined as

(2.9) |

More specifically, the proximator of at is

(2.10) |

where is the elementwise soft thresholding function, i.e.,

(2.11) |
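The standard closed form for elementwise soft thresholding is S_κ(a) = sign(a) · max(|a| − κ, 0), which vectorizes directly; a minimal NumPy sketch:

```python
import numpy as np

def soft_threshold(a, kappa):
    """Elementwise soft thresholding: S_kappa(a) = sign(a) * max(|a| - kappa, 0).

    Shrinks each entry toward zero by kappa and zeroes out entries
    whose magnitude is at most kappa.
    """
    return np.sign(a) * np.maximum(np.abs(a) - kappa, 0.0)
```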

The proximator of at is

(2.12) |

The updating rule for can be specified as

(2.13) |

where is the projection of onto , i.e., the solution to

(2.14) |

Define

(2.15) |

and the updating rule for dual variable can be written as

(2.16) |

### 2.1 Analytic solution to (2.14)

Since (2.14) is convex, from its KKT conditions we know that are the optimal solution to (2.14) if and only if there exists such that

(2.17) |

which implies that the optimal can be obtained from solving the following linear system

(2.18) |

Remarks: (1) the matrix is highly sparse, and this can be combined with other potential structure of to simplify the computation; (2) even simple elimination can be used to simplify the problem, i.e.,

(2.19) |

or

(2.20) |

Both matrices and are positive definite, so factorization techniques can be used to accelerate the computation; (3) since , (2.19) will be more efficient; (4) apply the Cholesky decomposition once to obtain ; (5) calculate once; (6) solve for by backward substitution, i.e., .
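Remarks (4)–(6) amount to factorizing once and then reusing the triangular factor with forward/backward solves at every iteration. As an illustration, suppose the projection in (2.14) is onto an affine set {x : Ax = b} (an assumption, since the original equations are not reproduced here); eliminating the KKT system then gives (A Aᵀ)μ = Av − b and x = v − Aᵀμ, and the Cholesky factor of A Aᵀ can be cached:

```python
import numpy as np

class AffineProjector:
    """Projection onto {x : A x = b}, caching the Cholesky factor of A A^T.

    Illustrative assumption: the constraint set in (2.14) is affine.
    x* = v - A^T mu,  where  (A A^T) mu = A v - b.
    """
    def __init__(self, A, b):
        self.A, self.b = A, b
        self.L = np.linalg.cholesky(A @ A.T)  # factor once (remark (4))

    def __call__(self, v):
        rhs = self.A @ v - self.b
        y = np.linalg.solve(self.L, rhs)    # forward solve
        mu = np.linalg.solve(self.L.T, y)   # backward solve (remark (6))
        return v - self.A.T @ mu
```

Each projection then costs only two triangular solves and two matrix-vector products, rather than a fresh factorization.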

### 2.2 Algorithm in pseudocodes

The algorithm can be summarized as in Algorithm 1.

Computational complexity (running time): (1) lines 5, 7, and 8 take ; (2) line 6 takes for the Cholesky decomposition of , for once, and for backward solving using (2.19); (3) lines 9 and 10 take . Thus, but only once in total.

Computational complexity (space/memory): .

Baseline algorithm, CVX using an interior-point method: (1) but multiple times.

## 3 Numerical experiments

Computational environment: (1) a desktop with an Intel(R) Core(TM) i7-6700 CPU @ 3.40 GHz and 32.0 GB RAM; (2) OS: Windows 10 Education; (3) MATLAB R2018a; (4) baseline: CVX, which solves (0.1) using an interior-point method.

Computational setup: (1) is assumed to be sparse with cardinality and is generated randomly; (2) , and is generated randomly; (3) the noise is generated randomly and normalized to have magnitude ; (4) is assumed to be generated via .

| Time (sec) | 100 | 400 | 1600 | 6400 | 25600 |
|---|---|---|---|---|---|
| CVX | 0.7 | 1.16 | 44 | NA | NA |
| Algorithm 1 | 0.01 | 0.02 | 0.31 | 4.82 | 104.91 |

## References

- [1] S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1–122, 2011.
- [2] C. Fougner and S. Boyd. Parameter selection and pre-conditioning for a graph form solver. arXiv:1503.08366, March 2015.
- [3] N. Parikh and S. Boyd. Block splitting for distributed optimization. Mathematical Programming Computation, 6(1):77–102, 2014.
- [4] N. Parikh and S. Boyd. Proximal algorithms. Foundations and Trends in Optimization, 1(3):127–239, 2014.
