Complexity of Block Coordinate Descent with Proximal Regularization and Applications to Wasserstein CP-dictionary Learning

06/04/2023
by Dohyun Kwon, et al.

We consider the block coordinate descent method of Gauss-Seidel type with proximal regularization (BCD-PR), a classical approach to minimizing general nonconvex objectives under constraints with a wide range of practical applications. We establish a worst-case complexity bound for this algorithm: for general nonconvex smooth objectives with block-wise constraints, BCD-PR converges to an epsilon-stationary point within O(1/epsilon) iterations. Under a mild condition, this result still holds even if each step of the algorithm is executed only inexactly. As an application, we propose a provable and efficient algorithm for Wasserstein CP-dictionary learning, which seeks a set of elementary probability distributions that approximate a given set of d-dimensional joint probability distributions well. Our algorithm is a version of BCD-PR that operates in the dual space, where the primal problem is regularized both entropically and proximally.
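To make the update rule concrete: BCD-PR cycles through the blocks in Gauss-Seidel fashion and, for each block, minimizes the objective in that block plus a quadratic proximal term anchored at the block's previous value. The following is a minimal Python sketch of that scheme on a generic smooth objective with box-constrained blocks; the toy objective, the box constraints, the proximal weight, and the use of scipy.optimize.minimize for the block subproblems are illustrative assumptions, not the paper's Wasserstein CP-dictionary algorithm.

# Minimal sketch of BCD-PR (Gauss-Seidel block updates with proximal
# regularization). Assumes a smooth objective f taking a list of block
# variables; the toy problem below is purely illustrative.
import numpy as np
from scipy.optimize import minimize

def bcd_pr(f, blocks, bounds, prox_weight=1.0, n_iters=100):
    """Cyclically update each block by (approximately) minimizing
    f plus the proximal term (prox_weight/2) * ||x_i - x_i_old||^2."""
    blocks = [np.array(b, dtype=float) for b in blocks]
    for _ in range(n_iters):
        for i in range(len(blocks)):
            x_old = blocks[i].copy()

            def subproblem(xi, i=i, x_old=x_old):
                # Objective in block i with the other blocks frozen,
                # plus the proximal penalty around the previous iterate.
                trial = blocks[:i] + [xi] + blocks[i + 1:]
                return f(trial) + 0.5 * prox_weight * np.sum((xi - x_old) ** 2)

            # Block-wise constraint handled here as simple box bounds.
            res = minimize(subproblem, x_old, bounds=bounds[i])
            blocks[i] = res.x
    return blocks

# Toy usage: a nonconvex coupling of two blocks under box constraints.
f = lambda xs: np.sum((xs[0][:, None] * xs[1][None, :] - 1.0) ** 2)
x1, x2 = bcd_pr(f, blocks=[np.ones(3), np.ones(2)],
                bounds=[[(0.0, 2.0)] * 3, [(0.0, 2.0)] * 2])

The proximal term is what the complexity analysis leverages: it keeps consecutive block iterates close, which is the standard mechanism for guaranteeing sufficient decrease per sweep and hence the O(1/epsilon) bound stated above.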

