Asynchronous Delay-Aware Accelerated Proximal Coordinate Descent for Nonconvex Nonsmooth Problems

02/05/2019
by   Ehsan Kazemi, et al.

Nonconvex and nonsmooth problems have recently attracted considerable attention in machine learning. However, developing efficient methods for nonconvex and nonsmooth optimization problems with performance guarantees remains a challenge. Proximal coordinate descent (PCD) has been widely used for solving optimization problems, but knowledge of PCD methods in the nonconvex setting is very limited. Meanwhile, asynchronous proximal coordinate descent (APCD) has recently received much attention as a means of solving large-scale problems, yet accelerated variants of APCD algorithms are rarely studied. In this paper, we extend the APCD method to an accelerated algorithm (AAPCD) for nonsmooth and nonconvex problems that satisfies the sufficient descent property, by comparing the function values at the proximal update and at a linearly extrapolated point computed with a delay-aware momentum value. To the best of our knowledge, we are the first to provide stochastic and deterministic accelerated extensions of APCD algorithms for general nonconvex and nonsmooth problems, ensuring that every limit point is a critical point under both bounded and unbounded delays. By leveraging the Kurdyka-Łojasiewicz property, we show linear and sublinear convergence rates for the deterministic AAPCD with bounded delays. Numerical results demonstrate the practical efficiency of our algorithm in terms of speed.
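To make the mechanism concrete, the following is a minimal serial sketch (not the authors' implementation) of an accelerated proximal coordinate step with a delay-aware momentum weight and a sufficient-descent comparison. It assumes a hypothetical composite objective f(x) = 0.5||Ax - b||^2 plus lam*||x||_1, a simulated staleness `tau` in place of true asynchrony, and an illustrative momentum schedule beta0 / (1 + tau); all function names and the specific schedule are assumptions for illustration only.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*|.|_1 (the nonsmooth term assumed here for illustration)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def aapcd_sketch(A, b, lam=0.1, beta0=0.5, max_delay=3, iters=200, seed=0):
    """Serial sketch of an accelerated asynchronous proximal coordinate
    descent loop with a delay-aware momentum weight (hypothetical setup:
    f(x) = 0.5||Ax - b||^2, g(x) = lam*||x||_1)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = np.sum(A * A, axis=0)      # coordinate-wise Lipschitz constants
    x = np.zeros(n)
    x_prev = x.copy()
    history = [x.copy()]           # past iterates used to simulate staleness

    def F(z):
        r = A @ z - b
        return 0.5 * r @ r + lam * np.abs(z).sum()

    for _ in range(iters):
        i = int(rng.integers(n))
        tau = int(rng.integers(max_delay + 1))      # simulated read delay
        x_stale = history[max(len(history) - 1 - tau, 0)]
        beta = beta0 / (1.0 + tau)                  # delay-aware momentum

        # Candidate 1: plain proximal coordinate step from the stale read.
        g_i = A[:, i] @ (A @ x_stale - b)
        cand1 = x.copy()
        cand1[i] = soft_threshold(x_stale[i] - g_i / L[i], lam / L[i])

        # Candidate 2: the same step taken from a linearly extrapolated point.
        y = x_stale + beta * (x_stale - x_prev)
        g_i_y = A[:, i] @ (A @ y - b)
        cand2 = x.copy()
        cand2[i] = soft_threshold(y[i] - g_i_y / L[i], lam / L[i])

        # Keep whichever point has the lowest objective; retaining the
        # current x in the comparison is a safeguard of this serial sketch
        # that makes the objective monotonically non-increasing.
        x_prev = x
        x = min((x, cand1, cand2), key=F)
        history.append(x.copy())
    return x, F(x)
```

In this sketch the function-value comparison between the proximal update and the extrapolated candidate plays the role of the sufficient-descent test described above: momentum is used only when it actually helps, and its weight shrinks as the read becomes more stale.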


